Search results for: secure data aggregation
24759 The Study on Life of Valves Evaluation Based on Tests Data
Authors: Binjuan Xu, Qian Zhao, Ping Jiang, Bo Guo, Zhijun Cheng, Xiaoyue Wu
Abstract:
Astronautical valves are key units in the engine systems of astronautical products; their reliability influences the outcome of rocket or missile launches and can even lead to injury to staff and damage to devices on the ground. A failure in the engine system may also affect the hitting accuracy and flight range of a missile, so high reliability is essential for astronautical products. Quite a few studies estimate valve reliability from scarce failure test data; this paper proposes a new method. Using data acquired from temperature, vibration, and action tests that correspond to different failure modes, reliability is first estimated for each failure mode; the three kinds of tests are then regarded as three stages of the product's operating process, and the stage-level results are integrated to obtain the overall valve reliability. A comparison of results obtained from test data and from simulated data illustrates how valve reliability can be derived from few failure data with known failure modes and shows that the results are effective and rational.
Keywords: censored data, temperature tests, valves, vibration tests
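The staged integration described above lends itself to a brief illustration. The sketch below assumes an exponential lifetime model fitted to right-censored test data and treats the three test stages as a series system, so overall reliability is the product of the stage reliabilities; the estimator, the mission time, and all numbers are illustrative assumptions, not the authors' actual model.

```python
import math

# Minimal sketch: estimate per-stage reliability from right-censored
# test data, then integrate the three stages as a series system.
# The exponential-lifetime assumption is illustrative, not the authors' model.

def stage_reliability(failure_times, censored_times, mission_time):
    """MLE of exponential reliability R(t) = exp(-t / MTTF) with right censoring."""
    total_time = sum(failure_times) + sum(censored_times)
    if not failure_times:
        return 1.0  # no observed failures: crude optimistic bound
    mttf = total_time / len(failure_times)
    return math.exp(-mission_time / mttf)

# One data set per failure mode / test stage (hours; hypothetical values).
temperature = stage_reliability([520.0, 610.0], [700.0] * 3, 100.0)
vibration = stage_reliability([480.0], [700.0] * 4, 100.0)
action = stage_reliability([], [700.0] * 5, 100.0)

# Series integration: the valve survives only if it survives every stage.
overall = temperature * vibration * action
print(f"overall valve reliability: {overall:.4f}")
```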
Procedia PDF Downloads 347
24758 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences
Authors: C. Xavier Mendieta, J. J McArthur
Abstract:
Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have resulted in a significant amount of publicly available data, providing researchers with a unique opportunity to develop location-specific energy and carbon emission benchmarks from this data set, which can then be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario's Ministry of Energy for Post-Secondary Educational Institutions are being used to develop a series of building-archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) the importance of careful data screening and outlier identification to develop a valid dataset; (2) the key features used to develop a model of the data are building age, size, and occupancy schedules, which can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluate the validity of the reported data.
Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions
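The screening step the authors emphasize can be sketched as follows. The column names, the normalization to an energy-use intensity, and the interquartile-range outlier rule are assumptions chosen for illustration, not the study's actual cleaning procedure.

```python
import pandas as pd

# Hypothetical mandatory-disclosure records; column names are assumed.
df = pd.DataFrame({
    "building": ["Res A", "Res B", "Res C", "Res D", "Res E"],
    "floor_area_m2": [12000, 8500, 15000, 9800, 11000],
    "energy_kwh": [2.9e6, 2.1e6, 3.4e6, 9.9e6, 2.6e6],  # Res D looks suspect
})

# Normalize to an energy-use intensity (EUI) so buildings are comparable.
df["eui_kwh_per_m2"] = df["energy_kwh"] / df["floor_area_m2"]

# IQR rule: flag records outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] as outliers.
q1, q3 = df["eui_kwh_per_m2"].quantile([0.25, 0.75])
iqr = q3 - q1
mask = df["eui_kwh_per_m2"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)

clean = df[mask]     # retained for benchmark development
flagged = df[~mask]  # reviewed manually before exclusion
print(clean["eui_kwh_per_m2"].median())  # a simple median EUI benchmark
```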
Procedia PDF Downloads 308
24757 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it can gather more and more computing power by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve their performance. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched by traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
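As a small illustration of the data-parallel idea, the sketch below tests many simple-object pairs in one vectorized pass, with NumPy broadcasting standing in for SIMD lanes; the decomposition into axis-aligned bounding boxes is an assumed simplification of the paper's model.

```python
import numpy as np

# Data-parallel sketch: complex objects decomposed into axis-aligned
# bounding boxes (AABBs); one vectorized pass tests all pairs at once.

def aabb_collisions(mins_a, maxs_a, mins_b, maxs_b):
    """Return a boolean matrix: entry [i, j] is True if box i of set A
    overlaps box j of set B on every axis."""
    # Shapes (nA, 1, 3) vs (1, nB, 3) broadcast to (nA, nB, 3).
    overlap = ((mins_a[:, None, :] <= maxs_b[None, :, :]) &
               (maxs_a[:, None, :] >= mins_b[None, :, :]))
    return overlap.all(axis=2)

rng = np.random.default_rng(0)
mins_a = rng.random((1000, 3)); maxs_a = mins_a + 0.05  # simple objects of A
mins_b = rng.random((1200, 3)); maxs_b = mins_b + 0.05  # simple objects of B

hits = aabb_collisions(mins_a, maxs_a, mins_b, maxs_b)
print(hits.sum(), "overlapping simple-object pairs")
```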
Procedia PDF Downloads 291
24756 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board
Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu
Abstract:
Internet of Things (IoT) applications are widely deployed and spreading worldwide, and local wireless data transmission techniques must be developed to keep pace. Bluetooth is a wireless data communication technique created by the Bluetooth Special Interest Group (SIG); it uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built from an Arduino open-source hardware board, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The experiment also involved developing an Android application that receives the measured data, and the experimental results confirm that the transmission rate can be controlled. In future work, the communication algorithm needs improvement, because a few errors occur when data are transmitted or received.
Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission
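A host-side sketch of the rate-control loop might look like the following; the serial port name and the P<millis> command format are hypothetical, and the matching Arduino firmware that parses the command is not shown.

```python
import serial  # pyserial; talks to the Bluetooth module's serial profile

# Host-side sketch: read gas-sensor lines and change the transmission
# period at runtime. The port name and the "P<millis>" command format
# are hypothetical; the Arduino sketch would have to parse them.
ser = serial.Serial("/dev/rfcomm0", 9600, timeout=2)

def set_period(ms: int) -> None:
    ser.write(f"P{ms}\n".encode())  # ask the sensor node for a new period

set_period(5000)  # request one reading every 5 s
for _ in range(10):
    line = ser.readline().decode(errors="replace").strip()
    if line:  # a few transfers may arrive garbled, as the abstract notes
        print("gas reading:", line)
ser.close()
```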
Procedia PDF Downloads 279
24755 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery
Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley
Abstract:
Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data to achieve higher accuracy in localizing a mobile robot than would be possible from any one individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments like oil and gas refineries due to their sparse and featureless nature. We have studied the individual contribution of each sensor's data to the overall combined accuracy achieved from the fusion process. A Sequential Update Extended Kalman Filter (EKF) using validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter
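A minimal sketch of a sequential update with a validation gate is given below. A linear position measurement is assumed so the update stays short (the real system linearizes nonlinear models, hence the EKF), and all matrices, noise levels, and readings are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def gated_update(x, P, z, H, R, gate_prob=0.99):
    """Fuse one sensor's measurement z, skipping it if it fails the gate."""
    y = z - H @ x                            # innovation
    S = H @ P @ H.T + R                      # innovation covariance
    d2 = float(y @ np.linalg.inv(S) @ y)     # squared Mahalanobis distance
    if d2 > chi2.ppf(gate_prob, df=len(z)):
        return x, P                          # outlier: reject, keep prior
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

x = np.zeros(4)  # state: [px, py, vx, vy]
P = np.eye(4)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)

# Sensors are fused one after another (sequential update), each with its
# own noise level; the GPS-like fix is coarser than the fiducial fix here.
for z, sigma in [(np.array([1.2, 0.9]), 2.0),   # GPS-like position fix
                 (np.array([1.0, 1.1]), 0.1),   # fiducial-marker fix
                 (np.array([9.0, 9.0]), 0.5)]:  # faulty reading, gated out
    x, P = gated_update(x, P, z, H, np.eye(2) * sigma**2)
print("fused position estimate:", x[:2])
```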
Procedia PDF Downloads 473
24754 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities
Authors: Salman Naseer
Abstract:
One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands not only require more and more capacity in the transmission channels but also result in resource over-provisioning to meet resilience requirements, and thus unavoidable waste given the data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient, carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region and destined for the control centres located in the city centre. The numerical results show that, by utilizing existing daily vehicle mobility, our proposed approach can transfer large volumes of data with up to 5 times lower delay than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.
Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission
Procedia PDF Downloads 143
24753 Integrated Waste-to-Energy Approach: An Overview
Authors: Tsietsi J. Pilusa, Tumisang G. Seodigeng
Abstract:
This study evaluates the benefits of advanced waste management practices in unlocking waste-to-energy opportunities within the solid waste industry. The key drivers of sustainable waste management practices, specifically with respect to packaging waste-to-energy technology options, are discussed. The success of a waste-to-energy system depends significantly on the appropriateness of available technologies, including those that are well established as well as those that are less so. There are hard and soft interventions to be considered when packaging an integrated waste treatment solution, and technology compatibility with variation in feedstock (waste) quality and quantity remains a key factor. These factors influence the technology's reliability in terms of production efficiency and product consistency, which in turn drives the supply and demand network. Waste treatment technologies rely on waste material as feedstock; the feedstock varies in quality and quantity depending on several factors, and the technology may fail as a result. It is therefore critical to design an advanced waste treatment technology in an integrated approach, to minimize the possibility of technology failure due to unpredictable feedstock quality, quantities, conversion efficiencies, and inconsistent product yield or quality. An integrated waste-to-energy approach offers a secure system design that considers sustainable waste management practices.
Keywords: emerging markets, evaluation tool, interventions, waste treatment technologies
Procedia PDF Downloads 274
24752 The Role of the Defense and Future War in Ukraine
Authors: Matthew J. Flynn
Abstract:
In early 2022, a thirty-mile column of Russian armor and assault vehicles sat poised to move south on the road to Kiev. That force has since withdrawn as the Russians concentrate on attacking eastern Ukraine. Vladimir Putin's armies appear content to destroy cities in an effort to attrit the Ukrainian will to continue fighting. That pivot signifies acceptance of the ascendancy of the defense that now dictates any battlefield worldwide. To defeat what military theorist Carl von Clausewitz labeled "the stronger form of war" with a successful offensive requires an exercise in future war. In the past, the ascendancy of the defense has been overcome by a number of things, including the application of superior leadership, better technology, organizational adaptation, and surpassing environmental limitations. A look at how each of these factors came to impact battle can tell us a great deal about what Ukraine means to tomorrow's fight, and where the focus should lie to win the next war. Civilians presently secure the defensive ascendancy impacting warfare by dominating the shifts from domain to domain thanks to controlling access to cyberspace. That mandate will be tested and eventually falter. This paper tests the desirability of that proposition, while hoping for something more from humanity than repeated and frequent wars that make future war look much like past wars. As nations struggle to control cyberspace, a referendum on war as part of the human condition comes into focus.
Keywords: cyber, domains, future war, Putin, Ukraine
Procedia PDF Downloads 114
24751 Toxicological Effects of Heavy Metals; Copper, Lead and Chromium on Brain and Liver Tissue of Grass Carp (Ctenopharyngodon idella)
Authors: Ahsan Khan, Nazish Shah, Muhammad Salman
Abstract:
The present study deals with the toxicological effects of copper, lead, and chromium on the brain and liver tissues of grass carp (Ctenopharyngodon idella). The average length of the experimental fish was 8.5 ± 5.5 cm, and the average weight was 9.5 ± 6.5 g. Grass carp were exposed to lethal concentrations (LC₁₅) of copper, lead, and chromium for 24, 48, 72, and 96 hours, respectively. The LC₁₅ for copper was 1.5, 1.4, 1.2, and 1 mg L⁻¹; the LC₁₅ for lead was 250, 235, 225, and 216 mg L⁻¹; and the LC₁₅ for chromium was 25.5, 22.5, 20, and 18 mg L⁻¹, respectively. During exposure to the various doses of heavy metals, the grass carp showed behavioral changes. In the initial stages of the experiment, rapid movements and gulping of air were observed, and the fish repeatedly tried to jump out to escape the toxic medium. In addition, the accumulation of heavy metals in different tissues of grass carp, particularly the liver and brain tissues, was observed. Lead accumulated most in brain tissue after 24 and 48 hours of exposure and most in liver tissue after 72 and 96 hours. Chromium accumulated most in liver tissue after 24 hours of exposure, while its accumulation was highest in brain tissue after 48, 72, and 96 hours. Similarly, copper accumulation was highest in brain tissue after 48 and 96 hours of exposure and highest in liver tissue after 24 and 72 hours. Comparatively, the maximum accumulation in the brain and liver tissues of grass carp was found for lead, followed by chromium and copper. Furthermore, accumulation of these metals caused many abnormalities, such as gliosis, cell destruction, changes in cell shape, and shrinkage of cells in brain tissue, while in liver tissue aggregation of hepatocytes, widened intercellular spaces, and cell destruction were observed. These experiments and observations can be useful for monitoring aquatic pollution and the quality of the aquatic environment.
Keywords: brain, grass carp, liver, lethal concentration, toxicity
Procedia PDF Downloads 156
24750 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm
Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy
Abstract:
IoT networks today solve various consumer problems, from home automation systems to aiding autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they might also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of availability and hardware capability; it becomes more challenging when the nodes in the network have short lifespans, detaching and joining frequently. When an end-user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, as a conventional static way of managing the data does not work in fog networks. The proposed solution is a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.
Keywords: IoT, fog networks, data stewardship, dynamic access policy
Procedia PDF Downloads 60
24749 An Automated Approach to Consolidate Galileo System Availability
Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt
Abstract:
Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system: an extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide positioning information of high quality to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable quick availability consolidation, but it also provides first steps towards improving the data quality of the maintenance tickets used for the analysis. This includes data import and data preparation, with a focus on processing strings used for classification and on identifying faulty data. Furthermore, the tool can handle a low amount of data, which is a major constraint when the aim is to provide accurate statistics.
Keywords: availability, data quality, system performance, Galileo, aerospace
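A condensed sketch of such a consolidation step is shown below; the ticket fields, the keyword-based classification of planned work, the positive-duration screen, and the reporting period are all assumptions for illustration, not the tool's actual rules.

```python
import pandas as pd

# Hypothetical maintenance tickets; field names and rules are assumed.
tickets = pd.DataFrame({
    "element": ["ULS-1", "ULS-1", "TTC-3"],
    "start": pd.to_datetime(["2023-01-02 04:00", "2023-01-15 10:00",
                             "2023-01-20 00:00"]),
    "end": pd.to_datetime(["2023-01-02 09:30", "2023-01-15 11:00",
                           "2023-01-19 00:00"]),  # end < start: faulty row
    "summary": ["antenna outage", "planned maintenance", "link outage"],
})

# Data-quality screen: outages must have a positive duration.
tickets["duration_h"] = (tickets["end"] - tickets["start"]).dt.total_seconds() / 3600
faulty = tickets[tickets["duration_h"] <= 0]  # fed back to ticket owners
valid = tickets[tickets["duration_h"] > 0]

# Simple string-based classification: planned work does not count
# against operational availability in this sketch.
unplanned = valid[~valid["summary"].str.contains("planned", case=False)]

period_h = 31 * 24  # one-month reporting period (January)
availability = 1 - unplanned["duration_h"].sum() / period_h
print(f"consolidated availability: {availability:.4%}, faulty tickets: {len(faulty)}")
```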
Procedia PDF Downloads 167
24748 A Computational Framework for Decoding Hierarchical Interlocking Structures with SL Blocks
Authors: Yuxi Liu, Boris Belousov, Mehrzad Esmaeili Charkhab, Oliver Tessmann
Abstract:
This paper presents a computational solution for designing reconfigurable interlocking structures that are fully assembled with SL blocks. Formed from S-shaped and L-shaped tetracubes, the SL block is a specific type of interlocking puzzle piece. Analogous to molecular self-assembly, the aggregation of SL blocks builds a reversible, hierarchical, and discrete system in which a single module can be replicated numerous times to compose semi-interlocking components that further align, wrap, and braid around each other to form complex high-order aggregations. These aggregations can be disassembled and reassembled, responding dynamically to design inputs and changes with a unique capacity for reconfiguration. To use these aggregations as architectural structures, we developed computational tools that automate the configuration of SL blocks based on architectural design objectives. There are three critical phases in our work. First, we revisit the hierarchy of the SL block system and devise a top-down design strategy. From this, we pose two key questions: 1) How can 3D polyominoes be translated into SL-block assemblies? 2) How can desired voxelized shapes be decomposed into a set of 3D polyominoes with interlocking joints? These two questions correspond to the Hamiltonian path problem and the 3D polyomino tiling problem. We then derive our solution to each of them based on two methods. The first method constructs the optimal closed path from an undirected graph built from the voxelized shape and translates the node sequence of the resulting path into the assembly sequence of SL blocks. The second describes the interlocking relationships of 3D polyominoes as a joint connection graph. Lastly, we formulate the desired shapes and leverage our methods to achieve their reconfiguration at different levels. We show that our computational strategy facilitates the efficient design of hierarchical interlocking structures with a self-replicating geometric module.
Keywords: computational design, SL-blocks, 3D polyomino puzzle, combinatorial problem
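The first method's core step, finding a path through the adjacency graph of a voxelized shape whose node sequence drives the assembly order, can be illustrated with a plain backtracking search; the paper's optimal closed-path construction is not reproduced here.

```python
# Sketch: find a Hamiltonian path through the face-adjacency graph of a
# voxelized shape; its node sequence would drive the SL-block assembly
# order. Plain backtracking is used here for clarity only.

def hamiltonian_path(graph, start):
    """graph: dict node -> set of adjacent nodes. Returns a path visiting
    every node exactly once, or None if none exists from `start`."""
    def extend(path, visited):
        if len(path) == len(graph):
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                result = extend(path + [nxt], visited | {nxt})
                if result:
                    return result
        return None
    return extend([start], {start})

# A 2x2x1 block of voxels keyed by (x, y, z); edges join face neighbors.
voxels = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
graph = {v: {w for w in voxels
             if sum(abs(a - b) for a, b in zip(v, w)) == 1} for v in voxels}

print(hamiltonian_path(graph, (0, 0, 0)))
# e.g. [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
```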
Procedia PDF Downloads 130
24747 Use of In-line Data Analytics and Empirical Model for Early Fault Detection
Authors: Hyun-Woo Cho
Abstract:
Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in industrial processes, including multivariate statistical methods, representation techniques in reduced spaces, and kernel-based nonlinear techniques. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data, and the results showed that monitoring performance, in terms of the detection success rate for process faults, improved significantly.
Keywords: batch process, monitoring, measurement, kernel method
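One possible shape of such a kernel-based monitoring scheme is sketched below: a kernel PCA model is fitted on normal batches only, and test batches are flagged when a score-based statistic exceeds a limit derived from the normal data. The kernel choice, the 99th-percentile limit, and the synthetic data are illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)
normal = rng.normal(size=(200, 6))                   # normal-operation batches
fault = normal[:20] + np.array([0, 0, 3, 0, 0, 0])   # shift in variable 3

scaler = StandardScaler().fit(normal)
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1)
scores_normal = kpca.fit_transform(scaler.transform(normal))

# Monitoring statistic: variance-scaled squared score distance, with the
# control limit set from the normal-operation scores alone.
def stat(scores):
    return (scores**2 / scores_normal.var(axis=0)).sum(axis=1)

limit = np.percentile(stat(scores_normal), 99)
alarms = stat(kpca.transform(scaler.transform(fault))) > limit
print(f"fault batches flagged: {alarms.sum()} / {len(fault)}")
```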
Procedia PDF Downloads 323
24746 From Teaching Methods to Learning Styles: Toward Humanizing Education and Building Rapport with Students at Sultan Qaboos University
Authors: Mounir Ben Zid
Abstract:
The controversy over the most effective teaching method for increasing students' knowledge has remained a frustration for poetry teachers at Sultan Qaboos University in Oman for the last ten years. Scholars and educationists have pursued answers to this question, and tremendous effort has been marshalled to discover the optimum teaching strategy, with little success. The present study stems from this perpetual frustration among teachers of poetry and the dispute about the repertoire of teaching methods. It attempts to shed light on an alternative direction which, it is believed, has received less scholarly attention than it deserves. It emphasizes the need to create a democratic and humane atmosphere of learning, arouse students' genuine interest, provide students with aesthetic pleasure, and enable them to appreciate and enjoy the beauty and musicality of words in poems. More importantly, this teaching-learning style should aim to secure rapport with students and invite teachers to inspire passion and love of poetry in their students, helping them not to lose the sense of wonder and enthusiasm that should be at the forefront of enjoying poetry. Hence, once students have an interest, feeling, and desire for poetry, they can move to heavier tasks and discussions about poetry and how to further understand and analyze what is being portrayed. It is time the pendulum swung in support of the humanization of education and the building of rapport with students at Sultan Qaboos University.
Keywords: education, humanization, learning style, rapport
Procedia PDF Downloads 246
24745 The Impact of the General Data Protection Regulation on Human Resources Management in Schools
Authors: Alexandra Aslanidou
Abstract:
The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018. It transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice considerably influenced in its day-to-day operations is Human Resource (HR) management, for which the importance of the GDPR cannot be underestimated, because HR processes personal data coming in all shapes and sizes from many different systems and sources. The significance of a properly functioning HR department, specifically in human-centered, service-oriented environments such as the field of education, is decisive: HR operations in schools, conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments that operate in schools and, in order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis in five highly dynamic schools across three EU Member States was attempted.
Keywords: general data protection regulation, human resource management, educational system
Procedia PDF Downloads 101
24744 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data
Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving, so a tremendous amount of real-time spatial data is generated every day, and the growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period regardless of the load on the system, but with a huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data: they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence using the Matching algorithm. Then, we propose a new cost model for database partitioning that keeps the data volume of each partition balanced and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data; it improves the performance of query execution by maximizing the degree of parallel execution. This contributes to QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable and that it outperforms comparable algorithms.
Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query
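The attribute-sequencing idea can be illustrated briefly: attributes whose usage patterns across the frequent queries are close in Hamming distance are chained together into the same vertical partition. The usage matrix and the greedy chaining below are illustrative stand-ins; the paper's Matching algorithm and cost model are richer.

```python
import numpy as np

# Rows: frequent queries; columns: attributes (1 = attribute is used).
usage = np.array([[1, 1, 0, 0, 1],
                  [1, 1, 0, 0, 0],
                  [0, 0, 1, 1, 0],
                  [0, 0, 1, 1, 1]])

def hamming(i, j):
    """Hamming distance between the usage vectors of attributes i and j."""
    return int((usage[:, i] != usage[:, j]).sum())

# Greedy chain: start anywhere, always append the closest unused attribute.
n = usage.shape[1]
order = [0]
while len(order) < n:
    last = order[-1]
    order.append(min((a for a in range(n) if a not in order),
                     key=lambda a: hamming(last, a)))
print("attribute sequence:", order)  # e.g. [0, 1, 4, 2, 3]
```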
Procedia PDF Downloads 158
24743 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment
Authors: P. Venu, Joeju M. Issac
Abstract:
Quality Function Deployment (QFD) is a systematic technique that creates a platform where customer responses can be positively converted into design attributes. The accuracy of a QFD process heavily depends on the data it handles, which are captured from customers or QFD team members. Customized computer programs that perform Quality Function Deployment within a stipulated time have been used by various companies across the globe. These programs rely heavily on the storage and retrieval of data in a common database, and this database must act as a reliable source with minimal missing or erroneous values in order to perform the actual prioritization. This paper introduces a missing/error data handler module that uses a Genetic Algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between the proposed data-handler-module-based deployment and manual deployment.
Keywords: hybrid data handler, QFD, prioritization, module-based deployment
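The fuzzy-number side of such a handler can be sketched as follows, with a missing team rating imputed from the remaining respondents and the result defuzzified for prioritization; the triangular representation and the values are illustrative, and the Genetic Algorithm repair step is not reproduced.

```python
# Sketch: ratings kept as triangular fuzzy numbers (low, modal, high); a
# missing entry is imputed from the other respondents, then the pooled
# rating is defuzzified into a crisp importance weight.

def fuzzy_mean(tfns):
    """Component-wise mean of triangular fuzzy numbers (l, m, u)."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3

# Ratings of one customer requirement by four respondents; respondent 3
# is missing and gets the fuzzy mean of the others as a stand-in.
ratings = [(3, 4, 5), (4, 5, 6), None, (2, 3, 4)]
observed = [r for r in ratings if r is not None]
ratings = [r if r is not None else fuzzy_mean(observed) for r in ratings]

weight = defuzzify(fuzzy_mean(ratings))
print(f"crisp importance weight: {weight:.2f}")
```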
Procedia PDF Downloads 297
24742 Early Childhood Education in a Depressed Economy in Nigeria: Implication in the Classroom
Authors: Ogunnaiya Racheal Taiwo
Abstract:
Children's formative years are crucial to their growth; it is therefore necessary for all stakeholders to ensure that pupils have the enabling quality of life that is essential for realizing their potential. For children to live and grow, they need a secure home, nutritious food, good health care, and quality education. This paper therefore investigates the implications of a depressed economy for the classroom learning of Nigerian children, as Nigeria is currently experiencing the worst economic depression in several decades, which affects a substantial proportion of children. The study is qualitative research adopting a phenomenological approach in which respondents' experiences are examined. Three senatorial districts in Oyo State were considered, and 50 teachers, both male and female, were chosen from each senatorial district for conversational key-informant interviews. The interviews were recorded, transcribed, and presented using thematic analysis. Findings showed that more children have dropped out since the beginning of the year than in previous years, and that learning has become challenging as children now find it harder to acquire learning materials. It was recommended that the government reimburse early childhood schools to lessen the effects of the inability to purchase materials and pay school fees, and that an intervention be made to address and resolve the issues associated with out-of-school children.
Keywords: childhood, classroom, education, depressed economy, poverty
Procedia PDF Downloads 108
24741 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study
Authors: Faisal Aburub, Wael Hadi
Abstract:
Data mining is the process of extracting useful or hidden information from a large database. Extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of the predefined groups. In this paper, we investigate four well-known data mining algorithms for predicting groundwater areas in Jordan: Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN), and Classification Based on Association Rule (CBA). The experimental results indicate that the SVM algorithm outperformed the other algorithms in terms of classification accuracy, precision, and F1 evaluation measures on the datasets of groundwater areas collected from the Jordanian Ministry of Water and Irrigation.
Keywords: classification, data mining, evaluation measures, groundwater
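A sketch of the comparison setup is given below on synthetic stand-in data (the Ministry of Water and Irrigation records are not reproduced here). CBA has no scikit-learn implementation, so only the other three algorithms appear in this sketch.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the groundwater dataset; label 1 could mean
# "groundwater area". Features and sizes are invented for illustration.
X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           random_state=0)

# Cross-validated F1 comparison across the three available classifiers.
for name, model in [("SVM", SVC()),
                    ("Naive Bayes", GaussianNB()),
                    ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    f1 = cross_val_score(model, X, y, cv=5, scoring="f1")
    print(f"{name:12s} mean F1 = {f1.mean():.3f}")
```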
Procedia PDF Downloads 281
24740 Jurisdictional Issues between Competition Law and Data Protection Law in Protection of Privacy of Online Consumers
Authors: Pankhudi Khandelwal
Abstract:
The revenue models of digital giants such as Facebook and Google rely on targeted advertising, and such a model requires huge amounts of consumer data. While data protection law deals with the protection of personal data, this data is acquired by the companies on the basis of consent, performance of a contract, or legitimate interests. This paper analyses the role that competition law can play in closing these loopholes for the protection of the data and privacy of online consumers. Digital markets have certain distinctive features, such as network effects and feedback loops, which give incumbents of these markets a first-mover advantage. This creates a situation where the winner takes all, creating entry barriers and concentration in the market. It has also been seen that this dominant position is then used by the undertakings for leveraging in other markets. This can be harmful to consumers in the form of less privacy, less choice, and stifled innovation, as seen in the Facebook Cambridge Analytica, Google Shopping, and Google Android cases. Therefore, the article aims to provide a legal framework wherein data protection law and competition law can come together to provide balance in regulating digital markets. The issue has become more relevant in light of the Facebook decision by the German competition authority, which held that Facebook had abused its dominant position by not complying with data protection rules, constituting an exploitative practice. The paper looks into the jurisdictional boundaries that the data protection and competition authorities can work from and suggests ex ante regulation through data protection law and ex post regulation through competition law. It further suggests a change in the consumer welfare standard whereby harm to privacy should be considered an indicator of low quality.
Keywords: data protection, dominance, ex ante regulation, ex post regulation
Procedia PDF Downloads 184
24739 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects
Authors: Mai Ghazal, Ahmed Hammad
Abstract:
Cost overruns in construction projects are a worldwide challenge, since cost performance is one of the main measures of success along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors behind cost overruns, and historical project data have been analyzed to extract new and useful knowledge. This research studies and analyzes the effect of some factors causing cost overruns using historical data from completed construction projects, then uses these factors to estimate the probability of a cost overrun occurring and to predict its percentage for future projects. First, an intensive literature review was conducted covering all the factors that cause cost overruns in construction projects, followed by another review of previous research on mining processes for dealing with cost overruns. Second, a data warehouse was proposed and structured that organizations can use to store their future data in a well-organized way so that it can be easily analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analysis factors and suggested predictors for the proposed model.
Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management
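The prediction step can be illustrated with a minimal sketch: a model fitted on historical projects outputs an overrun probability for a new one. Logistic regression, the three factors shown (of the paper's twelve), and all values are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical projects; columns are three illustrative
# factors: duration (months), budget ($M), number of change orders.
X = np.array([[12, 4.0, 3],
              [30, 9.5, 11],
              [18, 6.0, 5],
              [40, 15.0, 14],
              [10, 3.0, 1],
              [24, 8.0, 9]])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = project had a cost overrun

model = LogisticRegression().fit(X, y)
new_project = np.array([[22, 7.0, 8]])
print(f"P(cost overrun) = {model.predict_proba(new_project)[0, 1]:.2f}")
```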
Procedia PDF Downloads 372
24738 Sampling Error and Its Implication for Capture Fisheries Management in Ghana
Authors: Temiloluwa J. Akinyemi, Denis W. Aheto, Wisdom Akpalu
Abstract:
Capture fisheries in developing countries provide significant animal protein and directly support the livelihoods of several communities. However, the misperception of biophysical dynamics owing to a lack of adequate scientific data has contributed to suboptimal management of marine capture fisheries, because yield and catch potentials are sensitive to the quality of catch and effort data. Yet studies on fisheries data collection practices in developing countries are hard to find. This study investigates the data collection methods utilized by fisheries technical officers within the four fishing regions of Ghana. We found that the officers employed data collection and sampling procedures that were not consistent with the technical guidelines curated by the FAO: for example, 50 instead of 166 landing sites were sampled, and 290 instead of 372 canoes were sampled. We argue that such sampling errors could result in the over-capitalization of capture fish stocks and significant losses in resource rents.
Keywords: fisheries data quality, fisheries management, Ghana, sustainable fisheries
Procedia PDF Downloads 95
24737 From De Soto's Solution to Urban Disaster: The Effects of Land Titling Policies on the Development of Cities of the Global South in the Case of Lima Peru
Authors: Jitka Molnarova
Abstract:
Based on De Soto’s idea that a formal land title can provide a secure home and access to credit to poor urban families, a large number of developing countries accepted the formalization of informal settlements as the ultimate solution for their housing crises and struggles with poverty. After two decades of implementation, very little is known about the effects this policy has on the quality of the neighborhoods it produces and on the development of cities in general. Using the capital of Peru -where the solution originated- as a case study, this paper illustrates the negative outcomes this policy has on urban development arguing that land titling encourages 1) expansion of the city often to areas of high physical risk, 2) production of precarious housing on unserviced land, and 3) practices of illegal land trafficking. The evidence is based on interviews with community leaders and officials working at the Cooperation for Formalization of Informal Property (COFOPRI), comparison of satellite images documenting the expansion of Lima in the past twenty years, and a technical evaluation of dozens of houses that have been or are in the process of being granted a land title.Keywords: COFOPRI, De Soto, housing policies, land titling, land trafficking, Lima, Peru, precarious housing, urban expansion
Procedia PDF Downloads 187
24736 Improving the Performance of Road Salt on Anti-Icing
Authors: Mohsen Abotalebi Esfahani, Amin Rahimi
Abstract:
Maintenance and management of road and route infrastructure is one of the most fundamental responsibilities of governments. Several methods have been under investigation for many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand, and gravel is the most common method of deicing, but it can have numerous harmful consequences. Icy or snow-covered roads are one of the major causes of accidents in wet seasons, leading to substantial damage such as loss of time and energy, environmental pollution, destruction of buildings, traffic congestion, and a rising likelihood of accidents. Consequently, every year the government incurs enormous costs to keep routes safe. In this study, asphalt pavement samples were tested, in terms of compressive strength, tensile strength, and resilient modulus, under the influence of magnesium chloride, calcium chloride, sodium chloride, urea, and pure water. The results showed that deicing with calcium chloride solution and urea had the least negative effect on the laboratory specimens, while deicing with pure water had the most negative effect. Hence, simple techniques, new equipment, and reduced use of sand and salt can significantly lower the risks and harmful effects of excessive salt, sand, and gravel use while keeping roads safer.
Keywords: maintenance, sodium chloride, icy road, calcium chloride
Procedia PDF Downloads 285
24735 Improvement of Data Transfer over Simple Object Access Protocol (SOAP)
Authors: Khaled Ahmed Kadouh, Kamal Ali Albashiri
Abstract:
This paper presents an algorithm designed to improve data transfer over the Simple Object Access Protocol (SOAP). The aim of this work is to establish whether using SOAP in exchanging XML messages has any added advantages or not. The results showed that XML messages without SOAP take longer and consume more memory, especially with binary data.
Keywords: JAX-WS, SMTP, SOAP, web service, XML
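As context for such a comparison, the arithmetic below shows what happens to binary data carried inside XML, with and without a SOAP envelope; the figures are illustrative only and do not reproduce the paper's measurements.

```python
import base64

# Back-of-the-envelope size comparison for a binary payload carried as
# XML text. The message body must be text, so binary data is base64
# encoded (roughly 33% larger); the SOAP envelope then adds fixed
# framing on top. Sizes here are illustrative only.
payload = bytes(range(256)) * 40  # 10 240 bytes of "binary" data

plain_xml = b"<data>" + base64.b64encode(payload) + b"</data>"
soap_msg = (b'<soap:Envelope xmlns:soap='
            b'"http://schemas.xmlsoap.org/soap/envelope/">'
            b"<soap:Body>" + plain_xml + b"</soap:Body></soap:Envelope>")

print("raw binary bytes  :", len(payload))
print("XML message bytes :", len(plain_xml))
print("SOAP message bytes:", len(soap_msg))
# Both text forms pay the base64 penalty; the envelope overhead grows
# in relative terms when many small messages are exchanged.
```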
Procedia PDF Downloads 495
24734 Enhancing Healthcare Data Protection and Security
Authors: Joseph Udofia, Isaac Olufadewa
Abstract:
Every day, the size of Electronic Health Records data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019), and $20 billion (2021). These attacks usually involve the exposure of PII; health data is considered PII, and its exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially among patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data are used, it is expedient to review and revamp policies like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR) standard, all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data is shared across different systems, from customers to health insurers and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to data exfiltration and data loss. Though HIPAA homes in on the security of protected health information (PHI) and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards. With advancements in technology and the emergence of new technologies, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and the ever-increasing threats in cyberspace. Healthcare data in the wrong hands is a recipe for disaster, and we must enhance its protection and security to protect the mental health of current and future generations.
Keywords: cloud security, healthcare, cybersecurity, policy and standard
Procedia PDF Downloads 93
24733 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology
Authors: Peristera Baziana
Abstract:
In this paper, we present a network configuration for WDM LANs of passive star topology in which the set of data WDM channels is split into two separate sets with different access rights over them. In particular, a synchronous-transmission WDMA access algorithm is adopted to increase the probability of successful transmission over the data channels and, consequently, to reduce the probability of data packet transmission cancellation, avoiding data channel collisions. To this end, a pre-transmission control access scheme is followed over a separate control channel. An analytical Markovian model is studied, and the average throughput is mathematically derived. Performance is studied for several numbers of data channels and various values of the control phase duration.
Keywords: access algorithm, channels division, collisions avoidance, wavelength division multiplexing
Procedia PDF Downloads 297
24732 Analyzing Tools and Techniques for Classification In Educational Data Mining: A Survey
Authors: D. I. George Amalarethinam, A. Emima
Abstract:
Educational Data Mining (EDM) is one of the newest topics to emerge in recent years; it is concerned with developing methods for analyzing the various types of data gathered from the educational sphere. EDM methods and techniques, together with machine learning algorithms, are used to extract meaningful and usable information from huge databases. For scientists and researchers, realistic applications of machine learning in the EDM sector offer new frontiers and present new problems. One of the most important research areas in EDM is predicting student success: prediction algorithms and techniques must be developed to forecast students' performance, which helps tutors and institutions to boost student performance. This paper examines various classification techniques used in prediction methods and the data mining tools used in EDM.
Keywords: classification technique, data mining, EDM methods, prediction methods
Procedia PDF Downloads 118
24731 Improving Security in Healthcare Applications Using Federated Learning System With Blockchain Technology
Authors: Aofan Liu, Qianqian Tan, Burra Venkata Durga Kumar
Abstract:
Data security is of the utmost importance in the healthcare area, as sensitive patient information is constantly sent around and analyzed by many different parties. The use of federated learning, which enables data to be evaluated locally on devices rather than transferred to a central server, has emerged as a potential solution for protecting the privacy of user information. However, federated learning alone might not be adequate to protect against data breaches and unauthorized access; in this context, the application of blockchain technology could provide the system with extra protection. This study proposes a distributed federated learning system built on blockchain technology in order to enhance security in healthcare, making it possible for a wide variety of healthcare providers to collaborate on data analysis without raising concerns about data confidentiality. The technical aspects of the system, including the design and implementation of distributed learning algorithms, consensus mechanisms, and smart contracts, are also investigated. The proposed technique is a workable approach that addresses concerns about the safety of healthcare data while also fostering collaborative research and data exchange.
Keywords: data privacy, distributed system, federated learning, machine learning
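The federated side of the proposal can be illustrated with a minimal federated-averaging (FedAvg) sketch, in which only model weights leave each site and the server aggregates them weighted by local sample counts; the blockchain layer that would log and validate each update is not modeled here, and all data are synthetic.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A few steps of logistic-regression SGD on one site's private data."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)   # gradient step
    return w

rng = np.random.default_rng(0)
sites = []  # (X, y) pairs that never leave their site
for _ in range(3):
    X = rng.normal(size=(50, 4))
    y = (X @ np.array([1.0, -2.0, 0.5, 0.0]) > 0).astype(float)
    sites.append((X, y))

global_w = np.zeros(4)
for _ in range(10):
    updates = [local_update(global_w, X, y) for X, y in sites]
    counts = np.array([len(y) for _, y in sites], dtype=float)
    global_w = np.average(updates, axis=0, weights=counts)  # FedAvg step
print("global model weights:", np.round(global_w, 2))
```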
Procedia PDF Downloads 136
24730 A Concept of Data Mining with XML Document
Authors: Akshay Agrawal, Anand K. Srivastava
Abstract:
The increasing number of XML datasets available to casual users increases the necessity of investigating techniques to extract knowledge from these data. Data mining is widely applied in the database research area in order to extract frequent correlations of values from both structured and semi-structured datasets. The increasing availability of heterogeneous XML sources has raised a number of issues concerning how to represent and manage these semi-structured data. In recent years, owing to the importance of managing these resources and extracting knowledge from them, many methods have been proposed to represent and cluster them in different ways.
Keywords: XML, similarity measure, clustering, cluster quality, semantic clustering
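One simple structural similarity measure for clustering XML documents is sketched below, the Jaccard overlap of element-path sets; it is an illustrative choice, and real work in this area often uses richer measures such as tree edit distance.

```python
import xml.etree.ElementTree as ET

def path_set(xml_text):
    """Collect every root-to-element tag path in the document."""
    def walk(node, prefix):
        path = f"{prefix}/{node.tag}"
        yield path
        for child in node:
            yield from walk(child, path)
    return set(walk(ET.fromstring(xml_text), ""))

def similarity(a, b):
    """Jaccard index of the two documents' path sets, in [0, 1]."""
    pa, pb = path_set(a), path_set(b)
    return len(pa & pb) / len(pa | pb)

doc1 = "<order><item><name/><qty/></item></order>"
doc2 = "<order><item><name/><price/></item></order>"
print(f"structural similarity: {similarity(doc1, doc2):.2f}")  # 0.60
```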
Procedia PDF Downloads 385