Search results for: toxicity data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25538

24668 Data Management System for Environmental Remediation

Authors: Elizaveta Petelina, Anton Sizo

Abstract:

Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore a key factor for well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all necessary data management aspects, including efficient data handling and interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decisions about required mitigation measures and the assessment of remediation success. The EDMS is a combination of enterprise- and desktop-level data management and Geographic Information System (GIS) tools assembled to assist in environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. The EDMS consists of seven main components: a Geodatabase containing a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component with tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA with measures for QC; a Data Import and Export component with tools and templates to support project data flow; a Lab Data component that connects the EDMS to laboratory information management systems; and a Reporting component with server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada.
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.

Keywords: data management, environmental remediation, geographic information system, GIS, decision making

Procedia PDF Downloads 151
24667 An Efficient Approach to Speed up Non-Negative Matrix Factorization for High Dimensional Data

Authors: Bharat Singh, Om Prakash Vyas

Abstract:

Nowadays, applications in many popular areas deal with high-dimensional data, and various approaches have been developed by researchers over the last few decades to handle such data. One problem with NMF approaches is that a randomized initialization cannot guarantee a global optimum within a limited number of iterations, only a local one. Because of this, we have proposed a new approach that chooses the initial values of the decomposition to tackle the issue of computational expense. We have devised an algorithm for initializing the values of the decomposed matrices based on Particle Swarm Optimization (PSO). Through the experimental results, we show that the proposed method converges much faster than other low-rank approximation techniques such as simple multiplicative NMF and ACLS.
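The multiplicative-update loop that such comparisons are built on can be sketched as follows. The PSO seeding itself is not shown; a random non-negative guess stands in for the PSO-chosen initial factors, so this is a minimal sketch of the baseline updates, not the paper's implementation:

```python
import numpy as np

def nmf_multiplicative(V, W0, H0, n_iter=200, eps=1e-9):
    """Lee-Seung multiplicative updates for V ≈ W @ H (all non-negative).
    W0, H0 are the initial factors; the paper seeds them via PSO."""
    W, H = W0.copy(), H0.copy()
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

rng = np.random.default_rng(0)
V = rng.random((20, 15))        # non-negative data matrix
r = 4                           # target rank
W0 = rng.random((20, r))        # stand-in for a PSO-chosen seed
H0 = rng.random((r, 15))
W, H = nmf_multiplicative(V, W0, H0)
rmse = np.sqrt(((V - W @ H) ** 2).mean())
```

The factors stay non-negative because the updates only rescale entries multiplicatively, which is why the quality of the starting point matters so much.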

Keywords: ALS, NMF, high dimensional data, RMSE

Procedia PDF Downloads 337
24666 Integrating Time-Series and High-Spatial Remote Sensing Data Based on Multilevel Decision Fusion

Authors: Xudong Guan, Ainong Li, Gaohuan Liu, Chong Huang, Wei Zhao

Abstract:

Due to the low spatial resolution of MODIS data, the extraction accuracy of small patches in landscapes with a high degree of fragmentation is greatly limited. To this end, this study combines Landsat data, with its higher spatial resolution, and MODIS data, with its higher temporal resolution, for decision-level fusion. Given the importance of the land heterogeneity factor in the fusion process, it is used as a weighting factor to linearly weight the Landsat classification result and the MODIS classification result. Three levels are used to complete the fusion: the MODIS pixel level, the Landsat pixel level, and an object level that connects the two. The multilevel decision fusion scheme was tested at two sites in the lower Mekong basin. A comparison test showed that, in terms of overall accuracy, classification accuracy improved over the single-data-source classification results. The method was also compared with the two-level combination results and a weighted-sum decision-rule-based approach. The decision fusion scheme is extensible to other multi-resolution decision fusion applications.
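The heterogeneity-weighted linear combination described above can be sketched per pixel as follows. The array shapes and the exact weighting rule are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def fuse_decisions(p_landsat, p_modis, heterogeneity):
    """Linearly weight two per-pixel class-probability maps.

    heterogeneity in [0, 1]: the more fragmented the landscape at a
    pixel, the more weight the fine-resolution (Landsat) result gets.
    p_landsat, p_modis: (n_pixels, n_classes); heterogeneity: (n_pixels,).
    """
    w = heterogeneity[:, None]               # per-pixel weight
    fused = w * p_landsat + (1.0 - w) * p_modis
    return fused.argmax(axis=1)              # fused class labels

p_landsat = np.array([[0.7, 0.3], [0.2, 0.8], [0.2, 0.8]])
p_modis   = np.array([[0.4, 0.6], [0.6, 0.4], [0.6, 0.4]])
het       = np.array([0.9, 0.9, 0.1])        # last pixel is homogeneous
labels = fuse_decisions(p_landsat, p_modis, het)
```

Note how the last two pixels share identical classifier outputs but get different fused labels: the heterogeneity weight alone flips the decision.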

Keywords: image classification, decision fusion, multi-temporal, remote sensing

Procedia PDF Downloads 116
24665 Analysis of Cooperative Learning Behavior Based on the Data of Students' Movement

Authors: Wang Lin, Li Zhiqiang

Abstract:

The purpose of this paper is to analyze cooperative learning behavior patterns based on data of students' movement. The study first reviews cooperative learning theory and its research status, and briefly introduces the k-means clustering algorithm. It then uses the clustering algorithm and mathematical statistics to analyze the activity rhythms of individual students and groups in different functional areas, based on movement data provided by 10 first-year graduate students. It also focuses on the analysis of students' behavior in the learning area and explores patterns of cooperative learning behavior. The results show that the cooperative learning behavior analysis method based on movement data proposed in this paper is feasible: from the data analysis, students' behavioral characteristics and their cooperative learning behavior patterns can be identified.
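A minimal version of the clustering step, grouping 2-D position samples into activity areas, might look like the following. The seeding rule and the toy movement trace are our own choices for illustration, not the study's data:

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Plain Lloyd's k-means with deterministic farthest-point seeding,
    used here to group position samples into activity areas."""
    centers = [X[0]]
    for _ in range(k - 1):                    # maximin seeding
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)             # nearest center per sample
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# toy movement trace: two activity zones, e.g. a learning area and a lounge
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.3, (30, 2)),
               rng.normal([5, 5], 0.3, (30, 2))])
labels, centers = kmeans(X, k=2)
```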

Keywords: behavior pattern, cooperative learning, data analysis, k-means clustering algorithm

Procedia PDF Downloads 180
24664 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
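The mapping step, which embeds the data with the leading non-trivial eigenvectors of the random-walk normalization of the graph Laplacian, can be sketched as follows. The kernel bandwidth heuristic and the input data are placeholders, not the article's example:

```python
import numpy as np

def diffusion_map(X, n_components=2, epsilon=None):
    """Embed X with the leading non-trivial eigenvectors of the
    random-walk (diffusion) operator built from a Gaussian kernel."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    if epsilon is None:
        epsilon = np.median(d2)        # common bandwidth heuristic
    K = np.exp(-d2 / epsilon)          # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)   # row-normalized Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]    # skip the trivial constant eigenvector
    return vecs.real[:, idx] * vals.real[idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))          # placeholder high-dimensional data
Y = diffusion_map(X, n_components=2)   # 2-D diffusion coordinates
```

Because P is similar to a symmetric matrix, its spectrum is real; the top eigenvalue 1 belongs to the constant vector, which is why it is skipped.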

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 96
24665 A Security Cloud Storage Scheme Based on Accountable Key-Policy Attribute-Based Encryption without Key Escrow

Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun

Abstract:

With the development of cloud computing, more and more users utilize cloud storage services. However, several issues exist: 1) the cloud server steals the shared data; 2) sharers collude with the cloud server to steal the shared data; 3) the cloud server tampers with the shared data; 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the Advanced Encryption Standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. The data are encrypted to protect privacy, and hash algorithms are used to prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
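The tamper-detection part of the scheme, storing a digest alongside the uploaded blob, reduces to a few lines with the standard library. The AES encryption step is omitted here (it needs a third-party library); in the scheme the digest would be taken over the ciphertext, and the blob name is an invented example:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest kept by the owner (or a verifier) for integrity checks."""
    return hashlib.sha256(data).hexdigest()

# owner uploads a blob (in the full scheme it would be AES-encrypted first)
blob = b"shared-records-v1"
stored_digest = digest(blob)

def verify(data: bytes, expected: str) -> bool:
    """Detect tampering by the cloud server: recompute and compare."""
    return digest(data) == expected
```

Any single-bit change in the stored blob changes the digest, so a mismatch reveals tampering without the verifier needing the plaintext.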

Keywords: cloud storage security, sharing storage, attributes, Hash algorithm

Procedia PDF Downloads 386
24662 Geochemical Characteristics and Chemical Toxicity: Appraisal of Groundwater Uranium With Other Geogenic Contaminants in Various Districts of Punjab, India

Authors: Tanu Sharma, Bikramjit Singh Bajwa, Inderpreet Kaur

Abstract:

Monitoring of groundwater in Tarn-Taran, Bathinda, Faridkot and Mansa districts of Punjab state, India is essential where this freshwater resource is being over-exploited causing quality deterioration, groundwater depletion and posing serious threats to residents. The present integrated study was done to appraise quality and suitability of groundwater for drinking/irrigation purposes, hydro-geochemical characteristics, source identification and associated health risks. In the present study, groundwater of various districts of Punjab state was found to be heavily contaminated with As followed by U, thus posing high cancerous risks to local residents via ingestion, along with minor contamination of Fe, Mn, Pb and F−. Most health concerns in the study region were due to the elevated concentrations of arsenic in groundwater with average values of 130 µg L-1, 176 µg L-1, 272 µg L-1 and 651 µg L-1 in Tarn-Taran, Bathinda, Faridkot and Mansa districts, respectively, which is quite high as compared to the safe limit as recommended by BIS i.e. 10 µg L-1. In Tarn-Taran, Bathinda, Faridkot and Mansa districts, average uranium contents were found to be 37 µg L-1, 88 µg L-1, 61 µg L-1 and 104 µg L-1, with 51 %, 74 %, 61 % and 71 % samples, respectively, being above the WHO limit of 30 µg L-1 in groundwater. Further, the quality indices showed that groundwater of study region is suited for irrigation but not appropriate for drinking purposes. Hydro-geochemical studies revealed that most of the collected groundwater samples belonged to Ca2+ - Mg2+ - HCO3- type showing dominance of MgCO3 type which indicates the presence of temporary hardness in groundwater. Rock-water reactions and reverse ion exchange were the predominant factors for controlling hydro-geochemistry in the study region. Dissolution of silicate minerals caused the dominance of Na+ ions in the aquifers of study region. 
Multivariate statistics revealed that, along with geogenic sources, the contribution of anthropogenic activities such as the injudicious application of agrochemicals and domestic waste discharge was also very significant. The results dispel the myth that uranium is the only root cause of the large number of cancer patients in the study region, as arsenic and mercury were also present in groundwater at levels of health concern.

Keywords: uranium, trace elements, multivariate data analysis, risk assessment

Procedia PDF Downloads 69
24661 Combined Effect of Cadmium and Municipal Solid Waste Compost Addition on Physicochemical and Biochemical Properties of Soil and Lolium Perenne Production

Authors: Sonia Mbarki, Marian Brestic, Artemio Cerda, Naceur Jedidi, Jose Antonnio Pascual, Chedly Abdelly

Abstract:

We monitored the effect of adding a bio-amendment (compost) to an agricultural soil used for growing Lolium perenne irrigated with a 50 µM CdCl2 solution, assessing physicochemical soil characteristics and plant production under laboratory conditions. Microbial activity indexes (acid phosphatase, β-glucosidase, urease, and dehydrogenase) were also determined. Basal respiration was the most affected index, while enzymatic activities and microbial biomass decreased under the cadmium treatments. This clay soil, with its higher pH, showed inhibition of basal respiration. Our results provide evidence for the ameliorating effect of compost on plant growth even when the soil received a 50 µmol L-1 cadmium solution. Soil heavy metal concentrations depended on the metal type and increased substantially with cadmium addition and with compost addition, but the recorded values were below the toxicity limits for soils and plants, except for cadmium.

Keywords: compost, enzymatic activity, lolium perenne, bioremediation

Procedia PDF Downloads 372
24660 The Study on Life of Valves Evaluation Based on Tests Data

Authors: Binjuan Xu, Qian Zhao, Ping Jiang, Bo Guo, Zhijun Cheng, Xiaoyue Wu

Abstract:

Astronautical valves are key units in the engine systems of astronautical products; their reliability influences the outcome of rocket or missile launches and can even lead to damage to staff and devices on the ground. Moreover, failure in the engine system may affect the hitting accuracy and flight range of missiles. High reliability is therefore essential to astronautical products. A number of studies have estimated valves' reliability based on only a few failure test data; this paper proposes a new method. Based on the tests corresponding to the different failure modes, it takes advantage of data acquired from temperature, vibration, and action tests to estimate the reliability of each failure mode; it then regards these three kinds of tests as three stages in the product's process and integrates the results to obtain the valves' overall reliability. Through comparison of the results obtained from test data and simulated data, we illustrate how to obtain valves' reliability from scarce failure data organized by failure mode and show that the results are effective and rational.
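Treating the three test campaigns as serial stages, the integration step amounts to multiplying per-mode reliability estimates. The success counts below are invented for illustration; the paper's actual estimators for censored data are more involved:

```python
from math import prod

def stage_reliability(successes: int, trials: int) -> float:
    """Naive point estimate of one failure mode's reliability from test data."""
    return successes / trials

# temperature, vibration, and action tests treated as three serial stages
tests = {"temperature": (48, 50), "vibration": (49, 50), "action": (50, 50)}
stage_r = {name: stage_reliability(s, n) for name, (s, n) in tests.items()}

# a valve survives only if it passes every stage, hence the product
valve_reliability = prod(stage_r.values())
```

The series assumption means any single failure mode failing the product fails, which is the conservative reading of "three stages in the product's process".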

Keywords: censored data, temperature tests, valves, vibration tests

Procedia PDF Downloads 337
24659 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences

Authors: C. Xavier Mendieta, J. J McArthur

Abstract:

Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have produced a significant amount of publicly available data, giving researchers a unique opportunity to develop location-specific energy and carbon emission benchmarks, which can in turn be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario's Ministry of Energy for post-secondary educational institutions are being used to develop a series of building archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) careful data screening and outlier identification are essential to develop a valid dataset; (2) the key features used to model the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluating the validity of the reported data.
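One common screening rule for the data-cleaning and outlier-identification step is Tukey's IQR fence. The energy-use intensities below are made-up numbers with one planted reporting error, not values from the Ontario data set:

```python
def iqr_outliers(values, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's rule)."""
    xs = sorted(values)

    def quantile(q):
        # linear interpolation between order statistics
        pos = q * (len(xs) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# hypothetical energy-use intensities (kWh/m2/yr) for residences;
# 1200 is a planted reporting error to screen out before benchmarking
eui = [180, 195, 210, 205, 190, 220, 185, 1200, 200, 215]
outliers = iqr_outliers(eui)
```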

Keywords: building archetypes, data analysis, energy benchmarks, GHG emissions

Procedia PDF Downloads 298
24658 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology has entered the era of parallel computing, with a trend towards sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to go along with this trend: it gathers more and more computing capability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data. Data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are substantial in comparison to traditional algorithms.
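The decomposition idea, running the same instructions over every pair of simple objects at once, can be illustrated with a vectorized axis-aligned bounding-box test. NumPy vectorization stands in here for SIMD lanes, and the boxes are arbitrary examples, not BIM geometry:

```python
import numpy as np

def aabb_collisions(mins, maxs):
    """All-pairs overlap test for axis-aligned bounding boxes.

    Each complex object is assumed decomposed into simple boxes; the
    separation test below applies the same operations to every pair
    simultaneously. mins, maxs: (n, 3) arrays of box corners.
    """
    # boxes i and j are separated iff they are disjoint on some axis
    sep = (mins[:, None, :] > maxs[None, :, :]) | (maxs[:, None, :] < mins[None, :, :])
    hit = ~sep.any(axis=2)          # overlap iff no separating axis
    np.fill_diagonal(hit, False)    # a box does not collide with itself
    i, j = np.nonzero(np.triu(hit)) # report each pair once
    return list(zip(i.tolist(), j.tolist()))

mins = np.array([[0, 0, 0], [0.5, 0.5, 0.5], [5, 5, 5]], float)
maxs = np.array([[1, 1, 1], [1.5, 1.5, 1.5], [6, 6, 6]], float)
pairs = aabb_collisions(mins, maxs)
```

Because every pair is tested by identical branch-free array operations, the workload scales with added lanes or cores without changing the program, which is the SIMD property the abstract emphasizes.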

Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability

Procedia PDF Downloads 284
24657 Changing Arbitrary Data Transmission Period by Using Bluetooth Module on Gas Sensor Node of Arduino Board

Authors: Hiesik Kim, Yong-Beom Kim, Jaheon Gu

Abstract:

Internet of Things (IoT) applications are widely serviced and spread worldwide, and local wireless data transmission techniques must be developed to keep pace. Bluetooth is a wireless data communication technique defined by the Bluetooth Special Interest Group (SIG); it uses the 2.4 GHz frequency range and exploits frequency hopping to avoid collisions with other devices. For the experiment, equipment for transmitting measured data was built using an Arduino open-source hardware board, a gas sensor, and a Bluetooth module, and an algorithm controlling the transmission rate is demonstrated. The rate-control experiment was carried out by developing an Android application that receives the measured data, and the results show that the transmission rate can be controlled. In the future, the communication algorithm will need improvement, because a few errors occur when data is transmitted or received.

Keywords: Arduino, Bluetooth, gas sensor, IoT, transmission

Procedia PDF Downloads 274
24656 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery

Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley

Abstract:

Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data to achieve higher accuracy in localizing a mobile robot than would be possible with any individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for environments like oil and gas refineries due to their sparse and featureless nature. We have studied the individual contribution of each sensor's data to the overall accuracy achieved by the fusion process. A sequential-update Extended Kalman Filter (EKF) with validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
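The correction half of a sequential-update filter with a validation gate can be sketched as follows. For brevity each sensor observes the state directly (H = I) and the prediction step is omitted, so this illustrates the gating idea rather than the authors' full filter; all measurements and noise levels are invented:

```python
import numpy as np

def sequential_update(x, P, measurements, gate=9.0):
    """One correction step processing each sensor in turn.

    x: state estimate (n,); P: covariance (n, n);
    measurements: list of (z, R) pairs, each observing the state directly.
    An innovation whose squared Mahalanobis distance exceeds the gate
    is rejected as an outlier.
    """
    for z, R in measurements:
        y = z - x                        # innovation
        S = P + R                        # innovation covariance (H = I)
        d2 = y @ np.linalg.solve(S, y)   # squared Mahalanobis distance
        if d2 > gate:                    # validation gate: skip outlier
            continue
        K = P @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K) @ P
    return x, P

x = np.zeros(2)                          # robot (x, y) position estimate
P = np.eye(2) * 4.0                      # large initial uncertainty
z_gps  = (np.array([1.0, 1.2]), np.eye(2) * 1.0)
z_wifi = (np.array([0.9, 1.1]), np.eye(2) * 2.0)
z_bad  = (np.array([40.0, -35.0]), np.eye(2) * 1.0)   # corrupt fix, gated out
x, P = sequential_update(x, P, [z_gps, z_wifi, z_bad])
```

Processing sensors one at a time keeps each update a small matrix inversion and lets the gate drop a single bad sensor without discarding the others.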

Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter

Procedia PDF Downloads 468
24655 Molecular Motors in Smart Drug Delivery Systems

Authors: Ainoa Guinart, Maria Korpidou, Daniel Doellerer, Cornelia Palivan, Ben L. Feringa

Abstract:

Stimuli-responsive systems arise from the need to address needs unmet by current molecular drugs. Our study presents the design of a delivery system with high spatiotemporal control and tuneable release profiles. We study the incorporation of a hydrophobic synthetic molecular motor into PDMS-b-PMOXA block copolymer vesicles to create a self-assembled system. We prove their successful incorporation and selective activation by low-powered visible light (λ = 430 nm, 6.9 mW). We trigger the release of a fluorescent dye with high release efficiencies over sequential cycles (up to 75%), with the ability to turn the release behaviour on and off on demand by light irradiation. Low concentrations of photo-responsive units, down to 1 mol% of molecular motor, are proven to trigger release. Finally, we test our system under relevant physiological conditions using a lung cancer cell line and the encapsulation of an approved drug. Similar levels of cell viability are observed compared to the freely administered drug, showing the potential of our platform to deliver functional drugs on demand with the same efficiency and lower toxicity.

Keywords: molecular motor, polymer, drug delivery, light-responsive, cancer, self-assembly

Procedia PDF Downloads 128
24654 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities

Authors: Salman Naseer

Abstract:

One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands not only require more and more capacity on the transmission channels but also lead to resource over-provisioning to meet resilience requirements, and thus to unavoidable waste due to data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges of these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region and sent to control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data via the existing daily mobility of vehicles than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.

Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission

Procedia PDF Downloads 136
24653 Ethnobotanical Survey on the Use of Herbal Medicine at Children in Algeria

Authors: Metahri Leyla

Abstract:

Herbal medicine is one of the oldest medicines in the world. It constitutes an interesting alternative for treating and curing without creating new diseases. Despite the progress of medicine, the increase in the number of doctors, and the creation of social security, many parents have resorted to herbal medicine for their children; they are increasingly asking for "natural remedies", "without risk", for their children. Herbal tea is a very accessible way to enjoy the benefits of herbal medicine. Accordingly, the objective of our study is to obtain detailed information on the composition and mode of administration of these herbal teas and to identify the different plants used, their beneficial effects, and their possible toxicity. The current research work is an ethnobotanical survey carried out from January 6, 2021, to February 19, 2021, by means of an electronic questionnaire of 753 respondents, all single or multiparous mothers. The results reveal that a total of 684 mothers used herbal teas for their infants, involving 55 herbal remedies for several indications, the most sought-after being the carminative effect and relief of colic; 9% of users noticed undesirable effects linked to the administration of herbal teas to their infants. In conclusion, the use of herbal teas as a natural remedy by Algerian mothers is a widely accepted practice; however, the "natural" character of the plants does not mean that they are harmless.

Keywords: herbal medicine, herbal teas, children, mothers, medicinal plants

Procedia PDF Downloads 132
24652 Breast Cancer Cellular Immunotherapies

Authors: Zahra Shokrolahi, Mohammad Reza Atashzar

Abstract:

The goals of treating patients with breast cancer are to cure the disease, prolong survival, and improve quality of life. Immune cells in the tumor microenvironment have an important role in regulating tumor progression. The term cellular immunotherapy refers to the administration of living cells to a patient; this type of immunotherapy can be active, such as a dendritic cell (DC) vaccine, in that the cells can stimulate an anti-tumour response in the patient, or the therapy can be passive, whereby the cells have intrinsic anti-tumour activity; this is known as adoptive cell transfer (ACT) and includes the use of autologous or allogeneic lymphocytes that may, or may not, be modified. The most important breast cancer cellular immunotherapies involve the use of T cells and natural killer (NK) cells in adoptive cell transfer, as well as dendritic cell vaccines. T cell-based therapies include tumour-infiltrating lymphocytes (TILs), engineered TCR-T cells, chimeric antigen receptor (CAR) T cells, gamma-delta (γδ) T cells, and natural killer T (NKT) cells. NK cell-based therapies include lymphokine-activated killer (LAK) cells, cytokine-induced killer (CIK) cells, and CAR-NK cells. Adoptive cell therapy has both advantages and disadvantages. TILs are strictly directed against tumor-specific antigens but are inactive against tumor changes due to immunoediting. CIK cells have an MHC-independent cytotoxic effect but need concurrent high-dose IL-2 administration. CAR T cells are MHC-independent; they overcome tumor MHC molecule downregulation; they are potent in recognizing any cell-surface antigen (protein, carbohydrate, or glycolipid); they are applicable to a broad range of patients and T cell populations; and large numbers of tumor-specific cells can be produced in a moderately short period of time. However, CAR T cells can target only cell-surface antigens, and lethal toxicity due to cytokine storm has been reported.
Here we present the most popular cancer cellular immunotherapy approaches and discuss their clinical relevance with reference to data acquired from clinical trials. To date, clinical experience and efficacy suggest that combining more than one immunotherapy intervention, in conjunction with other treatment options like chemotherapy, radiotherapy, and targeted or epigenetic therapy, should guide the way to a cancer cure.

Keywords: breast cancer, cell therapy, CAR T cell, CIK cells

Procedia PDF Downloads 123
24651 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required for certain applications. Data management in fog nodes is strenuous because fog networks are dynamic in their availability and hardware capability. It becomes more challenging when nodes in the network have short lifespans, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, as a conventional static way of managing the data does not work in fog networks. The proposed solution discusses a protocol that defines sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node using reinforcement learning, so that access to the data is determined dynamically based on the requests.
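A toy version of the sensitivity-level rule might look like the following. The three levels and the "writes need one level more clearance than reads" rule are our own assumptions for illustration; in the paper the policy is learned dynamically by the reinforcement-learning agent:

```python
from dataclasses import dataclass

# assumed sensitivity ladder; the paper does not fix the number of levels
SENSITIVITY = {"public": 0, "internal": 1, "restricted": 2}

@dataclass
class FogRecord:
    key: str
    payload: bytes
    level: int          # sensitivity assigned when the record is written

def can_access(node_clearance: int, record: FogRecord, op: str) -> bool:
    """Static stand-in for the learned policy: reads need clearance at
    the record's level, writes one level above (assumed rule)."""
    need = record.level + (1 if op == "write" else 0)
    return node_clearance >= need

rec = FogRecord("sensor/42", b"reading", SENSITIVITY["internal"])
```

An RL-based variant would replace `can_access` with a policy whose decision depends on the request history rather than a fixed threshold.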

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 54
24650 Ecological Risk Aspects of Essential Trace Metals in Soil Derived From Gold Mining Region, South Africa

Authors: Lowanika Victor Tibane, David Mamba

Abstract:

The human body, animals, and plants depend on certain essential metals in permissible quantities for their survival. Excessive metal concentrations may cause severe malfunctioning of organisms and can even be fatal in extreme cases. Because of gold mining in the Witwatersrand basin in South Africa, enormous untreated mine dumps contain elevated concentrations of essential trace elements. Elevated quantities of trace metals have a direct negative impact on soil quality for different land-use types, reduce soil efficiency for plant growth, and affect human and animal health. A total of 21 subsoil samples were examined using inductively coupled plasma optical emission spectrometry and X-ray fluorescence methods, and the results revealed mean concentrations, reported in mg/kg, of Fe (36,433.39) > S (5,071.83) > Cu (1,717.28) > Mn (612.81) > Cr (74.52) > Zn (68.67) > Ni (40.44) > Co (9.63) > P (3.49) > Mo (2.74). Using various contamination indices, it was found that the surveyed sites are on average moderately contaminated with Co, Cr, Cu, Mn, Ni, S, and Zn. The ecological risk assessment revealed a low ecological risk for Cr, Ni, and Zn, whereas Cu poses a very high ecological risk.
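Contamination and ecological-risk indices of the Hakanson type are simple ratios and weighted sums: a contamination factor CF = C / C_background per metal, a risk term Er = Tr × CF, and RI as their sum. The background concentrations below are generic values assumed for illustration, not the study's reference values, while the measured concentrations are taken from the abstract:

```python
def ecological_risk(concentration, background, toxic_response):
    """Hakanson-style indices: CF = C / C_background, Er = Tr * CF,
    RI = sum of Er over the considered metals."""
    cf = {m: concentration[m] / background[m] for m in concentration}
    er = {m: toxic_response[m] * cf[m] for m in concentration}
    return cf, er, sum(er.values())

conc       = {"Cu": 1717.28, "Cr": 74.52, "Ni": 40.44, "Zn": 68.67}  # mg/kg, from the study
background = {"Cu": 45.0, "Cr": 90.0, "Ni": 68.0, "Zn": 95.0}        # assumed backgrounds
tr         = {"Cu": 5, "Cr": 2, "Ni": 5, "Zn": 1}                    # Hakanson toxic-response factors
cf, er, ri = ecological_risk(conc, background, tr)
```

Even with placeholder backgrounds, Cu dominates the risk sum by two orders of magnitude, consistent with the abstract's finding that Cu carries the highest ecological risk.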

Keywords: essential trace elements, soil contamination, contamination indices, toxicity, descriptive statistics, ecological risk evaluation

Procedia PDF Downloads 86
24649 An Automated Approach to Consolidate Galileo System Availability

Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt

Abstract:

Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system. An extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide high-quality positioning information to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable quick availability consolidation, but it also provides first steps towards improving the data quality of the maintenance tickets used for the analysis. This includes data import and data preparation, with a focus on processing the strings used for classification and on identifying faulty data. Furthermore, the tool can handle small amounts of data, which is a major constraint when the aim is to provide accurate statistics.
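The consolidation step can be sketched as follows. The ticket fields, classification labels, and the convention that only corrective outages count against availability are assumptions for illustration, not the actual Galileo data model:

```python
from datetime import datetime

# Hypothetical sketch: consolidating operational availability from
# maintenance tickets, including basic string cleaning before classification.
TICKETS = [
    {"element": "GSS-01", "start": "2023-03-01 02:00", "end": "2023-03-01 06:00", "class": " Corrective "},
    {"element": "GSS-01", "start": "2023-03-10 00:00", "end": "2023-03-10 12:00", "class": "preventive"},
    {"element": "GSS-02", "start": "2023-03-05 08:00", "end": "2023-03-05 09:30", "class": "CORRECTIVE"},
]

FMT = "%Y-%m-%d %H:%M"

def normalise(label):
    # Data preparation: trim whitespace and unify case before classifying.
    return label.strip().lower()

def outage_hours(ticket):
    start = datetime.strptime(ticket["start"], FMT)
    end = datetime.strptime(ticket["end"], FMT)
    return (end - start).total_seconds() / 3600.0

def availability(element, period_hours=31 * 24):
    # Only unscheduled (corrective) downtime counts against availability here.
    down = sum(outage_hours(t) for t in TICKETS
               if t["element"] == element and normalise(t["class"]) == "corrective")
    return 1.0 - down / period_hours
```

Inconsistent labels such as " Corrective " and "CORRECTIVE" are folded together by `normalise`, which is the kind of string processing and faulty-data detection the abstract mentions.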

Keywords: availability, data quality, system performance, Galileo, aerospace

Procedia PDF Downloads 157
24648 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in industrial processes. These include multivariate statistical methods, representation techniques in reduced spaces, kernel-based nonlinear techniques, etc. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added in order to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data. The results showed that the monitoring performance was improved significantly in terms of the detection success rate for process faults.
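A toy version of a kernel-based monitoring statistic is sketched below: a control limit is learned from normal-operation data only (fault data being scarce, as noted above), and a new sample whose average kernel similarity to that reference set falls below the limit raises a warning. This is a simplified stand-in for the paper's scheme, with an assumed Gaussian kernel and synthetic data:

```python
import numpy as np

# Minimal sketch of kernel-based fault detection trained on normal data only.
rng = np.random.default_rng(0)
normal_data = rng.normal(0.0, 1.0, size=(200, 4))  # synthetic normal batch measurements

def rbf(x, y, gamma=0.5):
    # Gaussian (RBF) kernel between two measurement vectors.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def similarity(sample):
    # Monitoring statistic: mean kernel similarity to the normal reference set.
    return np.mean([rbf(sample, row) for row in normal_data])

# Control limit set from the normal data itself (a low percentile).
limit = np.percentile([similarity(row) for row in normal_data], 1)

def is_fault(sample):
    # A sample far from normal operation has low similarity -> warning.
    return similarity(sample) < limit
```

A sample near the centre of normal operation stays above the limit, while one far outside it falls essentially to zero similarity and is flagged.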

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 318
24647 Enhanced Decolourization and Biodegradation of Textile Azo and Xanthene Dyes by Using Bacterial Isolates

Authors: Gimhani Madhushika Hewayalage, Thilini Ariyadasa, Sanja Gunawardena

Abstract:

In Sri Lanka, the largest contribution to industrial export earnings comes from the textile and apparel industry. However, this industry generates huge quantities of effluent containing unfixed dyes, which increase the effluent's colour and toxicity, thereby leading to environmental pollution. Therefore, the effluent should be properly treated prior to its release into the environment. Biological treatment has now captured much attention as an environmentally friendly and cost-competitive effluent decolourization method, owing to the drawbacks of physical and chemical treatment techniques. The present study has focused on identifying the dye-decolourizing potential of several bacterial isolates obtained from the effluent of the local textile industry. Yellow EXF, Red EXF, Blue EXF, Nova Black WNN, and Nylosan-Rhodamine-EB dyes were selected for the study to represent different chromophore groups, such as azo and xanthene. The decolourization rate of each dye was investigated by employing distinct bacterial isolates. The bacterial isolate which exhibited effective dye-decolourizing potential was identified as Proteus mirabilis using 16S rRNA gene sequencing analysis. The high decolourizing rates of the identified bacterial strain indicate its potential applicability in the treatment of dye-containing wastewaters.

Keywords: azo, bacterial, biological, decolourization, xanthene

Procedia PDF Downloads 248
24646 Dissipation of Tebuconazole in Cropland Soils as Affected by Soil Factors

Authors: Bipul Behari Saha, Sunil Kumar Singh, P. Padmaja, Kamlesh Vishwakarma

Abstract:

A dissipation study of tebuconazole in alluvial, black, and deep-black clayey soils collected from paddy, mango, and peanut cropland of a tropical agro-climatic zone of India, at three concentration levels, was carried out to monitor water contamination through persistent residual toxicity. The soil-slurry samples were analyzed by capillary GC-NPD following ultrasound-assisted extraction (UAE) and a cleanup process. An excellent linear relationship between peak area and concentration was obtained in the range 1 to 50 μg kg⁻¹. The detection (S/N, 3 ± 0.5) and quantification (S/N, 7.5 ± 2.5) limits were 3 and 10 μg kg⁻¹, respectively. Spiked recoveries from 96.28 to 99.33% were achieved at the 5 and 20 μg kg⁻¹ levels, and the method precision (% RSD) was ≤ 5%. The dissipation of tebuconazole in the soils fitted a first-order kinetic model, with half-lives between 34.48 and 48.13 days. The soil organic carbon (SOC) content correlated well with the dissipation rate constants (DRC) of the fungicide tebuconazole: an increase in the SOC content resulted in faster dissipation. The results indicate that the soil organic carbon and tebuconazole concentrations play a dominant role in the dissipation processes. The initial concentrations illustrated that the degradation rate of tebuconazole in soils was concentration-dependent.
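The first-order kinetics underlying the reported half-lives can be written out directly: C(t) = C₀·exp(−kt), with half-life t½ = ln 2 / k. The sketch below back-calculates the rate constants from the half-life range given in the abstract; the initial concentration used in the example is illustrative:

```python
import math

# First-order dissipation: C(t) = C0 * exp(-k * t), half-life = ln(2) / k.

def rate_constant(half_life_days):
    # Back-calculate the first-order rate constant from a half-life.
    return math.log(2) / half_life_days

def residue(c0, half_life_days, t_days):
    # Remaining concentration after t_days, given the soil's half-life.
    return c0 * math.exp(-rate_constant(half_life_days) * t_days)

k_fast = rate_constant(34.48)  # fastest-dissipating soil in the study, ~0.0201 per day
k_slow = rate_constant(48.13)  # slowest-dissipating soil, ~0.0144 per day
```

By construction, the residue falls to half its initial value after exactly one half-life, and the soil with the shorter half-life has the larger rate constant, consistent with the SOC correlation described above.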

Keywords: cropland soil, dissipation, laboratory incubation, tebuconazole

Procedia PDF Downloads 248
24645 The Impact of the General Data Protection Regulation on Human Resources Management in Schools

Authors: Alexandra Aslanidou

Abstract:

The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice whose day-to-day operations are considerably influenced is Human Resource (HR) management, for which the importance of the GDPR cannot be overstated. That is because HR processes personal data coming in all shapes and sizes from many different systems and sources. The proper functioning of an HR department is decisive, specifically in human-centered, service-oriented environments such as the education field, because HR operations in schools, when conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments that operate in schools. In order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis of five highly dynamic schools across three EU Member States was undertaken.

Keywords: general data protection regulation, human resource management, educational system

Procedia PDF Downloads 97
24644 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, there is a tremendous amount of real-time spatial data generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period regardless of the load on the system. But with a huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data, as they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence by the use of the Matching algorithm. Then, we propose a new cost model used for database partitioning, for keeping the data volume of each partition balanced, and for providing parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This leads to QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data.
The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.
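The core idea of usage-driven vertical partitioning can be sketched as follows: attributes whose query-usage vectors are close in Hamming distance (one of the keywords below) end up in the same fragment, so that frequent queries touch as few fragments as possible. The greedy grouping here is a deliberate simplification of the Matching algorithm, and the queries and attributes are illustrative:

```python
# Simplified sketch of usage-driven vertical partitioning.
USAGE = {  # attribute -> usage over queries q1..q4 (1 = the query reads it)
    "id":        (1, 1, 1, 1),
    "location":  (1, 1, 0, 0),
    "timestamp": (1, 1, 0, 0),
    "speed":     (0, 0, 1, 1),
    "payload":   (0, 0, 1, 1),
}

def hamming(u, v):
    # Number of queries on which two attributes' usage patterns disagree.
    return sum(a != b for a, b in zip(u, v))

def partition(usage, max_distance=1):
    # Greedy grouping: attach an attribute to the first fragment whose
    # representative is within max_distance, else open a new fragment.
    fragments = []
    for attr, vec in usage.items():
        for frag in fragments:
            if hamming(usage[frag[0]], vec) <= max_distance:
                frag.append(attr)
                break
        else:
            fragments.append([attr])
    return fragments
```

On this toy workload, `location`/`timestamp` and `speed`/`payload` are co-accessed and land in the same fragments, so queries q1-q2 and q3-q4 can each be served from one narrow fragment in parallel.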

Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query

Procedia PDF Downloads 153
24643 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment

Authors: P. Venu, Joeju M. Issac

Abstract:

Quality Function Deployment (QFD) is a systematic technique that creates a platform where customer responses can be positively converted into design attributes. The accuracy of a QFD process heavily depends on the data that it handles, which is captured from customers or QFD team members. Customized computer programs that perform Quality Function Deployment within a stipulated time have been used by various companies across the globe. These programs heavily rely on the storage and retrieval of data in a common database. This database must act as a reliable source, with minimal missing or erroneous values, in order to perform accurate prioritization. This paper introduces a missing/error data handler module which uses a Genetic Algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between the proposed data-handler-module-based deployment and manual deployment.
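The fuzzy side of such prioritization can be sketched with triangular fuzzy numbers: each customer rates a requirement as a (pessimistic, likely, optimistic) triple, the ratings are averaged, and the centroid is used to rank requirements. The requirements and ratings below are illustrative, and this sketch covers only the fuzzy-aggregation step, not the Genetic Algorithm component:

```python
# Illustrative fuzzy prioritization of customer requirements (sesame oil example).
RATINGS = {  # requirement -> list of (pessimistic, likely, optimistic) ratings
    "aroma":      [(6, 7, 8), (7, 8, 9), (5, 7, 9)],
    "shelf life": [(3, 4, 6), (4, 5, 6), (2, 4, 5)],
    "clarity":    [(5, 6, 7), (6, 7, 8), (4, 6, 8)],
}

def aggregate(ratings):
    # Component-wise average of the triangular fuzzy ratings.
    n = len(ratings)
    return tuple(sum(r[i] for r in ratings) / n for i in range(3))

def centroid(tfn):
    # Centroid defuzzification of a triangular fuzzy number (a, b, c).
    return sum(tfn) / 3.0

def prioritize(all_ratings):
    # Rank requirements by defuzzified aggregate rating, highest first.
    return sorted(all_ratings, key=lambda req: centroid(aggregate(all_ratings[req])), reverse=True)
```

A missing rating could then be replaced by an estimated fuzzy triple before aggregation, which is where a search procedure such as a Genetic Algorithm would come in.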

Keywords: hybrid data handler, QFD, prioritization, module-based deployment

Procedia PDF Downloads 290
24642 Predicting Groundwater Areas Using Data Mining Techniques: Groundwater in Jordan as Case Study

Authors: Faisal Aburub, Wael Hadi

Abstract:

Data mining is the process of extracting useful or hidden information from a large database. The extracted information can be used to discover relationships among features, where data objects are grouped according to logical relationships, or to assign unseen objects to one of the predefined groups. In this paper, we aim to investigate four well-known data mining algorithms in order to predict groundwater areas in Jordan. These algorithms are Support Vector Machines (SVMs), Naïve Bayes (NB), K-Nearest Neighbor (kNN), and Classification Based on Association Rule (CBA). The experimental results indicate that the SVMs algorithm outperformed the other algorithms in terms of classification accuracy, precision, and F1 evaluation measures, using datasets of groundwater areas that were collected from the Jordanian Ministry of Water and Irrigation.
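The three evaluation measures used to compare the classifiers are standard and can be computed from a confusion matrix as sketched below; the labels in the usage example are illustrative (1 = groundwater area):

```python
# Standard evaluation measures for a binary classifier (1 = positive class).

def confusion(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def scores(y_true, y_pred):
    tp, fp, fn, tn = confusion(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "f1": f1}
```

Each candidate algorithm (SVM, NB, kNN, CBA) would be scored this way on held-out data, and the one with the best accuracy, precision, and F1 selected.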

Keywords: classification, data mining, evaluation measures, groundwater

Procedia PDF Downloads 273
24641 Toxicity Depletion Rates of Water Lettuce (Pistia stratoites) in an Aquaculture Effluent Hydroponic System

Authors: E. A. Kiridi, A. O. Ogunlela

Abstract:

The control of ammonia build-up and its by-products is a limiting factor for successful commercial aquaculture in a developing country like Nigeria. The technology for advanced treatment of fish tank effluent is uneconomical for local fish farmers, which has led to indiscriminate disposal of aquaculture wastewater, thereby increasing the concentrations of these nitrogenous compounds and other contaminants in surface water and groundwater above the permissible levels. Phytoremediation using water lettuce could offer a cheaper and sustainable alternative. On the first day of experimentation, approximately 100 g of water lettuce were replicated in four hydroponic units containing aquaculture effluents. The water quality parameters measured were the concentrations of ammonium-nitrogen (NH4+-N), nitrite-nitrogen (NO2--N), nitrate-nitrogen (NO3--N), and phosphate-phosphorus (PO43--P), as well as total dissolved solids (TDS), total suspended solids (TSS), pH, electrical conductivity (EC), and biomass. At phytoremediation intervals of 7, 14, 21, and 28 days, the biomass values recorded were 361.2 g, 498.7 g, 561.2 g, and 623.7 g. Water lettuce was able to reduce the pollutant concentration of all the selected parameters. The percentage reduction of pH ranged from 3.9% to 14.4%, EC from 49.8% to 96.2%, TDS from 50.4% to 96.2%, TSS from 38.3% to 81.7%, NH4+-N from 38.9% to 90.7%, NO2--N from 0% to 74.9%, NO3--N from 63.2% to 95.9%, and PO43--P from 0% to 76.3%. At the 95% confidence level, the analysis of variance shows that F(critical) is less than F(calculated) and p < 0.05; therefore, it can be concluded statistically that the differences between the pre-treatment and post-treatment values are significant. This suggests the potency of water lettuce for the remediation of aquaculture effluent.
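The percentage reductions and biomass growth reported above follow directly from the pre- and post-treatment measurements; a minimal sketch of both calculations is given below, where the concentration values in the example are illustrative but the biomass series and the approximate 100 g starting mass come from the abstract:

```python
# Percentage reduction of a water quality parameter over a treatment interval.
def percent_reduction(initial, final):
    return (initial - final) / initial * 100.0

# Relative biomass growth over the sampling days, from the reported values
# (day-0 mass taken as the approximate 100 g starting stock).
weekly_biomass = [100.0, 361.2, 498.7, 561.2, 623.7]  # g at days 0, 7, 14, 21, 28

def relative_growth(biomass):
    return [(b - biomass[0]) / biomass[0] * 100.0 for b in biomass]
```

For example, an illustrative drop from 50.0 mg/L to 12.5 mg/L is a 75% reduction, and the reported biomass series corresponds to roughly 524% growth over the 28 days.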

Keywords: aquaculture effluent, nitrification, phytoremediation, water lettuce

Procedia PDF Downloads 206
24640 Jurisdictional Issues between Competition Law and Data Protection Law in Protection of Privacy of Online Consumers

Authors: Pankhudi Khandelwal

Abstract:

The revenue models of digital giants such as Facebook and Google rely on targeted advertising, and such a model requires huge amounts of consumer data. While data protection law deals with the protection of personal data, this data is acquired by the companies on the basis of consent, the performance of a contract, or legitimate interests. This paper analyses the role that competition law can play in closing these loopholes for the protection of the data and privacy of online consumers. Digital markets have certain distinctive features, such as network effects and feedback loops, which give incumbents in these markets a first-mover advantage. This creates a situation where the winner takes all, thus creating entry barriers and concentration in the market. It has also been seen that this dominant position is then used by the undertakings for leveraging into other markets. This can be harmful to consumers in the form of less privacy, less choice, and stifled innovation, as seen in the cases of Facebook/Cambridge Analytica, Google Shopping, and Google Android. Therefore, the article aims to provide a legal framework wherein data protection law and competition law can come together to provide balanced regulation of digital markets. The issue has become more relevant in light of the Facebook decision by the German competition authority, where it was held that Facebook had abused its dominant position by not complying with data protection rules, which constituted an exploitative practice. The paper looks into the jurisdictional boundaries within which the data protection and competition authorities can work and suggests ex ante regulation through data protection law and ex post regulation through competition law. It further suggests a change in the consumer welfare standard, whereby harm to privacy should be considered an indicator of low quality.

Keywords: data protection, dominance, ex ante regulation, ex post regulation

Procedia PDF Downloads 169
24639 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects

Authors: Mai Ghazal, Ahmed Hammad

Abstract:

Cost overruns in construction projects are considered a worldwide challenge, since cost performance is one of the main measures of success, along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors behind cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge from them. This research studies and analyzes the effect of some factors causing cost overruns, using historical data from completed construction projects, and then uses these factors to estimate the probability of a cost overrun occurring and to predict its percentage for future projects. First, an intensive literature review was conducted to study all the factors that cause cost overruns in construction projects, followed by another review of previous research papers on the use of mining processes in dealing with cost overruns. Second, a data warehouse was designed that can be used by organizations to store their future data in a well-organized way so that it can easily be analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
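The simplest form of knowledge extraction from such a warehouse is an empirical estimate of overrun probability and magnitude, grouped by one of the quantitative factors. The sketch below uses project size band as the grouping factor and fabricated illustrative records, not the study's actual factors or data:

```python
# Illustrative historical records: one of the twelve quantitative factors
# (here, an assumed size band) and the observed cost overrun percentage.
PROJECTS = [
    {"size": "small", "overrun_pct": 0.0},
    {"size": "small", "overrun_pct": 4.5},
    {"size": "large", "overrun_pct": 12.0},
    {"size": "large", "overrun_pct": 8.0},
    {"size": "large", "overrun_pct": 0.0},
]

def overrun_stats(projects, size):
    # Empirical probability of an overrun occurring in this group, and the
    # average overrun percentage when one does occur.
    group = [p["overrun_pct"] for p in projects if p["size"] == size]
    overruns = [x for x in group if x > 0]
    probability = len(overruns) / len(group)
    expected_pct = sum(overruns) / len(overruns) if overruns else 0.0
    return probability, expected_pct
```

A predictive model would combine such conditional estimates across all twelve factors rather than one, but the grouping step is the same.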

Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management

Procedia PDF Downloads 365