Search results for: data sensitivity
25370 Prediction of Marine Ecosystem Changes Based on the Integrated Analysis of Multivariate Data Sets
Authors: Prozorkevitch D., Mishurov A., Sokolov K., Karsakov L., Pestrikova L.
Abstract:
The current body of knowledge about the marine environment and the dynamics of marine ecosystems includes a huge amount of heterogeneous data collected over decades. It generally includes a wide range of hydrological, biological, and fishery data. Marine researchers collect these data and analyze how and why the ecosystem changes from past to present. Based on these historical records and the linkages between the processes, it is possible to predict future changes. Multivariate analysis of trends and their interconnections in the marine ecosystem may be used as an instrument for predicting further ecosystem evolution. A wide range of information about the components of the marine ecosystem, covering more than 50 years, needs to be used to investigate how these data arrays can help to predict the future.
Keywords: barents sea ecosystem, abiotic, biotic, data sets, trends, prediction
Procedia PDF Downloads 116
25369 Optical Fiber Data Throughput in a Quantum Communication System
Authors: Arash Kosari, Ali Araghi
Abstract:
A mathematical model for an optical-fiber communication channel is developed, resulting in an expression that calculates the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows the dependence of data throughput on the length of the channel and the depolarization factor. It is observed that absorption of photons affects the throughput more strongly than depolarization does. In addition, the probability of depolarization and the absorption of radiated photons are obtained.
Keywords: absorption, data throughput, depolarization, optical fiber
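A minimal sketch of how such a link model might be evaluated numerically, assuming Beer-Lambert absorption and an exponential depolarization law; the attenuation coefficient, depolarization factor, and photon rate below are illustrative, not taken from the paper.

```python
import math

def link_throughput(rate_in, length_km, alpha_db_per_km, depol_factor):
    """Estimate received photon throughput over a polarization-encoded link.

    Assumptions (illustrative, not from the paper):
    - absorption follows Beer-Lambert: survival = 10**(-alpha*L/10)
    - depolarization is exponential in length: P(retained) = exp(-d*L)
    A photon contributes to throughput only if it is neither absorbed
    nor depolarized.
    """
    survival = 10 ** (-alpha_db_per_km * length_km / 10)   # absorption
    polarized = math.exp(-depol_factor * length_km)        # depolarization
    return rate_in * survival * polarized, 1 - survival * polarized

rate, loss = link_throughput(rate_in=1e6, length_km=50,
                             alpha_db_per_km=0.2, depol_factor=0.001)
print(f"throughput ~ {rate:.3e} photons/s, link loss ~ {loss:.2%}")
```

With these illustrative numbers, absorption dominates the total loss, consistent with the abstract's observation.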
Procedia PDF Downloads 285
25368 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network
Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi
Abstract:
Energy, delay, and bandwidth are the prime concerns in wireless sensor networks (WSNs). Optimizing energy usage and utilizing bandwidth efficiently are important issues in WSNs. Event-triggered data aggregation facilitates such optimization for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head identifies and classifies the events from the collected data using a Bayesian classifier. (4) The data are aggregated using a statistical method. (5) The cluster head discovers paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data are critical, the cluster head sends them over multiple paths for reliable communication. (7) Otherwise, the aggregated data are transmitted towards the sink node over the single path with the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication
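A minimal sketch of the path-selection steps (5)-(7), assuming a simple weighted score over residual energy, bandwidth, and path distance; the weights, the two-path replication count, and the criticality flag are illustrative stand-ins for the paper's Bayesian classification and scoring.

```python
from dataclasses import dataclass

@dataclass
class Path:
    residual_energy: float  # joules remaining along the path
    bandwidth: float        # available bandwidth, kbps
    distance: float         # hops (or metres) to the sink

def path_score(p: Path, w=(0.4, 0.4, 0.2)):
    # Higher energy and bandwidth are better; shorter distance is better.
    return w[0] * p.residual_energy + w[1] * p.bandwidth - w[2] * p.distance

def route(aggregated_data: bytes, paths: list[Path], critical: bool):
    ranked = sorted(paths, key=path_score, reverse=True)
    if critical:
        # Step (6): replicate over multiple paths for reliability.
        return [(p, aggregated_data) for p in ranked[:2]]
    # Step (7): single best path by bandwidth and residual energy.
    return [(ranked[0], aggregated_data)]

paths = [Path(5.0, 250, 4), Path(3.5, 400, 6), Path(4.2, 300, 3)]
print(route(b"aggregate", paths, critical=True))
```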
Procedia PDF Downloads 450
25367 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues
Authors: Michelle J. Miller
Abstract:
In recent years, two emerging issues that impact the global employment and business market have arisen that the legal community must review more closely: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, such as the United States, anti-outsourcing legislation has been passed or proposed to retain jobs within the country. In addition, the legal requirement to protect the privacy of data as a global employer extends to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper reviews the intersection of these two issues with a specific focus on data privacy.
Keywords: outsourcing, data privacy, international compliance, multinational corporations
Procedia PDF Downloads 411
25366 Weighted Data Replication Strategy for Data Grid Considering Economic Approach
Authors: N. Mansouri, A. Asadi
Abstract:
A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which is determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage, and storage resource usage.
Keywords: data grid, data replication, simulation, replica selection, replica placement
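A minimal sketch of the replica-selection step, estimating response time from the four components the abstract lists (transfer time, storage latency, queue wait, node distance); the site parameters and cost constants are illustrative.

```python
from dataclasses import dataclass

@dataclass
class ReplicaSite:
    name: str
    bandwidth_mbps: float     # link bandwidth to the requesting site
    storage_latency_s: float  # storage access latency
    queued_requests: int      # replica requests waiting in the storage queue
    distance_hops: int        # network distance between nodes

def response_time(site: ReplicaSite, file_mb: float,
                  per_request_s: float = 0.05, per_hop_s: float = 0.002):
    transfer = file_mb * 8 / site.bandwidth_mbps          # data transfer time
    queue_wait = site.queued_requests * per_request_s     # storage queue wait
    propagation = site.distance_hops * per_hop_s          # node-distance cost
    return transfer + site.storage_latency_s + queue_wait + propagation

def select_best_replica(sites, file_mb):
    return min(sites, key=lambda s: response_time(s, file_mb))

sites = [ReplicaSite("A", 100, 0.01, 3, 2), ReplicaSite("B", 40, 0.005, 0, 1)]
print(select_best_replica(sites, file_mb=500).name)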
Procedia PDF Downloads 260
25365 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain
Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz
Abstract:
Rainfall is a crucial data source for disciplines as diverse as agriculture, hydrology, and climate science; the rain rate should therefore be well characterized, both spatially and temporally, for any area. Traditionally, rainfall has been measured for many years with rain gauges at meteorological ground stations. At present, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data against rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data for rainy days from four stations located within range of the radar, from October 2013 to November 2015. We examined the two rainfall products over the Seyhan plain, and the correlation between the rain-gauge data and the two raster rainfall products was found to be low.
Keywords: meteosat, radar, rainfall, rain-gauge, Turkey
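A minimal sketch of the comparison step, assuming hourly gauge, radar (RN1), and satellite (CRR) values are available as time-stamped columns; the file name and column names are hypothetical.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical layout: one row per hour with gauge, radar (RN1) and
# satellite (CRR) rainfall in mm; timestamps in a 'time' column.
df = pd.read_csv("seyhan_rainfall_hourly.csv", parse_dates=["time"])
daily = df.set_index("time").resample("D").sum()

# Keep rainy days only, as in the study.
rainy = daily[daily["gauge_mm"] > 0]

for product in ("rn1_mm", "crr_mm"):
    r, p = pearsonr(rainy["gauge_mm"], rainy[product])
    print(f"{product}: r = {r:.2f} (p = {p:.3g})")
```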
Procedia PDF Downloads 328
25364 Spatial Data Mining by Decision Trees
Authors: Sihem Oujdi, Hafida Belbachir
Abstract:
Existing data mining methods cannot be applied directly to spatial data, because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to spatial data from the accidentology domain. A comparative study of our approach with other work on classification by spatial decision trees is detailed.
Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining
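A minimal sketch of the attribute-selection step that a spatial C4.5 extension repeats for spatial relationships: the gain ratio of a hypothetical 'intersects_junction' relationship that would be looked up in the spatial join index. The accidentology rows are invented.

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, attr, label="severity"):
    """C4.5 split criterion: information gain / split information."""
    total = entropy([r[label] for r in rows])
    n, remainder, split_info = len(rows), 0.0, 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[label] for r in rows if r[attr] == value]
        remainder += len(subset) / n * entropy(subset)
        split_info -= len(subset) / n * log2(len(subset) / n)
    gain = total - remainder
    return gain / split_info if split_info else 0.0

# Hypothetical accidentology rows; 'intersects_junction' would come from
# the spatial join index between accidents (target) and junctions (neighbor).
rows = [
    {"intersects_junction": True,  "severity": "serious"},
    {"intersects_junction": True,  "severity": "serious"},
    {"intersects_junction": False, "severity": "slight"},
    {"intersects_junction": False, "severity": "slight"},
    {"intersects_junction": True,  "severity": "slight"},
]
print(f"gain ratio = {gain_ratio(rows, 'intersects_junction'):.3f}")
```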
Procedia PDF Downloads 612
25363 Data-Driven Dynamic Overbooking Model for Tour Operators
Authors: Kannapha Amaruchkul
Abstract:
We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas in the service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator
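A minimal sketch of the service-based policy for a simplified case, assuming independent cancellations with a single per-person show probability; the paper's model additionally conditions cancellation rate and timing on group size, which this sketch ignores.

```python
from scipy.stats import binom

def service_based_limit(capacity, show_prob, oversell_threshold):
    """Largest booking level b such that P(shows > capacity) <= threshold."""
    b = capacity
    # binom.sf(capacity, n, p) = P(X > capacity) for n independent bookings.
    while binom.sf(capacity, b + 1, show_prob) <= oversell_threshold:
        b += 1
    return b

# Tour with 40 seats, 90% of booked people show up,
# at most a 5% chance of an oversold departure.
print(service_based_limit(capacity=40, show_prob=0.9, oversell_threshold=0.05))
```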
Procedia PDF Downloads 134
25362 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria
Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu
Abstract:
This research work is based on the statistical analysis of production process data. The aim is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products at the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. T-tests, partial correlation, and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the model fit shows that the R² is 98.7%. The results confirm that the data are fit for further analysis and modeling, as evidenced by the correlations and the R-squared value.
Keywords: General Linear Model, correlation, variables, pearson, significance, T-test, soap, production mix and statistic
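A minimal sketch of the statistical checks described, using SciPy and NumPy; the dosage and yield values are invented placeholders for the plant's processing data.

```python
import numpy as np
from scipy import stats

# Hypothetical processing data: raw-material dosage vs. soap yield.
dosage = np.array([12.1, 13.4, 11.8, 14.2, 13.0, 12.7, 14.8, 13.9])
yield_ = np.array([55.0, 58.9, 54.1, 61.8, 57.6, 56.4, 63.7, 60.5])

# Bivariate (Pearson) correlation and its significance.
r, p = stats.pearsonr(dosage, yield_)
print(f"Pearson r = {r:.3f}, p = {p:.4f}, R^2 = {r**2:.1%}")

# One-sample t-test: does mean yield differ from a nominal target?
t, p_t = stats.ttest_1samp(yield_, popmean=57.0)
print(f"t = {t:.2f}, p = {p_t:.4f}")

# Simple linear design model for the production mix.
slope, intercept = np.polyfit(dosage, yield_, 1)
print(f"yield ~ {slope:.2f} * dosage + {intercept:.2f}")
```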
Procedia PDF Downloads 445
25361 Helping the Development of Public Policies with Knowledge of Criminal Data
Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno
Abstract:
The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as associative algorithms and the extraction of decision-tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area is the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. To this end, a set of tools will be validated, including the use of a database and tools for visualizing the results. Among the main deliverables, alongside products and articles, are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field.
Keywords: social data analysis, criminal records, computational techniques, data mining, big data
Procedia PDF Downloads 84
25360 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted
Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova
Abstract:
The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g., temperature, humidity, and dose rate. This paper describes a proposal for, and the optimization of, the communication that takes place in a teledosimetric system between the central control server, responsible for processing and storing the data, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.
Keywords: communication protocol, transmission optimization, data acquisition, system architecture
Procedia PDF Downloads 518
25359 Nanoparticles Using in Chiral Analysis with Different Methods of Separation
Authors: Bounoua Nadia, Rebizi Mohamed Nadjib
Abstract:
Chiral molecules are stereoselective with respect to their particular biological roles. Enantiomers differ significantly in their biochemical responses in a biological environment. Despite current advances in drug discovery and pharmaceutical biotechnology, the chiral separation of some racemic mixtures continues to be one of the greatest challenges, because the available techniques are too costly and time-consuming for the assessment of therapeutic drugs in the early stages of development worldwide. Nanoparticles have become among the most investigated and explored nanotechnology-derived nanostructures, especially in chirality, where several studies report improved enantiomeric separation of different racemic mixtures. The production of surface-modified nanoparticles has addressed limitations in sensitivity, accuracy, and enantioselectivity, which can be optimized, making these surface-modified nanoparticles convenient for enantiomeric identification and separation.
Keywords: chirality, enantiomeric recognition, selectors, analysis, surface-modified nanoparticles
Procedia PDF Downloads 94
25358 The Duty of Application and Connection Providers Regarding the Supply of Internet Protocol by Court Order in Brazil to Determine Authorship of Acts Practiced on the Internet
Authors: João Pedro Albino, Ana Cláudia Pires Ferreira de Lima
Abstract:
Humanity has undergone a transformation from the physical to the virtual world, generating an enormous amount of data on the world wide web, known as big data. Many facts that occur in the physical world or in the digital world are proven through records made on the internet, such as digital photographs, posts on social media, contract acceptances by digital platforms, email, banking, and messaging applications, among others. These data recorded on the internet have been used as evidence in judicial proceedings. The identification of internet users is essential for the security of legal relationships. This research was carried out on scientific articles and materials from courses and lectures, with an analysis of Brazilian legislation and some judicial decisions on the request of static data from logs and Internet Protocols (IPs) from application and connection providers. In this article, we will address the determination of authorship of data processing on the internet by obtaining the IP address and the appropriate judicial procedure for this purpose under Brazilian law.
Keywords: IP address, digital forensics, big data, data analytics, information and communication technology
Procedia PDF Downloads 124
25357 Sourcing and Compiling a Maltese Traffic Dataset MalTra
Authors: Gabriele Borg, Alexei De Bono, Charlie Abela
Abstract:
There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology for creating a custom dataset (MalTra - Malta Traffic), compiled from multiple participants in various locations across the island, to identify the most common routes taken and thereby expose the main areas of activity. Such use of big data underpins Intelligent Transportation Systems (ITSs), and it is concluded that there is significant potential in utilising such data sources on a nationwide scale.
Keywords: Big Data, vehicular traffic, traffic management, mobile data patterns
Procedia PDF Downloads 109
25356 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study
Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar
Abstract:
Accuracy assessment is very important for the classification of satellite imagery. In order to determine the accuracy of a classified image, the assumed-true data are usually derived from ground truth data collected using the Global Positioning System. The data derived from the satellite imagery and the ground truth data are then compared to find the accuracy, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified in the software into fourteen classes, namely water bodies, agricultural fields, forest land, urban settlement, barren land, unclassified area, etc. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find out the best method. This study is based on data collected for the Bhopal city boundaries of Madhya Pradesh State, India.
Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices
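A minimal sketch of the accuracy assessment from an error (confusion) matrix, computing overall, producer's, and user's accuracies; the matrix values are invented and the class list is abbreviated.

```python
import numpy as np

classes = ["water", "agriculture", "forest", "urban", "barren"]
# Hypothetical error matrix: rows = classified, columns = ground truth.
m = np.array([
    [50,  2,  0,  1,  0],
    [ 3, 61,  4,  2,  1],
    [ 0,  5, 72,  0,  0],
    [ 1,  3,  0, 44,  2],
    [ 0,  1,  0,  3, 38],
])

overall = np.trace(m) / m.sum()
producers = np.diag(m) / m.sum(axis=0)   # omission errors, per column
users = np.diag(m) / m.sum(axis=1)       # commission errors, per row

print(f"overall accuracy = {overall:.1%}")
for c, pa, ua in zip(classes, producers, users):
    print(f"{c:12s} producer's = {pa:.1%}  user's = {ua:.1%}")
```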
Procedia PDF Downloads 507
25355 Metal-Organic Frameworks-Based Materials for Volatile Organic Compounds Sensing Applications: Strategies to Improve Sensing Performances
Authors: Claudio Clemente, Valentina Gargiulo, Alessio Occhicone, Giovanni Piero Pepe, Giovanni Ausanio, Michela Alfè
Abstract:
Volatile organic compound (VOC) emissions represent a serious risk to human health and the integrity of ecosystems, especially at high concentrations. For this reason, it is very important to monitor environmental quality continuously and to develop fast and reliable portable sensors that allow on-site analysis. Chemiresistors have become promising candidates for VOC sensing owing to their ease of fabrication, the variety of suitable sensitive materials, and their simple sensing data. A chemoresistive gas sensor is a transducer that allows the concentration of an analyte in the gas phase to be measured, because the changes in resistance are proportional to the amount of the analyte present. The selection of the sensitive material, which interacts with the target analyte, is very important for sensor performance. The most widely used VOC detection materials are metal oxides (MOx), chosen for their rapid recovery, high sensitivity to various gas molecules, and easy fabrication. Their sensing performance can still be improved in terms of operating temperature, selectivity, and detection limit. Metal-organic frameworks (MOFs) have also attracted much attention in the field of gas sensing due to their high porosity, high surface area, tunable morphologies, and structural variety. MOFs are generated by the self-assembly of multidentate organic ligands connecting adjacent multivalent metal nodes via strong coordination interactions, producing stable and highly ordered crystalline porous materials with well-designed structures. However, most MOFs intrinsically exhibit low electrical conductivity. To improve this property, MOFs can be combined with organic and inorganic materials to produce hybrid composites, or they can be transformed into more stable structures. MOFs can indeed be employed as precursors of metal oxides with well-designed architectures via calcination. The MOF-derived MOx partially preserve the original structure, with a high surface area and intrinsic open pores that act as trapping centers for gas molecules, and they show a higher electrical conductivity. Core-shell heterostructures, in which the surface of a metal oxide core is completely coated by a MOF shell, forming a junction at the core-shell heterointerface, can also be synthesized. Nanocomposites in which MOF structures are intercalated with graphene-related materials can likewise be produced; their conductivity increases thanks to the high electron mobility of the carbon materials. As MOF structures, zinc-based MOFs belonging to the ZIF family were selected in this work. Several Zn-based materials based on and/or derived from MOFs were produced, structurally characterized, and arranged in a chemoresistive architecture, also exploring the potential of different sensing-layer deposition approaches based on PLD (pulsed laser deposition) and, in the case of thermally labile materials, MAPLE (Matrix Assisted Pulsed Laser Evaporation), to enhance adhesion to the support. The sensors were tested in a controlled-humidity chamber that allowed the concentration of ethanol, a typical analyte chosen among the VOCs for a first survey, to be varied. The effect of heating the chemiresistor to improve sensing performance was also explored.
Future research will focus on exploring new manufacturing processes for MOF-based gas sensors, with the aim of improving sensitivity and selectivity and reducing operating temperatures.
Keywords: chemiresistors, gas sensors, graphene related materials, laser deposition, MAPLE, metal-organic frameworks, metal oxides, nanocomposites, sensing performance, transduction mechanism, volatile organic compounds
Procedia PDF Downloads 63
25354 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data are a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values. Imputation is the most commonly used of these methods. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we identify different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and we apply rough set imputation only to the GMD portion of the missing data. We used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test. To evaluate the accuracy of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed compared to the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
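A minimal sketch of the two evaluation metrics on a synthetic dataset using statsmodels: per-coefficient Wald statistics and 95% confidence interval widths; the data generation is illustrative, not the MESA-based simulation, and rough set imputation itself is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))                       # two risk factors
logit_p = -0.5 + 0.8 * x[:, 0] + 0.4 * x[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # incontinence indicator

res = sm.Logit(y, sm.add_constant(x)).fit(disp=False)

wald = (res.params / res.bse) ** 2     # Wald chi-square per coefficient
ci = res.conf_int(alpha=0.05)          # 95% CI for each coefficient
width = ci[:, 1] - ci[:, 0]            # narrower CI = higher precision
print("Wald:", np.round(wald, 1), "CI width:", np.round(width, 3))
```

Fitting the same model to an imputed and a non-imputed version of the data and comparing these two quantities mirrors the comparison reported in the abstract.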
Procedia PDF Downloads 168
25353 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process
Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma
Abstract:
As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes that offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial-scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($), and the net present value (NPV). This work sets a foundational blueprint for future research into sustainable industrial-scale processes for CAM manufacturing.
Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis
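A minimal sketch of the NPV step of such an economic analysis; the cash-flow figures, discount rate, and plant life below are placeholders, not results from the study.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical CAM plant: $250M capex, then 10 years of net revenue.
capex = -250e6
annual_net = 12_000e3 * (22.0 - 15.0)   # kg/yr * ($/kg sale - $/kg cost)
cashflows = [capex] + [annual_net] * 10

print(f"NPV at 8% discount rate: ${npv(0.08, cashflows)/1e6:,.0f}M")
```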
Procedia PDF Downloads 100
25352 An Assessment of Finite Element Computations in the Structural Analysis of Diverse Coronary Stent Types: Identifying Prerequisites for Advancement
Authors: Amir Reza Heydari, Yaser Jenab
Abstract:
Coronary artery disease, a common cardiovascular disease, is attributed to the accumulation of cholesterol-based plaques in the coronary arteries, leading to atherosclerosis. The disease is associated with risk factors such as smoking, hypertension, diabetes, and elevated cholesterol levels, contributing to severe clinical consequences, including acute coronary syndromes and myocardial infarction. Treatment approaches range from lifestyle interventions to surgical procedures such as percutaneous coronary intervention and coronary artery bypass surgery. These interventions often employ stents, including bare-metal stents (BMS), drug-eluting stents (DES), and bioresorbable vascular scaffolds (BVS), each with its advantages and limitations. Computational tools have emerged as critical in optimizing stent designs and assessing their performance. The aim of this study is to provide an overview of the computational methods of studies based on the finite element (FE) method in the field of coronary stenting and to discuss the potential for development and clinical application of stent devices. Additionally, the importance of assessing the ability of computational models to represent real-world phenomena is emphasized, supported by recent guidelines from the American Society of Mechanical Engineers (ASME). Proposed validation processes include comparing model performance with in vivo, ex vivo, or in vitro data, alongside uncertainty quantification and sensitivity analysis. These methods can enhance the credibility and reliability of in silico simulations, ultimately aiding the assessment of coronary stent designs in various clinical contexts.
Keywords: atherosclerosis, materials, restenosis, review, validation
Procedia PDF Downloads 91
25351 Energy Consumption Estimation for Hybrid Marine Power Systems: Comparing Modeling Methodologies
Authors: Kamyar Maleki Bagherabadi, Torstein Aarseth Bø, Truls Flatberg, Olve Mo
Abstract:
Hydrogen fuel cells and batteries are among the promising solutions aligned with carbon emission reduction goals for the marine sector. However, the higher installation and operation costs of hydrogen-based systems compared to conventional diesel gensets raise questions about the appropriate hydrogen tank size and about energy and fuel consumption estimates. Ship designers need methodologies and tools to calculate energy and fuel consumption for different component sizes to facilitate decision-making regarding feasibility and performance for retrofit and design cases. The aim of this work is to compare three alternative modeling approaches for the estimation of energy and fuel consumption with various hydrogen tank sizes, battery capacities, and load-sharing strategies. A fishery vessel is selected as an example, using logged load demand data from a year of operations. The modeled power system consists of a PEM fuel cell, a diesel genset, and a battery. The methodologies used are: first, an energy-based model; second, a model that considers load variations in the time domain with a rule-based power management system (PMS); and third, a load-variation model with a dynamic PMS strategy based on optimization with perfect foresight. The errors and potentials of the methods are discussed, and design sensitivity studies for this case are conducted. The results show that the energy-based method can estimate fuel and energy consumption with acceptable accuracy. However, models that consider the time variation of the load provide more realistic estimates of energy and fuel consumption with respect to hydrogen tank and battery size, while still requiring little computational time.
Keywords: fuel cell, battery, hydrogen, hybrid power system, power management system
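A minimal sketch of the second methodology: stepping through a logged load series with a simple rule-based PMS and accumulating hydrogen and diesel consumption. The component ratings, specific consumptions, and dispatch rules are illustrative assumptions, not the paper's.

```python
# Illustrative ratings and specific consumptions (not from the paper).
FC_MAX_KW, GENSET_MAX_KW = 300.0, 500.0
BATT_KWH, BATT_MAX_KW = 200.0, 150.0
H2_KG_PER_KWH, DIESEL_KG_PER_KWH = 0.055, 0.21  # at assumed efficiencies

def simulate(load_kw, dt_h=1.0):
    soc, h2_kg, diesel_kg = 0.5 * BATT_KWH, 0.0, 0.0
    for load in load_kw:
        fc = min(load, FC_MAX_KW)                      # fuel cell takes base load
        residual = load - fc
        batt = min(residual, BATT_MAX_KW, soc / dt_h)  # battery covers transients
        genset = min(max(residual - batt, 0.0), GENSET_MAX_KW)
        soc -= batt * dt_h
        if load < FC_MAX_KW:                           # spare FC power recharges battery
            recharge = min(FC_MAX_KW - load, BATT_MAX_KW, (BATT_KWH - soc) / dt_h)
            fc += recharge
            soc += recharge * dt_h
        h2_kg += fc * dt_h * H2_KG_PER_KWH
        diesel_kg += genset * dt_h * DIESEL_KG_PER_KWH
    return h2_kg, diesel_kg

hourly_load = [180, 250, 420, 520, 610, 480, 300, 220]  # kW, stand-in profile
print(simulate(hourly_load))
```

An energy-based model would instead multiply the total energy by fixed specific consumptions, losing the rating and state-of-charge constraints that the time-domain loop captures.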
Procedia PDF Downloads 36
25350 Tool Development for Assessing Antineoplastic Drugs Surface Contamination in Healthcare Services and Other Workplaces
Authors: Benoit Atge, Alice Dhersin, Oscar Da Silva Cacao, Beatrice Martinez, Dominique Ducint, Catherine Verdun-Esquer, Isabelle Baldi, Mathieu Molimard, Antoine Villa, Mireille Canal-Raffin
Abstract:
Introduction: Healthcare workers' exposure to antineoplastic drugs (ADs) is a burning issue for occupational medicine practitioners. Biological monitoring of occupational exposure (BMOE) is an essential tool for assessing the AD contamination of healthcare workers. In addition to BMOE, surface sampling is a useful tool for understanding how workers become contaminated, identifying sources of environmental contamination, verifying the effectiveness of surface decontamination methods, and ensuring the monitoring of these surfaces. The objective of this work was to develop a complete tool comprising a surface sampling kit and a quantification method for the analytical detection of AD traces. The development was driven by the following three criteria: the kit's capacity to sample in every professional environment (healthcare services, veterinary practices, etc.), the detection of very low AD traces with a validated analytical method, and the ease of use of the sampling kit regardless of the person in charge of sampling. Material and method: The ADs most used in terms of quantity and frequency were identified by an analysis of the literature and of the consumption of different hospitals, veterinary services, and home care settings. The type of adsorbent device, the surface moistening solution, and the solvent mix for extracting ADs from the adsorbent device were tested for maximal yield. AD quantification was achieved by ultra-high-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS). Results: For their high frequency of use and their good coverage of the diverse activities across healthcare, 15 ADs (cyclophosphamide, ifosfamide, doxorubicin, daunorubicin, epirubicin, 5-FU, dacarbazine, etoposide, pemetrexed, vincristine, cytarabine, methotrexate, paclitaxel, gemcitabine, mitomycin C) were selected. The analytical method was optimized and adapted to achieve high sensitivity, with very low limits of quantification (25 to 5000 ng/mL), equivalent to or lower than those previously published (for 13 of the 15 ADs). The sampling kit is easy to use and provided with didactic support (an online video and a paper protocol). It proved effective, without inter-individual variation (n=5/person; n=5 persons; p=0.85; ANOVA), regardless of the person in charge of sampling. Conclusion: This validated tool (sampling kit + analytical method) is very sensitive, easy to use, and very didactic for controlling the chemical risk posed by ADs. Moreover, BMOE permits targeted prevention. Used routinely, this tool is available for every occupational health intervention.
Keywords: surface contamination, sampling kit, analytical method, sensitivity
Procedia PDF Downloads 132
25349 Database Management System for Orphanages to Help Track of Orphans
Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta
Abstract:
A database management system keeps track of details about the people in an organisation. Not many orphanages these days have shifted to a computer- and program-based system; unfortunately, most have only pen-and-paper records, which not only consume space but are also not eco-friendly. It is a hassle to view a person's record, as one has to search through multiple records, which consumes time. This program organises all the data and can pull out any information about anyone whose data have been entered. It is also a safer form of storage, as physical records degrade over time or, worse, are destroyed by natural disasters. In this developing world, it is only sensible to shift all data to an electronic storage system. The program comes with all features, including creating, inserting, searching, and deleting data, as well as printing it.
Keywords: database, orphans, programming, C++
Procedia PDF Downloads 156
25348 Investigations on Pyrolysis Model for Radiatively Dominant Diesel Pool Fire Using Fire Dynamic Simulator
Authors: Siva K. Bathina, Sudheer Siddapureddy
Abstract:
Pool fires form when a flammable liquid accidentally spills on the ground or water and ignites. A pool fire is a kind of buoyancy-driven diffusion flame. There have been many pool fire accidents during the processing, handling, and storing of liquid fuels in the chemical and oil industries. Such accidents cause enormous damage to property as well as loss of lives. Pool fires are complex in nature due to the strong interaction among combustion, heat and mass transfer, and pyrolysis at the fuel surface. Moreover, the experimental study of such large complex fires involves fire safety issues and difficulties in performing experiments. In the present work, large eddy simulations are performed to study such complex fire scenarios using the Fire Dynamics Simulator. A 1 m diesel pool fire is considered, diesel being the fuel most commonly involved in fire accidents. Fire simulations are performed with two different boundary conditions: in one, the fuel is in the liquid state and a pyrolysis model is invoked; in the other, the fuel is assumed to be initially in the vapor state and the mass loss rate is prescribed. A domain of size 11.2 m × 11.2 m × 7.28 m with a uniform structured grid is chosen for the numerical simulations. A grid sensitivity analysis is performed, and a non-dimensional grid size of 12, corresponding to an 8 cm grid size, is adopted. Flame properties such as mass burning rate, irradiance, and the time-averaged axial flame temperature profile are predicted. The predicted steady-state mass burning rate is 40 g/s and is within the uncertainty limits of previously reported experimental data (39.4 g/s). The profile of irradiance with height at a distance from the fire is somewhat in line with the experimental data, though the location of the maximum irradiance is shifted higher. This may be due to the lack of sophisticated models for species transport along with combustion and radiation in the continuous zone. Furthermore, the axial temperatures are not predicted well, for either boundary condition, in any of the zones. The present study shows that the existing models are not sufficient for modeling blended fuels like diesel. The predictions depend strongly on the experimental values of the soot yield. Future experiments are necessary for generalizing the soot yield for different fires.
Keywords: burning rate, fire accidents, fire dynamic simulator, pyrolysis
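A short worked check of the grid-sensitivity quantity used here: the characteristic fire diameter from the FDS User's Guide, D* = (Q̇ / (ρ∞ c_p T∞ √g))^(2/5), and the non-dimensional resolution D*/δx. The heat release rate below is back-calculated from the reported burning rate with an assumed heat of combustion, so the numbers are illustrative.

```python
# Characteristic fire diameter D* and non-dimensional grid resolution D*/dx
# (definition from the FDS User's Guide).
RHO_INF, CP, T_INF, G = 1.204, 1.005, 293.0, 9.81  # kg/m^3, kJ/(kg K), K, m/s^2

m_dot = 0.040           # kg/s, steady-state burning rate reported above
dh_c = 44_400.0         # kJ/kg, assumed heat of combustion of diesel
q_dot = m_dot * dh_c    # kW, total heat release rate

d_star = (q_dot / (RHO_INF * CP * T_INF * G ** 0.5)) ** 0.4
dx = 0.08               # m, grid size used in the study
print(f"Q = {q_dot:.0f} kW, D* = {d_star:.2f} m, D*/dx = {d_star / dx:.0f}")
```

With these assumptions the ratio lands in the same range as the value of 12 reported in the abstract; the exact figure depends on the heat of combustion and ambient properties used.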
Procedia PDF Downloads 196
25347 Comparison and Validation of a dsDNA biomimetic Quality Control Reference for NGS based BRCA CNV analysis versus MLPA
Authors: A. Delimitsou, C. Gouedard, E. Konstanta, A. Koletis, S. Patera, E. Manou, K. Spaho, S. Murray
Abstract:
Background: There remains a lack of international standard control reference materials for next generation sequencing-based approaches or device calibration. We have designed and validated dsDNA biomimetic reference materials for such targeted approaches, incorporating proprietary motifs (patent pending) for device/test calibration. They enable internal single-sample calibration, removing the need to compare samples against pooled historical population-based data assemblies or statistical modelling approaches. We have validated such an approach for BRCA copy number variation analysis using iQRS™-CNVSUITE versus multiplex ligation-dependent probe amplification (MLPA). Methods: Standard BRCA copy number variation analysis was compared between multiplex ligation-dependent probe amplification and next generation sequencing using a cohort of 198 breast/ovarian cancer patients. Next generation sequencing-based copy number variation analyses of samples spiked with iQRS™ dsDNA biomimetics were performed using the proprietary CNVSUITE software. Multiplex ligation-dependent probe amplification analyses were performed on an ABI-3130 sequencer and analysed with Coffalyser software. Results: Concordance of BRCA copy number variation events between multiplex ligation-dependent probe amplification and CNVSUITE indicated an overall sensitivity of 99.88% and specificity of 100% for iQRS™-CNVSUITE. The negative predictive value of iQRS™-CNVSUITE for BRCA was 100%, allowing for accurate exclusion of any event. The positive predictive value was 99.88%, with no discrepancy between multiplex ligation-dependent probe amplification and iQRS™-CNVSUITE. For device calibration purposes, precision was 100%; spiking of patient DNA demonstrated linearity to 1% (±2.5%) and a range from 100 copies. Traditional training was supplemented by predefining the calibrator-to-sample cut-off (lock-down) for amplicon gain or loss based upon a relative ratio threshold, following training of iQRS™-CNVSUITE using spiked iQRS™ calibrator and control mocks. BRCA copy number variation analysis using iQRS™-CNVSUITE was successfully validated and ISO 15189 accredited and now enters CE-IVD performance evaluation. Conclusions: The inclusion of a reference control competitor (an iQRS™ dsDNA mimetic) in next generation sequencing offers a more robust, sample-independent approach for the assessment of copy number variation events compared to multiplex ligation-dependent probe amplification. The approach simplifies data analysis, improves independent sample data analysis, and allows direct comparison against an internal reference control for sample-specific quantification. Our iQRS™ biomimetic reference materials allow for single-sample copy number variation analytics and further decentralisation of diagnostics to single-patient sample assessment.
Keywords: validation, diagnostics, oncology, copy number variation, reference material, calibration
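A minimal sketch of how concordance figures like these are derived from a 2x2 comparison against the reference method; the counts are invented, chosen only to show the arithmetic.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Hypothetical per-amplicon CNV calls: NGS/CNVSUITE vs the MLPA reference.
metrics = diagnostic_metrics(tp=824, fp=1, tn=4100, fn=1)
for name, value in metrics.items():
    print(f"{name}: {value:.2%}")
```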
Procedia PDF Downloads 66
25346 Application of EEG Wavelet Power to Prediction of Antidepressant Treatment Response
Authors: Dorota Witkowska, Paweł Gosek, Lukasz Swiecicki, Wojciech Jernajczyk, Bruce J. West, Miroslaw Latka
Abstract:
In clinical practice, the selection of an antidepressant often degrades into lengthy trial-and-error. In this work, we employ the normalized wavelet power of alpha waves as a biomarker of antidepressant treatment response. This novel EEG metric takes into account both the non-stationarity and the intersubject variability of alpha waves. We recorded resting, 19-channel EEG (eyes closed) in 22 inpatients suffering from unipolar (UD, n=10) or bipolar (BD, n=12) depression. The EEG measurement was done at the end of the short washout period which followed previously unsuccessful pharmacotherapy. The normalized alpha wavelet power of the 11 responders was markedly different from that of the 11 nonresponders at several, mostly temporoparietal, sites. Using the prediction of treatment response based on the normalized alpha wavelet power, we achieved 81.8% sensitivity and 81.8% specificity for channel T4.
Keywords: alpha waves, antidepressant, treatment outcome, wavelet
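A minimal sketch of computing an alpha-band wavelet power from a single EEG channel with PyWavelets; the sampling rate, frequency grid, and normalization convention (alpha power divided by broadband 2-30 Hz power) are assumptions, not the paper's exact metric.

```python
import numpy as np
import pywt

def normalized_alpha_power(eeg, fs=200.0):
    """Alpha-band (8-12 Hz) CWT power, normalized by broadband power.

    A sketch of one plausible normalization; the paper's exact
    normalization of the wavelet power is not reproduced here.
    """
    dt = 1.0 / fs
    central = pywt.central_frequency("morl")
    freqs = np.arange(2.0, 30.5, 0.5)              # Hz
    scales = central / (freqs * dt)                # scale <-> frequency mapping
    coeffs, _ = pywt.cwt(eeg, scales, "morl", sampling_period=dt)
    power = np.abs(coeffs) ** 2                    # (n_freqs, n_samples)
    band = (freqs >= 8) & (freqs <= 12)
    return power[band].mean() / power.mean()

# Synthetic 10 s 'EEG': a 10 Hz alpha rhythm plus noise.
t = np.arange(0, 10, 1 / 200.0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)
print(f"normalized alpha power: {normalized_alpha_power(eeg):.2f}")
```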
Procedia PDF Downloads 315
25345 Simulation of Behaviour Dynamics and Optimization of the Energy System
Authors: Iva Dvornik, Sandro Božić, Žana Božić Brkić
Abstract:
System-dynamics simulation modelling is one of the most appropriate and successful scientific methods for complex, non-linear natural, technical, and organizational systems. In recent practice, its methodology has proved efficient in solving problems of control, behavior, sensitivity, and flexibility in system dynamics with a high degree of complexity, all by computer simulation, i.e., "under laboratory conditions", meaning without any danger to the observed realities. This essay deals with research on the dynamic process of a gas turbine; the operation of the pump units and the transformation of gas energy into hydraulic energy have also been simulated. In addition, the mathematical model of the system (gas turbine - centrifugal pumps - pipeline pressure system - storage vessel) has been researched.
Keywords: system dynamics, modelling, centrifugal pump, turbine, gases, continuous and discrete simulation, heuristic optimisation
Procedia PDF Downloads 108
25344 Magnesium Alloys for Biomedical Applications Processed by Severe Plastic Deformation
Authors: Mariana P. Medeiros, Amanda P. Carvallo, Augusta Isaac, Milos Janecek, Peter Minarik, Mayerling Martinez Celis, Roberto. R. Figueiredo
Abstract:
The effect of high-pressure torsion processing on the mechanical properties and corrosion behavior of pure magnesium and Mg-Zn, Mg-Zn-Ca, Mg-Li-Y, and Mg-Y-RE alloys is investigated. Micro-tomography and SEM characterization are used to estimate the corrosion rate and evaluate non-uniform corrosion features. The results show that severe plastic deformation processing improves the strength of all the magnesium alloys, but deformation localization can take place in the Mg-Zn-Ca and Mg-Y-RE alloys. The occurrence of deformation localization is associated with low strain rate sensitivity in these alloys and with severe corrosion localization. Pure magnesium and the Mg-Zn and Mg-Li-Y alloys display good corrosion resistance, with a low corrosion rate and maintained integrity after 28 days of immersion in Hank's solution.
Keywords: magnesium alloys, severe plastic deformation, corrosion, biodegradable alloys
Procedia PDF Downloads 112
25343 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join
Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel
Abstract:
MapReduce is a programming model used to handle and support massive data sets. The rapid increase in data size and big data make the analysis of these data one of today's most important issues. MapReduce is used to analyze data and extract more helpful information by means of two simple functions, map and reduce, which are the only parts written by the programmer; it provides load balancing, fault tolerance, and high scalability. The most important operation in data analysis is the join, but MapReduce does not support joins directly. This paper explains the two-way MapReduce join algorithms semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, which uses a hash table to increase performance by eliminating unused records as early as possible, and which applies the join using the hash table rather than using the map function to match join keys against the other data table in the second phase. Using hash tables does not inflate memory usage, because only the matched records from the second table are saved. Our experimental results show that the hash semi-join algorithm achieves higher performance than the two other algorithms as the data size increases from 10 million to 500 million records; running time grows according to the number of joined records between the two tables.
Keywords: map reduce, hadoop, semi join, two way join
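A minimal single-node sketch of the hash semi-join idea: project the join keys of one table, filter the other table early, and keep only matched records in the hash table. The record layout is illustrative, and the real algorithm runs as MapReduce jobs over HDFS splits rather than in-memory lists.

```python
def hash_semi_join(left, right, key):
    """Join two lists of dicts on `key`, discarding unused records early."""
    left_keys = {row[key] for row in left}          # phase 1: project join keys

    # Phase 2: hash only the matched right-side records, so the hash
    # table never holds rows that cannot join.
    matched = {}
    for row in right:
        if row[key] in left_keys:
            matched.setdefault(row[key], []).append(row)

    # Final join via hash lookup instead of re-matching in a map function.
    return [{**l, **r} for l in left for r in matched.get(l[key], [])]

orders = [{"cust": 1, "item": "A"}, {"cust": 2, "item": "B"}]
customers = [{"cust": 1, "name": "Ann"}, {"cust": 3, "name": "Cid"}]
print(hash_semi_join(orders, customers, "cust"))
```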
Procedia PDF Downloads 513
25342 Using Implicit Data to Improve E-Learning Systems
Authors: Slah Alsaleh
Abstract:
In recent years, with the popularity of the internet and technology, e-learning has become a major part of most education systems. One of the advantages e-learning systems provide is the large amount of information available about the students' behavior while communicating with the e-learning system. Such information is very rich, and it can be used to improve the capability and efficiency of e-learning systems. This paper discusses how e-learning can benefit from implicit data in different ways, including creating homogeneous groups of students, evaluating students' learning, creating behavior profiles for students, and identifying students through their behaviors.
Keywords: e-learning, implicit data, user behavior, data mining
Procedia PDF Downloads 309
25341 Enabling Quantitative Urban Sustainability Assessment with Big Data
Authors: Changfeng Fu
Abstract:
Sustainable urban development has been widely accepted as common sense in modern urban planning and design. However, the measurement and assessment of urban sustainability, especially quantitative assessment, have always been an issue preoccupying planning and design professionals. This paper presents ongoing research on the principles and technologies of quantitative urban sustainability assessment, which aims to integrate indicators, geospatial and geo-referenced data, and assessment techniques into a single mechanism. It is based on the principles and techniques of geospatial analysis with GIS and on statistical analysis methods. Decision-making technologies and methods such as AHP and SMART are also adopted to derive overall assessment conclusions. The possible interfaces and the presentation of data and quantitative assessment results are also described. This research is based on the knowledge, situations, and data sources of the UK, but it is potentially adaptable to other countries or regions. The implementation potential of the mechanism is also discussed.
Keywords: urban sustainability assessment, quantitative analysis, sustainability indicator, geospatial data, big data
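A minimal sketch of the SMART-style aggregation step: normalize indicator values to a 0-1 range and combine them with weights (which could come from an AHP pairwise comparison). The indicators, bounds, and weights are invented for illustration.

```python
def smart_score(values, bounds, weights, benefit):
    """Weighted-sum (SMART) score from raw indicator values.

    values/bounds/weights/benefit are parallel per-indicator sequences;
    benefit[i] is True when higher raw values are better.
    """
    total = 0.0
    for v, (lo, hi), w, good in zip(values, bounds, weights, benefit):
        norm = (v - lo) / (hi - lo)
        total += w * (norm if good else 1.0 - norm)
    return total

# Hypothetical indicators for one urban area:
# green space per capita (m^2), PM2.5 (ug/m^3), public-transport share (%).
values = [22.0, 14.0, 38.0]
bounds = [(0, 40), (5, 50), (0, 60)]
weights = [0.3, 0.4, 0.3]          # e.g. derived from an AHP pairwise matrix
benefit = [True, False, True]      # PM2.5: lower is better
print(f"sustainability score: {smart_score(values, bounds, weights, benefit):.2f}")
```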
Procedia PDF Downloads 358