Search results for: transient data
24276 Unsteady and Steady State in Natural Convection
Authors: Syukri Himran, Erwin Eka Putra, Nanang Roni
Abstract:
This study describes the natural convection of a viscous fluid flowing over a semi-infinite vertical plate. The governing equations for continuity, momentum, and energy are reduced to dimensionless form by introducing reference variables. The equations are then formulated with an explicit finite-difference scheme in time-dependent form, and the computations are performed by a Fortran program. The results describe the velocity and temperature profiles in both transient and steady-state conditions. An approximate value of the heat transfer coefficient and the effect of Pr on the convection flow are also presented.
Keywords: natural convection, vertical plate, velocity and temperature profiles, steady and unsteady
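The explicit finite-difference time-marching summarized above can be illustrated with a minimal sketch. The original work solves the coupled continuity, momentum, and energy equations in Fortran; the fragment below is only a simplified one-dimensional transient-diffusion analogue in Python, with the grid size, stability parameter, and step count chosen purely for illustration:

```python
def ftcs_transient(n=21, r=0.4, steps=2000):
    """March the 1D diffusion equation T_t = alpha * T_xx with the
    explicit FTCS scheme; r = alpha*dt/dx**2 must be <= 0.5 for stability."""
    T = [0.0] * n
    T[0] = 1.0          # heated wall held at dimensionless temperature 1
    for _ in range(steps):
        nxt = T[:]
        for i in range(1, n - 1):
            nxt[i] = T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        T = nxt
    return T

T = ftcs_transient()
print(T[10])  # midpoint; the steady-state profile is linear, so ~0.5
```

With r ≤ 0.5 the scheme is stable, and after enough steps the transient profile relaxes to the linear steady-state distribution between the two boundary temperatures.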
Procedia PDF Downloads 464
24275 Camptothecin Promotes ROS-Mediated G2/M Phase Cell Cycle Arrest, Resulting from Autophagy-Mediated Cytoprotection
Authors: Rajapaksha Gedara Prasad Tharanga Jayasooriya, Matharage Gayani Dilshara, Yung Hyun Choi, Gi-Young Kim
Abstract:
Camptothecin (CPT) is a quinoline alkaloid that inhibits DNA topoisomerase I and induces cytotoxicity in a variety of cancer cell lines. We previously showed that CPT effectively inhibited the invasion of prostate cancer cells, and that combined treatment with subtoxic doses of CPT and TNF-related apoptosis-inducing ligand (TRAIL) potently enhanced apoptosis in a caspase-dependent manner in hepatoma cancer cells. Here, we found that treatment with CPT caused an irreversible cell cycle arrest in the G2/M phase. CPT-induced cell cycle arrest was associated with a decrease in the protein level of cell division cycle 25C (Cdc25C) and increased levels of cyclin B and p21. The CPT-induced decrease in Cdc25C was blocked in the presence of the proteasome inhibitor MG132, which reversed the cell cycle arrest. In addition, the CPT-induced phosphorylation of Cdc25C resulted from the activation of checkpoint kinase 2 (Chk2), which was associated with the phosphorylation of ataxia telangiectasia-mutated. Interestingly, the CPT-induced G2/M phase cell cycle arrest is reactive oxygen species (ROS) dependent: the ROS inhibitors NAC and GSH reversed it. These results were further confirmed using transient knockdown of nuclear factor-erythroid 2-related factor 2 (Nrf2), since Nrf2 regulates the production of ROS. Our data reveal that siNrf2 treatment increased the ROS level and further increased the CPT-induced G2/M phase cell cycle arrest. Our data also indicate that CPT enhanced the cell cycle arrest through the extracellular signal-regulated kinase (ERK) and c-Jun N-terminal kinase (JNK) pathways: inhibitors of ERK and JNK further decreased Cdc25C expression and the protein expression of p21 and cyclin B. These findings indicate that Chk2-mediated phosphorylation of Cdc25C plays a major role in the G2/M arrest induced by CPT.
Keywords: camptothecin, cell cycle, checkpoint kinase 2, nuclear factor-erythroid 2-related factor 2, reactive oxygen species
Procedia PDF Downloads 411
24274 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys and polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people did something over the last 10 years, but why they are doing it now, whether it is undesirable, and how we can act immediately to promote change. Big data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 342
24273 Unbranched, Saturated, Carboxylic Esters as Phase-Change Materials
Authors: Anastasia Stamatiou, Melissa Obermeyer, Ludger J. Fischer, Philipp Schuetz, Jörg Worlitschek
Abstract:
This study evaluates unbranched, saturated carboxylic esters with respect to their suitability as storage media for latent heat storage applications. Important thermophysical properties are gathered both through literature research and through experimental measurements. Additionally, esters are critically evaluated against other common phase-change materials in terms of their environmental impact and economic potential. The experimental investigations cover eleven selected ester samples, with a focus on determining their melting temperature and enthalpy of fusion using differential scanning calorimetry. The Transient Hot Bridge method was used to determine the thermal conductivity of the liquid samples, while thermogravimetric analysis was employed to evaluate the 5% weight-loss temperature as well as the decomposition temperature of the non-volatile samples. Both the experimental results and the literature data reveal the high potential of esters as phase-change materials. Their good thermal and environmental properties, as well as the possibility of production from natural sources (e.g., vegetable oils), make esters very promising for future storage applications. A particularly high short-term application potential could lie in low-temperature storage applications, where the main alternative is using salt hydrates as the phase-change material.
Keywords: esters, phase-change materials, thermal properties, latent heat storage
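As a rough illustration of why the melting temperature and enthalpy of fusion measured by DSC matter for storage sizing, the heat stored by a phase-change material charged across its melting point is the sum of sensible and latent contributions, Q = m·cp,s·(Tm − T1) + m·Δhfus + m·cp,l·(T2 − Tm). A sketch with invented, order-of-magnitude values, not the measured properties of the esters in this study:

```python
def stored_heat(mass_kg, cp_solid, cp_liquid, dh_fusion, t_start, t_melt, t_end):
    """Total heat (kJ) stored when charging a PCM from t_start to t_end (degC),
    crossing the melting point t_melt: sensible (solid) + latent + sensible (liquid)."""
    q_sensible_solid = mass_kg * cp_solid * (t_melt - t_start)
    q_latent = mass_kg * dh_fusion
    q_sensible_liquid = mass_kg * cp_liquid * (t_end - t_melt)
    return q_sensible_solid + q_latent + q_sensible_liquid

# Illustrative (not measured) values for an ester-like PCM,
# cp in kJ/(kg K), enthalpy of fusion in kJ/kg:
q = stored_heat(1.0, 1.9, 2.1, 180.0, 15.0, 25.0, 35.0)
print(q)  # 1.9*10 + 180 + 2.1*10 = 220.0 kJ
```

The latent term dominates over the ±10 K swing, which is why the enthalpy of fusion is the headline property for PCM screening.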
Procedia PDF Downloads 382
24272 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
"Big data" refers to the technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets. Big data mining and analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data raises special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This paper offers an overview of the literature on big data mining and its process, along with the associated problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which affects their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the ideas behind data mining and analysis and the knowledge discovery techniques that have recently been developed, together with practical application systems. The conclusion also lists issues and difficulties for further research in the area, and the paper discusses the main big data and data mining challenges faced by management.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 82
24271 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big Data has demonstrated vast potential in streamlining operations, supporting decision-making, and spotting business trends in different fields such as manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, instruments, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and information representation. In addition, the challenges and opportunities available in the Big Data platform are outlined.
Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 277
24270 Survey on Big Data Stream Classification by Decision Tree
Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi
Abstract:
Nowadays, the development of computer technology and its recent applications provide access to new types of data that have not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements in classifying data streams using decision trees. The most important issue is maintaining a balance between accuracy and efficiency: the algorithm should provide good classification performance with a reasonable response time.
Keywords: big data, data streams, classification, decision tree
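A common way stream decision trees strike the accuracy/efficiency balance mentioned above is the Hoeffding bound used in Hoeffding (VFDT-style) trees: a node splits only once enough examples have been seen that the best attribute's observed advantage over the runner-up is statistically reliable. A minimal sketch; the gain values, range, and δ below are illustrative:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """epsilon such that the true mean lies within epsilon of the observed
    mean with probability 1 - delta, after n independent observations."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(gain_best, gain_second, value_range=1.0, delta=1e-7, n=1000):
    """Split when the observed gain advantage exceeds the bound, so the
    choice of attribute is unlikely to change as more examples stream in."""
    return (gain_best - gain_second) > hoeffding_bound(value_range, delta, n)

print(should_split(0.30, 0.10))   # clear advantage: split now
print(should_split(0.30, 0.29))   # tiny advantage: wait for more examples
```

Because the bound shrinks as n grows, the tree defers borderline decisions rather than scanning the whole (unbounded) stream, which is exactly the accuracy-versus-response-time trade-off the survey discusses.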
Procedia PDF Downloads 488
24269 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is an important data compression technique for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only one copy, a single instance, of the data to be stored; indexing of every piece of data is still maintained. Data deduplication is thus an approach for minimizing the storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a composition of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removing all types of duplicated data from the cloud.
Keywords: confidentiality, deduplication, data compression, hybridity of cloud
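The single-instance idea described above, keeping one copy and replacing duplicates with pointers back to it, can be sketched with content addressing by hash. This is an illustrative toy in Python, not the authors' Java implementation, and all names are invented:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: each unique chunk is kept once,
    and every logical file keeps only pointers (hashes) to its chunks."""
    def __init__(self):
        self.chunks = {}    # hash -> bytes (single instance of the data)
        self.index = {}     # filename -> list of chunk hashes (pointers)

    def put(self, name, chunks):
        pointers = []
        for chunk in chunks:
            h = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(h, chunk)   # store only if not seen before
            pointers.append(h)
        self.index[name] = pointers

    def get(self, name):
        return b"".join(self.chunks[h] for h in self.index[name])

store = DedupStore()
store.put("a.txt", [b"hello ", b"world"])
store.put("b.txt", [b"hello ", b"again"])   # "hello " is stored only once
print(len(store.chunks))   # 3 unique chunks kept, not 4
print(store.get("a.txt"))  # b'hello world'
```

The index preserved for every file is what the abstract means by "indexing of every piece of data is still maintained": deduplication saves space without losing the ability to reconstruct each logical copy.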
Procedia PDF Downloads 357
24268 A Review of Machine Learning for Big Data
Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.
Abstract:
Big data is now rapidly expanding in all engineering and science domains, and in many others. The potential of such large or massive data is undoubtedly significant, and making sense of it requires new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power across a wide range of applications. This paper surveys the latest advances in research on machine learning for big data processing. First, we review the machine learning techniques used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. We then focus on the challenges of machine learning for big data and their possible solutions.
Keywords: active learning, big data, deep learning, machine learning
Procedia PDF Downloads 407
24267 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights
Authors: Tomy Prihananto, Damar Apri Sudarmadi
Abstract:
Indonesia recognizes the right to privacy as a human right and provides legal protection for data management activities because the protection of personal data is part of human rights. This paper aims to describe the arrangement of data management in Indonesia. It is descriptive research with a qualitative approach, collecting data through a literature study. The result is a comprehensive arrangement of data protection whose technical requirements are met by encryption methods. The arrangements on encryption and on the protection of personal data are mutually reinforcing in protecting personal data. Indonesia has two important and immediately enacted laws that protect the privacy of information, which is part of human rights.
Keywords: Indonesia, protection, personal data, privacy, human rights, encryption
Procedia PDF Downloads 153
24266 The Various Legal Dimensions of Genomic Data
Authors: Amy Gooden
Abstract:
When human genomic data is considered, this is often done through only one dimension of the law, or the interplay between the various dimensions is not considered, providing an incomplete picture of the legal framework. This research considers and analyzes the various dimensions of South African law applicable to genomic sequence data, including property rights, personality rights, and intellectual property rights. The effective use of personal genomic sequence data requires the acknowledgement and harmonization of the rights applicable to such data.
Keywords: artificial intelligence, data, law, genomics, rights
Procedia PDF Downloads 113
24265 Big Brain: A Single Database System for a Federated Data Warehouse Architecture
Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf
Abstract:
Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and there is a need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build some of the regional data warehouses for some brands at the same time as the global one. In this paper, we present the architectural alternatives studied and explain why a custom federated approach was the recommendation for moving forward with the implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which presented many advantages: cost reduction and improved data access for global users, allowing consumers of the data to have a common data model for detailed analysis across different geographies together with a flexible layer for local specific needs in the same place.
Keywords: data integration, data warehousing, federated architecture, Online Analytical Processing (OLAP)
Procedia PDF Downloads 215
24264 A Review Paper on Data Mining and Genetic Algorithm
Authors: Sikander Singh Cheema, Jasmeen Kaur
Abstract:
In this paper, the concept of data mining is summarized, along with one of its important processes, knowledge discovery in databases (KDD). Data mining based on genetic algorithms is reviewed, and ways to perform data mining with genetic algorithms are surveyed. This paper also conducts a formal review of the area of data mining tasks and of genetic algorithms in various fields.
Keywords: data mining, KDD, genetic algorithm, descriptive mining, predictive mining
Procedia PDF Downloads 565
24263 Self-Assembled Laser-Activated Plasmonic Substrates for High-Throughput, High-Efficiency Intracellular Delivery
Authors: Marinna Madrid, Nabiha Saklayen, Marinus Huber, Nicolas Vogel, Christos Boutopoulos, Michel Meunier, Eric Mazur
Abstract:
Delivering material into cells is important for a diverse range of biological applications, including gene therapy, cellular engineering, and imaging. We present a plasmonic substrate for delivering membrane-impermeable material into cells at high throughput and high efficiency while maintaining cell viability. The substrate fabrication is based on an affordable and fast colloidal self-assembly process. When illuminated with a femtosecond laser, the light interacts with the electrons at the surface of the metal substrate, creating localized surface plasmons that form bubbles via energy dissipation in the surrounding medium. These bubbles come into close contact with the cell membrane to form transient pores and enable entry of membrane-impermeable material via diffusion. We use fluorescence microscopy and flow cytometry to verify delivery of membrane-impermeable material into HeLa CCL-2 cells. We show delivery efficiency and cell viability data for a range of membrane-impermeable cargo, including dyes and biologically relevant material such as siRNA. We estimate the effective pore size by determining the delivery efficiency for hard fluorescent spheres with diameters ranging from 20 nm to 2 µm. To provide insight into the cell poration mechanism, we relate the poration data to pump-probe measurements of micro- and nano-bubble formation on the plasmonic substrate. Finally, we investigate substrate stability and reusability by using scanning electron microscopy (SEM) to inspect for damage on the substrate after laser treatment. SEM images show no visible damage. Our findings indicate that self-assembled plasmonic substrates are an affordable tool for high-throughput, high-efficiency delivery of material into mammalian cells.
Keywords: femtosecond laser, intracellular delivery, plasmonic, self-assembly
Procedia PDF Downloads 507
24262 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring
Authors: Seung-Lock Seo
Abstract:
This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but during unusual special events or faults it is generally difficult to collect process information, and some noisy data from industrial processes are almost impossible to analyze. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data helps eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated on discrete batch process data. The results showed that the monitoring performance improved significantly in terms of the success rate of detecting the given process faults.
Keywords: data mining, process data, monitoring, safety, industrial processes
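The noise-filtering step mentioned above can be as simple as a moving-average pre-filter applied before a threshold-based monitor. A toy sketch; the window size, control limit, and the batch signal below are illustrative, not the paper's scheme:

```python
def moving_average(signal, window=5):
    """Simple pre-filter: average over a sliding window to suppress noise."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def monitor(signal, limit):
    """Flag samples whose filtered value exceeds the control limit."""
    return [i for i, v in enumerate(moving_average(signal)) if v > limit]

# A noisy batch signal with a genuine mean shift starting at sample 10:
raw = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.0,
       1.2, 1.0, 1.3, 1.1, 1.2]
print(monitor(raw, limit=0.8))
```

The filter trades a small detection delay (the shift is flagged a few samples late) for robustness against isolated noisy spikes, which is the accuracy/false-alarm balance the monitoring study is about.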
Procedia PDF Downloads 370
24261 A Fault-Tolerant Full Adder in Double Pass CMOS Transistor
Authors: Abdelmonaem Ayachi, Belgacem Hamdi
Abstract:
This paper presents a fault-tolerant implementation of adder schemes using the dual duplication code. To prove the efficiency of the proposed method, the circuit is simulated in 32 nm double-pass-transistor CMOS technology, and some transient faults are deliberately injected into the layout of the circuit. This fully differential implementation requires only 20 transistors, which means the proposed design achieves a 28.57% saving in transistor count compared to standard CMOS technology.
Keywords: digital electronics, integrated circuits, full adder, 32 nm CMOS technology, double pass transistor technology, fault tolerance, self-checking
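The duplication-and-compare principle behind such self-checking adders can be illustrated at the behavioural level: compute the sum and carry twice and flag any disagreement as a detected transient fault. This sketch models the logic only, not the 20-transistor double-pass-transistor circuit:

```python
def full_adder(a, b, cin):
    """Behavioural full adder: returns (sum, carry) for one-bit inputs."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def self_checking_adder(a, b, cin, fault=None):
    """Duplicate the adder and compare both copies with a checker.
    'fault', if given, flips one output of copy 2 to model a transient."""
    out1 = full_adder(a, b, cin)
    out2 = list(full_adder(a, b, cin))
    if fault is not None:
        out2[fault] ^= 1           # inject a transient fault into copy 2
    error = tuple(out2) != out1    # the checker raises an error flag
    return out1, error

print(self_checking_adder(1, 1, 0))          # ((0, 1), False) -- fault-free
print(self_checking_adder(1, 1, 0, fault=0)) # ((0, 1), True)  -- fault detected
```

In the paper's circuit, the comparison is done in hardware on complementary (dual) rails rather than in software, but the detection property is the same: any single transient that disturbs one copy makes the two copies disagree.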
Procedia PDF Downloads 316
24260 A Survey of Semantic Integration Approaches in Bioinformatics
Authors: Chaimaa Messaoudi, Rachida Fissoune, Hassan Badir
Abstract:
Technological advances in computer science and data analysis are continuously providing huge volumes of biological data, which are available on the web. Such advances involve and require powerful techniques for data integration to extract pertinent knowledge and information for a specific question. Biomedical exploration of these big data often requires the use of complex queries across multiple autonomous, heterogeneous, and distributed data sources. Semantic integration is an active area of research in several disciplines, such as databases, information integration, and ontology. We provide a survey of approaches and techniques for integrating biological data, focusing on those developed in the ontology community.
Keywords: biological ontology, linked data, semantic data integration, semantic web
Procedia PDF Downloads 417
24259 Left Atrial Appendage Occlusion vs Oral Anticoagulants in Atrial Fibrillation and Coronary Stenting. The DESAFIO Registry
Authors: José Ramón López-Mínguez, Estrella Suárez-Corchuelo, Sergio López-Tejero, Luis Nombela-Franco, Xavier Freixa-Rofastes, Guillermo Bastos-Fernández, Xavier Millán-Álvarez, Raúl Moreno-Gómez, José Antonio Fernández-Díaz, Ignacio Amat-Santos, Tomás Benito-González, Fernando Alfonso-Manterola, Pablo Salinas-Sanguino, Pedro Cepas-Guillén, Dabit Arzamendi, Ignacio Cruz-González, Juan Manuel Nogales-Asensio
Abstract:
Background and objectives: The treatment of patients with non-valvular atrial fibrillation (NVAF) who need coronary stenting is challenging. The objective of the study was to determine whether left atrial appendage occlusion (LAAO) could be a feasible option and benefit these patients. To this end, we studied the impact of LAAO plus antiplatelet drugs vs oral anticoagulants (OAC, including direct OAC) plus antiplatelet drugs on these patients' long-term outcomes. Methods: The results of 207 consecutive patients with NVAF who underwent coronary stenting were analyzed. A total of 146 patients were treated with OAC (75 with acenocoumarol, 71 with direct OAC), while 61 underwent LAAO. The median follow-up was 35 months. Patients also received antiplatelet therapy as prescribed by their cardiologist. The study received the proper ethical oversight. Results: Age (mean, 75.7 years) and past medical history of stroke were similar in both groups. However, the LAAO group had more unfavorable characteristics (history of coronary artery disease, CHA2DS2-VASc score, significant bleeding [BARC ≥ 2], and HAS-BLED score). The occurrence of major adverse events (death, stroke/transient ischemic attack, major bleeding) and of major cardiovascular events (cardiac death, stroke/transient ischemic attack, and myocardial infarction) was significantly higher in the OAC group than in the LAAO group: 19.75% vs 9.06% (HR, 2.18; P = .008) and 6.37% vs 1.91% (HR, 3.34; P = .037), respectively. Conclusions: In patients with NVAF undergoing coronary stenting, LAAO plus antiplatelet therapy produced better long-term outcomes than OAC plus antiplatelet therapy, despite the unfavorable baseline characteristics of the LAAO group.
Keywords: stents, atrial fibrillation, anticoagulants, left atrial appendage occlusion
Procedia PDF Downloads 23
24258 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture
Authors: Thrivikraman Aswathi, S. Advaith
Abstract:
As there can be cases where the use of real data is limited, for example when it is hard to get access to a large volume of real data, we need synthetic data generation, which produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS are now being gathered more and more often in various real-world systems. The GAN-generated MTS data is then fed into a transformer-based deep learning architecture that carries out the data categorization into predefined classes. The model is further evaluated across various distinct domains by generating the corresponding MTS data.
Keywords: GAN, transformer, classification, multivariate time series
Procedia PDF Downloads 91
24257 Generative AI: A Comparison of Conditional Tabular Generative Adversarial Networks and Conditional Tabular Generative Adversarial Networks with Gaussian Copula in Generating Synthetic Data with Synthetic Data Vault
Authors: Lakshmi Prayaga, Chandra Prayaga, Aaron Wade, Gopi Shankar Mallu, Harsha Satya Pola
Abstract:
Synthetic data generated by generative adversarial networks and autoencoders is becoming more common as a way to combat the problem of insufficient data for research purposes. However, generating synthetic data is a tedious task requiring an extensive mathematical and programming background. Open-source platforms such as the Synthetic Data Vault (SDV) and Mostly AI offer user-friendly platforms, accessible to non-technical professionals, for generating synthetic data to augment existing data for further analysis. The SDV also provides additions to the generic GAN, such as the Gaussian copula. We present the results from two synthetic data sets (CTGAN data and CTGAN with Gaussian copula) generated by the SDV and report the findings. The results indicate that the ROC and AUC curves for the data generated by adding the Gaussian copula layer are much higher than those for the data generated by the plain CTGAN.
Keywords: synthetic data generation, generative adversarial networks, conditional tabular GAN, Gaussian copula
Procedia PDF Downloads 34
24256 Dynamic Analysis of the Heat Transfer in the Magnetically Assisted Reactor
Authors: Tomasz Borowski, Dawid Sołoducha, Rafał Rakoczy, Marian Kordas
Abstract:
The application of a magnetic field is essential for a wide range of technologies and processes (e.g., magnetic hyperthermia, bioprocessing). From a practical point of view, bioprocess control is often limited to the regulation of temperature at constant values favourable to microbial growth. The main aim of this study is to determine the effect of various types of electromagnetic fields (static or alternating) on the heat transfer in a self-designed magnetically assisted reactor. The experimental set-up is equipped with a measuring instrument which controls the temperature of the liquid inside the container and supervises the real-time acquisition of all the experimental data coming from the sensors. Temperature signals are also sampled from the generator of the magnetic field. The obtained temperature profiles were mathematically described and analyzed, and the parameters characterizing the response of a first-order dynamic system to a step input were obtained and discussed. For example, a higher value of the time constant means a slower increase of the signal (in this case, the temperature); after a period equal to about five time constants, the sample temperature nearly reaches its asymptotic value. This dynamical analysis allowed us to understand the heating effect under the action of various types of electromagnetic fields. Moreover, the proposed mathematical description can be used to compare the influence of different types of magnetic fields on heat transfer operations.
Keywords: heat transfer, magnetically assisted reactor, dynamical analysis, transient function
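The step response of a first-order system discussed above is T(t) = T∞ + (T0 − T∞)·e^(−t/τ); after five time constants the response has covered over 99% of the step, which is the "nearly reaches its asymptotic value" observation. A sketch with illustrative temperatures and time constant (not the reactor's measured parameters):

```python
import math

def first_order_response(t, t0, t_inf, tau):
    """Temperature of a first-order system at time t after a step input."""
    return t_inf + (t0 - t_inf) * math.exp(-t / tau)

t0, t_inf, tau = 20.0, 60.0, 120.0   # illustrative values: degC, degC, s
for k in (1, 3, 5):
    T = first_order_response(k * tau, t0, t_inf, tau)
    completed = (T - t0) / (t_inf - t0)
    print(f"after {k} time constants: {completed:.1%} of the step")
```

Fitting τ (and the gain) to the measured profiles is what lets the study compare heating under different field types on a common quantitative footing.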
Procedia PDF Downloads 147
24255 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Transient ischaemic attacks (TIAs) are warning signs of future strokes. TIA patients are at increased risk of stroke and cardiovascular events after a first episode. A majority of studies on TIA have focused on the occurrence of these ancillary events after a TIA; long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of a TIA, using anonymised electronic health records (EHRs). We conducted a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls on age, sex, and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model that included both scale and shape effects and a random frailty effect of GP practice. In total, 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI 2.91-3.18). The HRs for cases aged 61-70 years, 71-76 years, and 77+ years were 1.98 (1.55-2.30), 1.79 (1.20-2.07), and 1.52 (1.15-1.97) compared to matched controls. Aspirin provided long-term survival benefits to cases: cases aged 39-60 years on aspirin had HRs of 0.93 (0.84-1.00), 0.90 (0.82-0.98), and 0.88 (0.80-0.96) at 5, 10, and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls.
Our study highlights the excess long-term risk of death in TIA patients and cautions that TIA should not be treated as a benign condition. The study further recommends aspirin as the better option for secondary prevention in TIA patients, compared with the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
Keywords: dual antiplatelet therapy (DAPT), general practice, multiple imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model
Procedia PDF Downloads 119
24254 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations
Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso
Abstract:
Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences all begins with loss of containment: oil and gas released through leakage or spillage from primary containment can result in pool fire, jet fire, and even explosion when it meets the various ignition sources present in operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard in this case is an early detection system that alerts Operations to take action before a potential loss of containment occurs. The value of a detection system increases when it is applied to a long surface pipeline, which is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting a traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow, and temperature (PVT) points along the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive, and hence generally not a viable alternative from an economic standpoint. A conceptual approach that combines mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results in predicting leakage in the pipeline. Mathematical modeling is used to generate simulation data, and this data is used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable with experimental data at very high levels of accuracy.
While supervised machine learning models require a large training dataset to achieve accuracy, mathematical modeling has been shown to be able to generate the required datasets, justifying the application of data analytics to the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for using data analytics tools and mathematical modeling to develop a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
Keywords: pipeline, leakage, detection, AI
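The simulate-then-train idea described in this abstract can be sketched in a few lines. Everything below is an illustrative assumption, not the authors' actual model: a toy linear-gradient "simulator" stands in for the CFD runs, and a nearest-centroid classifier stands in for the supervised learner that localizes the leak.

```python
import random

random.seed(0)

SEGMENTS = 10                     # pipeline discretized into segments (assumed)
P_INLET, P_OUTLET = 50.0, 30.0    # assumed boundary pressures [bar]

def simulate_profile(leak_segment=None, noise=0.2):
    """Toy stand-in for a CFD run: linear pressure gradient along the
    pipeline, with an extra pressure drop downstream of a leak."""
    profile = []
    for i in range(SEGMENTS):
        p = P_INLET + (P_OUTLET - P_INLET) * i / (SEGMENTS - 1)
        if leak_segment is not None and i >= leak_segment:
            p -= 2.0              # assumed extra drop past the leak point
        profile.append(p + random.gauss(0, noise))
    return profile

# "Training set" generated from the simulator: one class per leak location
centroids = {}
for seg in range(SEGMENTS):
    runs = [simulate_profile(seg) for _ in range(50)]
    centroids[seg] = [sum(col) / len(col) for col in zip(*runs)]

def localize(profile):
    """Nearest-centroid classifier: return the most likely leak segment."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda seg: dist(profile, centroids[seg]))

predicted = localize(simulate_profile(leak_segment=6))
```

A real system would replace the toy simulator with transient CFD output and the centroid classifier with a trained model, but the data flow (simulate, train, localize from live sensor readings) is the same.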
Procedia PDF Downloads 146
24253 Numerical Simulation of Sloshing Control Using Input Shaping
Authors: Dongjoo Kim
Abstract:
Effective control of sloshing in a liquid container is an important issue in many applications. In this study, numerical simulations are performed to design the velocity profile of a rectangular container and to investigate the effectiveness of input shaping for sloshing control. Trapezoidal container-velocity profiles are chosen as reference commands and are convolved with a series of impulses to generate shaped commands that induce minimal residual oscillations. The performance of several input shapers is compared in terms of the transient peak and residual oscillations of sloshing. Results show that sloshing can be effectively controlled by input shaping (supported by the NRF programs, NRF-2015R1D1A1A01059675, of the Korean government).
Keywords: input shaping, rectangular container, sloshing, trapezoidal profile
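The convolution of a trapezoidal reference with an impulse series can be illustrated with a minimal zero-vibration (ZV) shaper, the simplest member of the input-shaper family the abstract compares. The slosh frequency, damping ratio and profile parameters below are assumed values for illustration, not the paper's:

```python
import math

# Assumed slosh-mode parameters for illustration
f_n = 1.0            # first slosh mode natural frequency [Hz]
zeta = 0.02          # damping ratio
dt = 0.01            # time step [s]

# ZV shaper: two impulses, the second delayed by half a damped period,
# with amplitudes set by the damping ratio so residual vibration cancels
K = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
A1, A2 = 1 / (1 + K), K / (1 + K)
T_half = 1 / (2 * f_n * math.sqrt(1 - zeta**2))   # half damped period [s]
delay = round(T_half / dt)

def trapezoid(t_accel, t_flat, v_max):
    """Trapezoidal velocity reference sampled at dt."""
    n_a, n_f = round(t_accel / dt), round(t_flat / dt)
    up = [v_max * i / n_a for i in range(n_a)]
    return up + [v_max] * n_f + up[::-1] + [0.0]

def shape(cmd):
    """Convolve the command with the two ZV impulses."""
    out = [0.0] * (len(cmd) + delay)
    for i, v in enumerate(cmd):
        out[i] += A1 * v
        out[i + delay] += A2 * v
    return out

ref = trapezoid(0.5, 1.0, 0.2)
shaped = shape(ref)
```

The shaped command is slightly longer than the reference (by the shaper delay) and never exceeds the reference peak, which is the usual trade-off: a small move-time penalty in exchange for minimal residual oscillation.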
Procedia PDF Downloads 232
24252 A Privacy Protection Scheme Supporting Fuzzy Search for NDN Routing Cache Data Name
Authors: Feng Tao, Ma Jing, Guo Xian, Wang Jing
Abstract:
Named Data Networking (NDN) replaces the IP addresses of traditional networks with data names and adopts a dynamic cache mechanism. In the existing mechanism, however, only one-to-one search can be achieved because every data item has a unique name corresponding to it. There is a certain mapping relationship between data content and data name, so if the data name is intercepted by an adversary, the privacy of the data content and the user's interest can hardly be guaranteed. To solve this problem, this paper proposes a one-to-many fuzzy search scheme based on order-preserving encryption that reduces the query overhead by optimizing the caching strategy. In this scheme, hash values are used to keep the user's query safe from each node in the search process, and likewise to protect the privacy of the requested data content.
Keywords: NDN, order-preserving encryption, fuzzy search, privacy
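To illustrate the one-to-many matching problem the scheme addresses, here is a toy cache node that indexes content under hashed name-prefix chains, so a single hashed prefix query matches many cached items without the node ever seeing plaintext names. Note this sketch substitutes per-component hashing for the paper's order-preserving encryption, and the class and name layout are invented for illustration:

```python
import hashlib

def h(token: str) -> str:
    """Hash one name component so cache nodes never see plaintext."""
    return hashlib.sha256(token.encode()).hexdigest()[:16]

class CacheNode:
    """Toy NDN cache: content is indexed under every hashed name prefix,
    enabling one-to-many lookup over hidden names."""
    def __init__(self):
        self.index = {}   # hashed prefix -> set of full hashed names
        self.store = {}   # full hashed name -> content

    def insert(self, name: str, content: bytes):
        parts = name.strip('/').split('/')
        full = '/'.join(h(p) for p in parts)
        self.store[full] = content
        for i in range(1, len(parts) + 1):
            prefix = '/'.join(h(p) for p in parts[:i])
            self.index.setdefault(prefix, set()).add(full)

    def fuzzy_search(self, prefix: str):
        """Return every cached item whose name starts with the prefix."""
        parts = prefix.strip('/').split('/')
        key = '/'.join(h(p) for p in parts)
        return [self.store[n] for n in sorted(self.index.get(key, ()))]

node = CacheNode()
node.insert('/video/movieA/seg1', b'a1')
node.insert('/video/movieA/seg2', b'a2')
node.insert('/video/movieB/seg1', b'b1')
hits = node.fuzzy_search('/video/movieA')
```

Plain hashing, unlike order-preserving encryption, supports only exact prefix matches; the paper's contribution is precisely to allow fuzzier range-style matching, which this sketch does not reproduce.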
Procedia PDF Downloads 449
24251 Healthcare Big Data Analytics Using Hadoop
Authors: Chellammal Surianarayanan
Abstract:
The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory and pharmacy systems, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data accumulates, and its variety: structured, semi-structured and unstructured. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop are the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare
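The MapReduce model described above can be sketched with a locally simulated map-shuffle-reduce pass over hypothetical patient records. On a real cluster this would run as a Hadoop job across many machines; here the record layout (patient id, department, length of stay) and the aggregation are invented for illustration:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical EPR lines: patient_id, department, length_of_stay (days)
records = [
    "p001,cardiology,5",
    "p002,oncology,12",
    "p003,cardiology,3",
    "p004,oncology,7",
    "p005,cardiology,2",
]

def mapper(line):
    """Map phase: emit (department, length_of_stay) key-value pairs."""
    _, dept, los = line.split(',')
    yield dept, int(los)

def reducer(dept, values):
    """Reduce phase: average length of stay per department."""
    vals = list(values)
    return dept, sum(vals) / len(vals)

# Local stand-in for the shuffle/sort step between map and reduce:
# sorting groups all values for a key together, as Hadoop would
pairs = sorted(kv for line in records for kv in mapper(line))
result = dict(
    reducer(dept, (v for _, v in group))
    for dept, group in groupby(pairs, key=itemgetter(0))
)
```

The same mapper/reducer pair, written against the Hadoop streaming or Java API, would scale to record volumes that no single machine could sort in memory; only the shuffle step changes, not the logic.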
Procedia PDF Downloads 381
24250 Seismic Impact and Design on Buried Pipelines
Authors: T. Schmitt, J. Rosin, C. Butenweg
Abstract:
Seismic design of buried pipeline systems for energy and water supply is important not only for plant and operational safety, but in particular for maintaining the supply infrastructure after an earthquake. Past earthquakes have shown the vulnerability of pipeline systems. After the Kobe earthquake in Japan in 1995, for instance, the water supply in some regions was interrupted for almost two months. The present paper discusses particular issues of seismic wave impacts on buried pipelines, describes calculation methods, proposes approaches and gives calculation examples. Buried pipelines are exposed to different effects of seismic impacts. This paper considers the effects of transient displacement differences and the resulting stresses within the pipeline due to the wave propagation of the earthquake. Other effects are permanent displacements due to fault rupture displacements at the surface, soil liquefaction, landslides and seismic soil compaction. The presented model can also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of several parameters such as incoming wave angle, wave velocity, soil depth and selected displacement time histories. In the computer model, the interaction between the pipeline and the surrounding soil is modeled with non-linear soil springs. A propagating wave is simulated affecting the pipeline at discrete points, independently in time and space. The resulting stresses are mainly caused by displacement differences between neighboring pipeline segments and by soil-structure interaction. The calculation examples focus on pipeline bends as the most critical parts. Special attention is given to the calculation of long-distance heat pipeline systems. Here, expansion bends are arranged at regular distances to accommodate movements of the pipeline due to high temperature.
Such expansion bends are usually designed with small bending radii, which in the event of an earthquake lead to high bending stresses at the cross-section of the pipeline. Therefore, von Kármán flexibility factors, as well as the stress intensification factors for curved pipe sections, must be taken into account. The seismic verification of the pipeline for wave propagation in the soil can be achieved by checking against normative strain criteria. Finally, an interpretation of the results and recommendations are given, taking into account the most critical parameters.
Keywords: buried pipeline, earthquake, seismic impact, transient displacement
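A common hand-calculation companion to the FE approach above is the friction-limited axial strain estimate for wave propagation: the free-field ground strain is the peak ground velocity over the apparent wave speed, capped by how much force the soil friction can transfer to the pipe before slip. All parameter values below are assumed for illustration and are not taken from the paper's model:

```python
import math

# Illustrative parameters for a buried steel pipeline (assumed values)
E = 210e9            # Young's modulus of steel [Pa]
D, t = 0.6, 0.01     # outer diameter and wall thickness [m]
A = math.pi * (D - t) * t     # thin-walled cross-section area [m^2]

V = 0.5              # peak ground velocity [m/s]
c = 500.0            # apparent wave propagation velocity [m/s]
f = 1.0              # dominant wave frequency [Hz]
t_u = 40e3           # ultimate axial soil friction per unit length [N/m]

# Free-field ground strain induced by the travelling wave
eps_ground = V / c

# Soil friction can transfer at most t_u over a quarter wavelength,
# which caps the strain the pipe can pick up before it slips
wavelength = c / f
eps_slip = t_u * wavelength / (4 * E * A)

eps_pipe = min(eps_ground, eps_slip)
sigma = E * eps_pipe          # axial stress in the pipe [Pa]
```

With these assumed numbers the ground strain governs, giving roughly 210 MPa of axial stress, which is then checked against the normative strain criteria mentioned in the abstract; bends would additionally require the flexibility and stress intensification factors.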
Procedia PDF Downloads 162
24249 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments
Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo
Abstract:
Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as Business Sponsor Disorder, Business Acceptance Disorder, Cultural/Political Disorder, Data Disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses and medical record staff who were working at the time of the study. We also asked their views about the symptoms of, and treatments for, any data disorders they mentioned in the questionnaire. We analyzed the answers using qualitative methods. Results: After classifying the answers, we found six main data disorders: incomplete data, missed data, late data, blurred data, manipulated data and illegible data. The majority of participants believed in their own important roles in the treatment of data disorders, while others attributed the disorders to health system problems. Discussion: As clinicians play important roles in producing data, they can easily identify symptoms and disorders of patient data. Health information managers can also play important roles in the early detection of data disorders through proactive monitoring and periodic check-ups of data.
Keywords: data disorders, quality, healthcare, treatment
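The "periodic check-ups of data" suggested above are easy to automate for at least the completeness-related disorders in the paper's taxonomy. This sketch flags incomplete records; the record layout and field names are invented for illustration, and other disorders (illegible, manipulated, late data) would need very different checks:

```python
from datetime import date

# Hypothetical patient records; None marks a missing field
records = [
    {"id": 1, "name": "A", "dob": date(1980, 5, 1), "diagnosis": "J45"},
    {"id": 2, "name": "B", "dob": None,             "diagnosis": "I10"},
    {"id": 3, "name": "C", "dob": date(1975, 2, 9), "diagnosis": None},
]

REQUIRED = ("name", "dob", "diagnosis")

def diagnose(record):
    """Return the list of required fields missing from one record,
    i.e. the symptoms of an 'incomplete data' disorder."""
    return [f for f in REQUIRED if record[f] is None]

# A periodic check-up: map each record id to its missing fields
report = {r["id"]: diagnose(r) for r in records}
```

Run on a schedule, such a report lets health information managers detect the disorder early and route the record back to the clinician who can treat it.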
Procedia PDF Downloads 406
24248 Big Data and Analytics in Higher Education: An Assessment of Its Status, Relevance and Future in the Republic of the Philippines
Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay
Abstract:
One of the unique challenges the twenty-first century poses to Philippine higher education is the utilization of Big Data. The higher education system in the Philippines is generating burgeoning amounts of data that can be used to produce the information and knowledge needed for accurate, data-driven decision making. This study examines the status, relevance and future of Big Data and Analytics in Philippine higher education. The insights gained from the study may be relevant to other developing nations similarly situated as the Philippines.
Keywords: big data, data analytics, higher education, Republic of the Philippines, assessment
Procedia PDF Downloads 311
24247 Data Management and Analytics for Intelligent Grid
Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh
Abstract:
Two decades ago, power distribution utilities would collect data from their customers no more often than once a month. The advent of the Smart Grid and AMI has since increased the sampling frequency, leading to a 1,000- to 10,000-fold increase in data quantity. This increase is notable and has led to the coining of the term Big Data in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. As the majority of utilities around the globe adopt Smart Grid technologies in mass implementations and focus primarily on the strategic interdependence and synergies of the big data coming from new information sources like AMI and intelligent SCADA, there is a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.
Keywords: data management, analytics, energy data analytics, smart grid, smart utilities
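Load disaggregation, the paper's stated goal, can be illustrated with a deliberately naive sketch: given one aggregate smart-meter reading, find the subset of known appliance power signatures that best explains it. The signature values and brute-force subset search are illustrative assumptions; practical non-intrusive load monitoring uses transient features and probabilistic models rather than exhaustive matching:

```python
from itertools import combinations

# Assumed steady-state power signatures [W] per appliance
signatures = {"fridge": 150, "heater": 1000, "tv": 120, "washer": 500}

def disaggregate(total_watts, tolerance=10):
    """Brute-force sketch: find the appliance subset whose summed
    signature best matches the aggregate meter reading, or None if
    no subset comes within the tolerance."""
    best, best_err = (), float("inf")
    names = list(signatures)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            err = abs(total_watts - sum(signatures[n] for n in combo))
            if err < best_err:
                best, best_err = combo, err
    return sorted(best) if best_err <= tolerance else None

on_now = disaggregate(1270)   # fridge + heater + tv sum to 1270 W here
```

The exponential subset search is fine for a handful of appliances but motivates exactly the analytics investment the abstract calls for once AMI delivers high-frequency readings for thousands of customers.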
Procedia PDF Downloads 754