Search results for: bioinformatics pipeline
243 An Integrated Web-Based Workflow System for Design of Computational Pipelines in the Cloud
Authors: Shuen-Tai Wang, Yu-Ching Lin
Abstract:
As more and more workflow systems adopt the cloud as their execution environment, various challenges arise that need to be addressed for the cloud to be utilized efficiently. This paper introduces a method for resource provisioning based on our previous research on dynamic allocation and its pipeline processes. We present an abstraction for workload scheduling in which independent tasks are scheduled among the available processors of a distributed computing environment for optimization. We also propose an integrated web-based workflow designer that takes advantage of HTML5 technology and chains together multiple tools. To allow multiple pipelines to execute on the cloud in parallel, we develop a script translator and an execution engine for workflow management in the cloud. All information is known in advance by the workflow engine, and tasks are allocated according to the prior knowledge in the repository. The proposed effort has the potential to provide support for process definition, workflow enactment and monitoring of workflow processes. Users would benefit from a web-based system that allows the creation and execution of pipelines without scripting knowledge.
Keywords: workflow systems, resource provisioning, workload scheduling, web-based, workflow engine
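As a minimal illustration of the kind of independent-task scheduling described in the abstract, the sketch below uses a simple longest-processing-time heuristic; the task names, run-time estimates and processor count are assumptions, not the paper's actual scheduler or data model.

```python
import heapq

def schedule_independent_tasks(task_durations, n_processors):
    """Greedy longest-processing-time (LPT) assignment of independent tasks.

    task_durations: dict mapping task name -> estimated run time
                    (the 'prior knowledge in the repository').
    Returns a mapping processor index -> list of assigned tasks.
    """
    # Min-heap of (current load, processor index)
    loads = [(0.0, p) for p in range(n_processors)]
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_processors)}

    # Place the longest tasks first on the least-loaded processor.
    for task, duration in sorted(task_durations.items(),
                                 key=lambda kv: kv[1], reverse=True):
        load, proc = heapq.heappop(loads)
        assignment[proc].append(task)
        heapq.heappush(loads, (load + duration, proc))
    return assignment

# Hypothetical pipeline steps with estimated run times (minutes).
tasks = {"qc": 5, "align": 40, "sort": 12, "call_variants": 30, "annotate": 8}
print(schedule_independent_tasks(tasks, n_processors=2))
```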
242 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform
Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba
Abstract:
Real-time image and video processing is in demand in many computer vision applications, e.g. video surveillance, traffic management and medical imaging. Processing these video applications requires high computational power. Therefore, the optimal solution is the collaboration of the CPU with hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of an image and video processing pipeline. Our approach targets offloading the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration. CPU utilization drops and the frame rate reaches 60 fps on a 1080p full HD input video stream.
Keywords: high level synthesis, canny edge detection, hardware accelerators, computer vision
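For reference, the block being offloaded corresponds to the standard Canny pipeline (smoothing, gradient computation, non-maximum suppression, hysteresis thresholding). A minimal software reference using OpenCV might look like the following; the threshold values and file name are illustrative assumptions, not figures from the paper.

```python
import cv2

def canny_reference(frame_bgr, low_thresh=50, high_thresh=150):
    """Software reference for the Canny block offloaded to the PL in the paper."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.4)        # noise suppression
    edges = cv2.Canny(blurred, low_thresh, high_thresh)  # gradients + NMS + hysteresis
    return edges

# Example: process one frame of a 1080p video stream.
cap = cv2.VideoCapture("input_1080p.mp4")  # hypothetical file name
ok, frame = cap.read()
if ok:
    edge_map = canny_reference(frame)
```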
241 Investigating the Successes of in vitro Embryogenesis
Authors: Zelikha Labbani
Abstract:
In vitro isolated microspore culture is the most powerful androgenic pathway for producing doubled haploid plants in a short time. To deviate a microspore toward embryogenesis, a number of factors, different for each species, must concur at the same time and place. Once induced, the microspore undergoes numerous changes at different levels, from overall morphology to gene expression. Induction of microspore embryogenesis implies not only the expression of an embryogenic program, but also a stress-related cellular response and a repression of the gametophytic program that reverts the microspore to a totipotent status. As haploid single cells, microspores have become a strategy for achieving various objectives, particularly in genetic engineering. In this communication we present the most recent advances in producing haploid embryos via in vitro isolated microspore culture.
Keywords: in vitro isolated microspore culture, success, haploid cells, bioinformatics, biomedicine
240 A Comprehensive Analysis of LACK (Leishmania Homologue of Receptors for Activated C Kinase) in the Context of Visceral Leishmaniasis
Authors: Sukrat Sinha, Abhay Kumar, Shanthy Sundaram
Abstract:
The Leishmania homologue of receptors for activated C kinase (LACK) is a known T-cell epitope from soluble Leishmania antigens (SLA) that confers protection against Leishmania challenge. This antigen has been found to be highly conserved among Leishmania strains, and LACK has been shown to be protective against L. donovani challenge. A comprehensive analysis of several LACK sequences was completed. The analysis shows a high level of conservation, lower variability and higher antigenicity in specific portions of the LACK protein. This information provides insights into the potential of LACK as a putative vaccine target against visceral leishmaniasis.
Keywords: bioinformatics, genome assembly, leishmania activated protein kinase c (lack), next-generation sequencing
239 A Study on Big Data Analytics, Applications and Challenges
Authors: Chhavi Rana
Abstract:
The aim of this paper is to highlight existing developments in the field of big data analytics. Applications such as bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data that are hard to organise and analyse, and that can be dealt with using the frameworks and models of this field of study. An organization's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently produce better outcomes for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks used in the analysis process with different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.
Keywords: big data, big data analytics, machine learning, review
238 A Study on Big Data Analytics, Applications, and Challenges
Authors: Chhavi Rana
Abstract:
The aim of this paper is to highlight existing developments in the field of big data analytics. Applications such as bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data that are hard to organise and analyse, and that can be dealt with using the frameworks and models of this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently produce better outcomes for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks used in the analysis process with different machine learning techniques. Finally, the paper concludes by stating different challenges and issues raised in existing research.
Keywords: big data, big data analytics, machine learning, review
237 Genomics of Aquatic Adaptation
Authors: Agostinho Antunes
Abstract:
The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms also provide the framework to better understand the genetic makeup of these species and related ones, allowing us to explore the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection might have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.
Keywords: comparative genomics, adaptive evolution, bioinformatics, phylogenetics, genome mining
236 Sewer Culvert Installation Method to Accommodate Underground Construction in an Urban Area with Narrow Streets
Authors: Osamu Igawa, Hiroshi Kouchiwa, Yuji Ito
Abstract:
In recent years, a reconstruction project for sewer pipelines has been progressing in Japan with the aim of renewing old sewer culverts. However, it is difficult to secure a sufficient base area for shafts in an urban area because many streets are narrow with a complex layout. As a result, construction in such urban areas is generally very demanding. In urban areas, there is a strong requirement for a safe, reliable and economical construction method that does not disturb the public's daily life and urban activities. With this in mind, we developed a new construction method called the 'shield switching type micro-tunneling method', which integrates the micro-tunneling method and the shield method. In this method, the pipeline is constructed first for sections that are gently curved or straight, using the economical micro-tunneling method, and the method is then switched to the shield method for sections with a sharp curve or a series of curves, without establishing an intermediate shaft. This paper provides information, features and construction examples of this newly developed method.
Keywords: micro-tunneling method, secondary lining applied RC segment, sharp curve, shield method, switching type
235 Genomics of Adaptation in the Sea
Authors: Agostinho Antunes
Abstract:
The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms also provide the framework to better understand the genetic makeup of these species and related ones, allowing us to explore the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection might have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.
Keywords: marine genomics, evolutionary bioinformatics, human genome sequencing, genomic analyses
234 Protein Tertiary Structure Prediction by a Multiobjective Optimization and Neural Network Approach
Authors: Alexandre Barbosa de Almeida, Telma Woerle de Lima Soares
Abstract:
Protein structure prediction is a challenging task in the bioinformatics field. The biological function of all proteins relies largely on the shape of their three-dimensional conformational structure, yet less than 1% of all known proteins in the world have their structure solved. This work proposes a deep learning model to address this problem, attempting to predict some aspects of protein conformations. Through a process of multiobjective dominance, a recurrent neural network was trained to abstract the particular bias of each individual multiobjective algorithm, generating a heuristic that could be useful for predicting some of the relevant aspects of the three-dimensional conformation formation process, known as protein folding.
Keywords: Ab initio heuristic modeling, multiobjective optimization, protein structure prediction, recurrent neural network
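The abstract does not specify the network's inputs or targets; as a purely illustrative sketch under that caveat, a recurrent model mapping an amino-acid sequence to per-residue backbone torsion angles (a common proxy for conformational aspects) could look like the following, where all layer sizes and the task itself are assumptions rather than the authors' setup.

```python
import torch
import torch.nn as nn

class TorsionRNN(nn.Module):
    """Illustrative recurrent model: amino-acid sequence -> per-residue (phi, psi) angles."""
    def __init__(self, n_amino_acids=20, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_amino_acids, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden_dim, 2)  # predict (phi, psi) per residue

    def forward(self, seq_indices):
        x = self.embed(seq_indices)   # (batch, length, embed_dim)
        h, _ = self.rnn(x)            # (batch, length, 2*hidden_dim)
        return self.head(h)           # (batch, length, 2)

# Toy forward pass on a random sequence of 50 residues.
model = TorsionRNN()
dummy_seq = torch.randint(0, 20, (1, 50))
angles = model(dummy_seq)  # shape: (1, 50, 2)
```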
233 Effect of the Applied Bias on Mini-Band Structures in Dimer Fibonacci InAs/Ga1-XInXAs Superlattices
Authors: Z. Aziz, S. Terkhi, Y. Sefir, R. Djelti, S. Bentata
Abstract:
The effect of a uniform electric field across multi-barrier systems (InAs/InxGa1-xAs) is exhaustively explored by a computational model using the exact Airy function formalism and the transfer-matrix technique. In the case of the biased DFHBSL structure, a strong reduction in the transmission properties was observed, and the width of the mini-band structure decreases linearly with increasing applied bias. This is due to the confinement of the states in the mini-band structure, which becomes increasingly important (Wannier-Stark effect).
Keywords: dimer fibonacci height barrier superlattices, singular extended state, exact Airy function and transfer matrix formalism, bioinformatics
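For context, in the transfer-matrix technique the whole multi-barrier stack is described by the ordered product of the matrices of its individual regions, and the transmission follows from one element of that product. Schematically, in its generic textbook form (not the paper's specific Airy-function matrices), and assuming identical leads on both sides:

```latex
% Total transfer matrix as an ordered product over the N regions of the stack
M = \prod_{j=1}^{N} M_j, \qquad
\begin{pmatrix} A_{\mathrm{out}} \\ B_{\mathrm{out}} \end{pmatrix}
  = M \begin{pmatrix} A_{\mathrm{in}} \\ B_{\mathrm{in}} \end{pmatrix}

% Transmission coefficient at energy E (identical input/output media)
T(E) = \frac{1}{\left| M_{11}(E) \right|^{2}}
```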
232 Deepnic, A Method to Transform Each Variable into Image for Deep Learning
Authors: Nguyen J. M., Lucas G., Brunner M., Ruan S., Antonioli D.
Abstract:
Deep learning based on convolutional neural networks (CNNs) is a very powerful technique for classifying information from an image. We propose a new method, DeepNic, to transform each variable of a tabular dataset into an image in which each pixel represents a set of conditions that allow the variable to make an error-free prediction. The contrast of each pixel is proportional to its prediction performance, and the color of each pixel corresponds to a sub-family of NICs. NICs are probabilities that depend on the number of inputs to each neuron and the range of coefficients of the inputs. Each variable can therefore be expressed as a function of a matrix of two vectors corresponding to an image whose pixels express predictive capabilities. Our objective is to transform each variable of tabular data into an image that can be analysed by CNNs, unlike other methods which use all the variables to construct an image. We analyse the NIC information of each variable and express it as a function of the number of neurons and the range of coefficients used. The predictive value and the category of the NIC are expressed by the contrast and the color of the pixel. We have developed a pipeline to implement this technology and have successfully applied it to genomic expressions on an Affymetrix chip.
Keywords: tabular data, deep learning, perfect trees, NICS
231 Target and Biomarker Identification Platform to Design New Drugs against Aging and Age-Related Diseases
Authors: Peter Fedichev
Abstract:
We studied fundamental aspects of aging to develop a mathematical model of the gene regulatory network. We show that aging manifests itself as an inherent instability of the gene network, leading to an exponential accumulation of regulatory errors with age. To validate our approach, we studied age-dependent omics data, such as transcriptomes and metabolomes, of different model organisms and humans. We built a computational platform based on our model to identify targets and biomarkers of aging in order to design new drugs against aging and age-related diseases. As biomarkers of aging, we choose the rate of aging and the biological age, since they completely determine the state of the organism. Since the rate of aging changes rapidly in response to an external stress, this kind of biomarker can be useful as a tool for quantitative efficacy assessment of drugs and their combinations, dose optimization, chronic toxicity estimation, selection of personalized therapies, achievement of clinical endpoints (within clinical research), and death risk assessment. Based on our model, we propose a method of target identification for further interventions against aging and age-related diseases. Being a biotech company, we offer a complete pipeline to develop an anti-aging drug candidate.
Keywords: aging, longevity, biomarkers, senescence
230 New Bio-Strategies for Ochratoxin a Detoxification Using Lactic Acid Bacteria
Authors: José Maria, Vânia Laranjo, Luís Abrunhosa, António Inês
Abstract:
The occurrence of mycotoxigenic moulds such as Aspergillus, Penicillium and Fusarium in food and feed has an important impact on public health through the appearance of acute and chronic mycotoxicoses in humans and animals, which is more severe in developing countries due to lack of food security, poverty and malnutrition. This mould contamination also constitutes a major economic problem due to the loss of crop production. A great variety of filamentous fungi are able to produce highly toxic secondary metabolites known as mycotoxins. Most mycotoxins are carcinogenic, mutagenic, neurotoxic and immunosuppressive, with ochratoxin A (OTA) being one of the most important. OTA is toxic to animals and humans, mainly due to its nephrotoxic properties. Several approaches have been developed for the decontamination of mycotoxins in foods, such as prevention of contamination, biodegradation of mycotoxin-containing food and feed with microorganisms or enzymes, and inhibition or absorption of the mycotoxin content of consumed food in the digestive tract. A group of Gram-positive bacteria named lactic acid bacteria (LAB) are able to release molecules that can influence mould growth, improving the shelf life of many fermented products and reducing health risks due to exposure to mycotoxins, and some LAB are capable of mycotoxin detoxification. Recently, our group was the first to describe the ability of LAB strains to biodegrade OTA, more specifically Pediococcus parvulus strains isolated from Douro wines. The pathway of this biodegradation had been identified previously in other microorganisms: OTA can be degraded through the hydrolysis of the amide bond that links the L-β-phenylalanine molecule to ochratoxin alpha (OTα), a non-toxic compound. It is known that some peptidases from different origins can mediate this hydrolysis reaction, such as carboxypeptidase A (an enzyme from the bovine pancreas), a commercial lipase and several commercial proteases. We therefore wanted to gain a better understanding of this OTA degradation process when LAB are involved and to identify which molecules are present in it. To achieve our aim, we used bioinformatics tools (BLAST, CLUSTALX2, CLC Sequence Viewer 7, Finch TV). We also designed specific primers and performed gene-specific PCR. The template DNA came from LAB strain samples from our previous work and from other LAB strains isolated from elderberry fruit, silage, milk and sausages. Through the use of bioinformatics tools, it was possible to identify several proteins belonging to the carboxypeptidase family that participate in the process of OTA degradation, such as serine-type D-Ala-D-Ala carboxypeptidase and membrane carboxypeptidase. In conclusion, this work identified carboxypeptidase proteins as molecules present in the OTA degradation process when LAB are involved.
Keywords: carboxypeptidase, lactic acid bacteria, mycotoxins, ochratoxin A
229 Domain Adaptation Save Lives - Drowning Detection in Swimming Pool Scene Based on YOLOV8 Improved by Gaussian Poisson Generative Adversarial Network Augmentation
Authors: Simiao Ren, En Wei
Abstract:
Drowning is a significant safety issue worldwide, and a robust computer vision-based alert system can easily prevent such tragedies in swimming pools. However, due to the domain shift caused by the visual gap (potentially due to lighting, indoor scene change, pool floor color, etc.) between the training swimming pool and the test swimming pool, the robustness of such algorithms has been questionable. The annotation cost for labeling each new swimming pool is too expensive for mass adoption of such a technique. To address this issue, we propose a domain-aware data augmentation pipeline based on the Gaussian Poisson Generative Adversarial Network (GP-GAN). Combined with YOLOv8, we demonstrate that such a domain adaptation technique can significantly improve model performance (from 0.24 mAP to 0.82 mAP) on new test scenes. As the augmentation method only requires background imagery from the new domain (no annotation needed), we believe this is a promising, practical route for preventing swimming pool drowning.
Keywords: computer vision, deep learning, YOLOv8, detection, swimming pool, drowning, domain adaptation, generative adversarial network, GAN, GP-GAN
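A minimal sketch of the detector-training side of such a pipeline using the ultralytics YOLOv8 API is shown below; the dataset YAML path and training settings are placeholders, and the GP-GAN step that composites annotated foregrounds onto new-domain backgrounds is only indicated as a comment, since its implementation is not described here.

```python
from ultralytics import YOLO

# Assumed workflow: GP-GAN is run offline to blend labeled swimmer/drowning
# foregrounds onto unannotated background images from the new pool (new domain),
# producing an augmented dataset referenced by the YAML file below.
model = YOLO("yolov8n.pt")               # pretrained YOLOv8 nano weights
model.train(data="pool_augmented.yaml",  # hypothetical dataset config
            epochs=100, imgsz=640)
metrics = model.val()                    # evaluate on the new test scene
print(metrics.box.map)                   # mAP50-95 on the held-out pool
```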
228 An Accurate Brain Tumor Segmentation for High Graded Glioma Using Deep Learning
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
Gliomas are among the most challenging and aggressive types of tumors, appearing in different sizes and locations and with scattered boundaries. The CNN is an efficient deep learning approach with an outstanding capability for solving image analysis problems. A fully automatic deep learning based 2D-CNN model for brain tumor segmentation is presented in this paper. We used small convolution filters (3 x 3) to make the architecture deeper, and increased the number of convolutional layers for efficient learning of complex features from a large dataset. We achieved better results by pushing the depth up to 16 convolutional layers for the HGG model, and obtained reliable and accurate results through fine-tuning of the dataset and hyper-parameters. Pre-processing for this model includes generation of the brain pipeline, intensity normalization, bias correction and data augmentation. We used the BRATS-2015 dataset, and the Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.81 for the complete, 0.79 for the core, and 0.80 for the enhanced tumor regions. These results are comparable with methods that have already implemented 2D CNN architectures.
Keywords: brain tumor segmentation, convolutional neural networks, deep learning, HGG
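The evaluation metric quoted above is the Dice Similarity Coefficient, DSC = 2|A∩B| / (|A| + |B|). A small NumPy sketch of how it is typically computed for binary segmentation masks (the toy masks below are illustrative, not BRATS data):

```python
import numpy as np

def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    """Dice Similarity Coefficient between two binary masks (values 0/1)."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

# Toy example with two 4x4 masks.
pred = np.array([[0, 1, 1, 0]] * 4)
true = np.array([[0, 1, 0, 0]] * 4)
print(round(dice_coefficient(pred, true), 3))  # 2*4 / (8 + 4) ≈ 0.667
```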
227 Aspects Regarding the Structural Behaviour of Autonomous Underwater Vehicle for Emergency Response
Authors: Lucian Stefanita Grigore, Damian Gorgoteanu, Cristian Molder, Amado Stefan, Daniel Constantin
Abstract:
The purpose of this article is to present an analytical-numerical study of the structural behavior of a submerged autonomous underwater vehicle (AUV) for emergency intervention. The need for such a study was generated by the key objective of the ERL-Emergency project. The project aims to develop a system of collaborative robots for emergency response. The system consists of two robots: an unmanned ground vehicle (UGV) on tracks and an AUV. The system of collaborative robots, AUV and UGV, will be used to perform monitoring, intervention, and rescue missions. The main mission of the AUV is to dive into the maritime space of an industrial port to detect possible leaks in a pipeline transporting petroleum products. Another mission is to close and open the valves with which the pipes are provided. Finally, the AUV must be able to lift a manikin to the surface, from where it can be taken to land. Numerical analysis was performed by the finite element method (FEM). The conditions for immersing the AUV at 100 m depth were simulated, and the calculations were repeated for different fluid flow rates. From a structural point of view, the stiffening areas and the enclosures housing the command-and-control elements and the accumulators were analyzed in particular. The conclusion of this research is that the AUV meets the established requirements very well.
Keywords: analytical-numerical, emergency, FEM, robotics, underwater
226 Multi-Environment Quantitative Trait Loci Mapping for Grain Iron and Zinc Content Using Bi-Parental Recombinant Inbred Lines in Pearl Millet
Authors: Tripti Singhal, C. Tara Satyavathi, S. P. Singh, Aruna Kumar, Mukesh Sankar S., C. Bhardwaj, Mallik M., Jayant Bhat, N. Anuradha, Nirupma Singh
Abstract:
Pearl millet is a climate-resilient, nutritious crop. We report QTLs for grain iron and zinc content from three divergent locations. The content of grain Fe in the RILs ranged between 36 and 114 mg/kg, and that of Zn from 20 to 106 mg/kg, across three years at the three locations (Delhi, Dharwad, and Jodhpur). We used SSRs to generate a linkage map using 210 F₆ RILs derived from the (PPMI 683 × PPMI 627) cross. The linkage map of 151 loci was 3403.6 cM in length. QTL analysis revealed a total of 22 QTLs for both traits across all locations. Within the QTLs, candidate genes were identified using bioinformatics approaches.
Keywords: yield, pearl millet, QTL mapping, multi-environment, RILs
225 Control of Pipeline Gas Quality to Extend Gas Turbine Life
Authors: Peter J. H. Carnell, Panayiotis Theophanous
Abstract:
Natural gas, due to its cleaner combustion characteristics, is expected to be the most widely used fuel in the move towards less polluting and renewable energy sources. Thus, the developed world is supplied by a complex network of gas pipelines, and natural gas is becoming a major source of fuel. Natural gas delivered directly from the well will differ in composition from gas derived from LNG or produced by anaerobic digestion processes. Each will also have specific contaminants and properties, although gas from all sources is likely to enter the distribution system and be blended to provide the desired characteristics, such as Higher Heating Value and Wobbe number. The absence of a standard gas composition poses problems when the gas is used as a chemical feedstock, in specialised furnaces or on gas turbines. The chemical industry has suffered in the past as a result of variable gas composition. Transition metal catalysts used in ammonia, methanol and hydrogen plants were easily poisoned by sulphur, chlorides and mercury, reducing both activity and expected catalyst lives from years to months. These plants now concentrate on purification and conditioning of the natural gas feed using fixed bed technologies, allowing them to run for several years and transforming their operations. Similar technologies can be applied to the power industry, reducing maintenance requirements and extending the operating life of gas turbines.
Keywords: gas composition, gas conditioning, gas turbines, power generation, purification
224 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. To process a real-time video feed with 24 frames per second, the algorithm should take considerably less than 41 milliseconds per frame, and even less for a video with 30 or 60 frames per second. The paper presents a fast and efficient solution which has two main advantages: it has the potential to be used on a real-time video feed, and it can be used in low-compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the enhanced images using different methods is shown.
Keywords: low light image enhancement, real-time video, computer vision, machine learning
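A minimal sketch of such a three-step pipeline, assuming a gamma curve for the RGB mapping, gray-world white balance for the color step, and CLAHE for the contrast step; the paper's exact functions and parameters are not specified here, so these choices are illustrative assumptions.

```python
import cv2
import numpy as np

def enhance_low_light(bgr, gamma=0.5):
    """Illustrative three-step enhancement: RGB mapping, color balance, contrast."""
    # Step 1: simple per-pixel mapping of input RGB values (assumed gamma curve).
    mapped = np.clip(((bgr / 255.0) ** gamma) * 255.0, 0, 255).astype(np.uint8)

    # Step 2: gray-world color balance: scale each channel to the global mean.
    means = mapped.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(mapped * (means.mean() / means), 0, 255).astype(np.uint8)

    # Step 3: contrast adjustment on the luminance channel (CLAHE as one option).
    lab = cv2.cvtColor(balanced, cv2.COLOR_BGR2LAB)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

frame = cv2.imread("dark_frame.png")  # hypothetical low light input
if frame is not None:
    enhanced = enhance_low_light(frame)
```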
223 Production Sharing Contracts Transparency Simulation
Authors: Chariton Christou, David Cornwell
Abstract:
The Production Sharing Contract (PSC) is a type of contract that is widely used today. The financial crisis has made governments tight-fisted, and they do not have the resources to participate in the development of a field; therefore, more and more countries introduce PSCs. The companies have the power and the money to develop the field in their own way. The main problem is the transparency of oil and gas companies, especially under PSCs, and how it can be achieved; much discussion has taken place, especially in the U.K. What we are suggesting is a dynamic financial simulation with the help of a flow meter. The flow meter, installed in a pipeline, will measure the production of each field every day, and this production will be the basic input of the simulation. The simulation will compute the profit, the costs and more from the flow meter information, and it will include the terms of the contract and the costs that have been paid. From all these parameters, the simulation will be able to present the information of a field (taxes, employees, R-factor) in real time. Through this simulation, the company will share some information with the government, but not all of it: the government will know the taxes that should be paid and what its sharing percentage is, while all other information can remain confidential to the company. Furthermore, the oil company could control the R-factor by changing the production each day to maximize its sharing percentages and, as a result, its profit. This idea aims to change the way that governments 'control' oil companies and bring an evolution in transparency to the industry. With the help of a simulation, every country could work alongside the company and achieve better collaboration.
Keywords: production sharing contracts, transparency, simulation
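As an illustration of the daily accounting loop such a simulation could run, here is a minimal sketch assuming the common definition of the R-factor as cumulative contractor revenue divided by cumulative contractor cost, with an assumed sliding profit-share schedule; in practice the thresholds, shares and cost treatment would come from the specific PSC terms.

```python
def run_psc_simulation(daily_production, oil_price, daily_cost, share_schedule):
    """Daily PSC accounting driven by flow meter readings.

    daily_production: list of barrels per day from the flow meter.
    share_schedule: list of (r_factor_threshold, contractor_share) pairs,
                    checked in order, e.g. [(1.0, 0.6), (2.0, 0.5), (inf, 0.4)].
    """
    cum_revenue, cum_cost = 0.0, 0.0
    for barrels in daily_production:
        cum_cost += daily_cost
        cum_revenue += barrels * oil_price
        r_factor = cum_revenue / cum_cost if cum_cost > 0 else 0.0
        # Contractor share slides down as the R-factor grows.
        contractor_share = next(share for threshold, share in share_schedule
                                if r_factor <= threshold)
        yield {"r_factor": round(r_factor, 3),
               "contractor_share": contractor_share}

schedule = [(1.0, 0.6), (2.0, 0.5), (float("inf"), 0.4)]  # assumed contract terms
for day in run_psc_simulation([1000, 1200, 900], oil_price=70.0,
                              daily_cost=40000.0, share_schedule=schedule):
    print(day)
```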
222 Field Deployment of Corrosion Inhibitor Developed for Sour Oil and Gas Carbon Steel Pipelines
Authors: Jeremy Moloney
Abstract:
A major oil and gas operator in western Canada producing approximately 50,000 BOE per day of sour fluids was experiencing increased water production along with decreased oil production over several years. The higher water volumes being produced meant an increase in the operator's incumbent corrosion inhibitor (CI) chemical requirements, but with reduced oil production revenues. Thus, a cost-effective corrosion inhibitor solution was sought to deliver enhanced corrosion mitigation of the carbon steel pipeline infrastructure at reduced chemical injection dose rates. This paper presents the laboratory work conducted on the development of a corrosion inhibitor under the operator's simulated sour operating conditions and the subsequent field testing of the product. The new CI not only provided extremely good levels of general and localized corrosion inhibition and outperformed the incumbent CI under the laboratory test conditions, but did so at vastly lower concentrations. In turn, the novel CI product enabled field chemical injection rates to be optimized and reduced by 40% compared with the incumbent whilst maintaining superior corrosion protection, resulting in significant cost savings and associated sustainability benefits for the operator.
Keywords: carbon steel, sour gas, hydrogen sulphide, localized corrosion, pitting, corrosion inhibitor
221 Investigation of Optimal Parameter Settings in Super Duplex Stainless Steel Welding
Authors: R. M. Chandima Ratnayake, Daniel Dyakov
Abstract:
Super duplex steel materials play a vital role in the construction and fabrication of structural, piping and pipeline components. They make it possible to minimize life cycle costs while assuring the integrity of onshore and offshore operating systems. In this context, duplex stainless steel (DSS) welding in construction and fabrication plays a significant role in maintaining and assuring integrity at an optimal expenditure over the life cycle of production and process systems as well as the associated structures. In DSS welding, factors such as gap geometry, shielding gas supply rate, welding current, and the type of welding process play a vital role in the final joint performance. Hence, an experimental investigation has been performed using the engineering robust design approach (ERDA) to investigate the optimal settings that generate optimal super DSS (i.e. UNS S32750) joint performance. This manuscript illustrates the mathematical approach, the experimental design, the optimal parameter settings and the results of the verification experiment.
Keywords: duplex stainless steel welding, engineering robust design, mathematical framework, optimal parameter settings
220 Non-Centrifugal Cane Sugar Production: Heat Transfer Study to Optimize the Use of Energy
Authors: Fabian Velasquez, John Espitia, Henry Hernadez, Sebastian Escobar, Jader Rodriguez
Abstract:
Non-centrifugal cane sugar (NCS) is a concentrated product obtained through the evaporation of the water content of sugarcane juice in open heat exchangers (OE). The heat supplied to the evaporation stages is obtained from cane bagasse through the thermochemical process of combustion, and the thermal energy released is transferred to the OE by the flue gas. Therefore, the optimization of energy usage becomes essential for the proper design of the production process. To optimize energy use, it is necessary to model and simulate the heat transfer between the combustion gases and the juice and to understand the major mechanisms involved. The main objective of this work was to simulate the heat transfer phenomena between the flue gas and the open heat exchangers using a Computational Fluid Dynamics (CFD) model. The simulation results were compared to field measured data. The numerical results for the temperature profile along the flue gas pipeline at the measurement points are in good agreement with the field measurements. Thus, this study could be of special interest for the design of the NCS production process and the optimization of energy use.
Keywords: mathematical modeling, design variables, computational fluid dynamics, overall thermal efficiency
219 Comprehensive Critical Review for Static and Dynamic Soil-Structure Interaction Between Winkler, Pasternak and Three-Dimensional Method of Buried Pipelines
Authors: N. E. Sam, S. R. Singh
Abstract:
Pipeline infrastructure is a valuable asset to a country; it helps transport fluids and gas from one place to another and contributes to keeping the country functioning both physically and economically. During seismic activity, additional loads act on buried pipelines, making them a salient subject of study in soil-pipe interaction. The Winkler beam theory is a commonly used approach for the design of underground buried structures; however, this theory does not take shear and dynamic loading parameters into account. Shear can be addressed in the Pasternak theory, an improved model of the Winkler theory; however, dynamic loading conditions and horizontal displacement are not considered in either method. In this paper, a comprehensive critical review of the Winkler beam method, the Pasternak method and the three-dimensional method in finite element analysis is carried out for seismic forces. The influence of soil depth and displacement in relation to the stiffness value, and the influence of horizontal displacement on the design of underground structures, are considered.
Keywords: finite element, pasternak theory, seismic, soil-structure interaction, three-dimensional theory, winkler theory
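For reference, the two foundation models compared in the review are commonly written for a beam (or pipe) of flexural rigidity EI with deflection w(x) resting on soil of subgrade modulus k under a distributed load q(x); the Pasternak model adds a shear layer of modulus G_p that couples the springs. These are the standard textbook forms, not equations taken from the paper itself:

```latex
% Winkler foundation: independent springs under the pipe
EI\,\frac{d^{4}w(x)}{dx^{4}} + k\,w(x) = q(x)

% Pasternak foundation: springs coupled by a shear layer of modulus G_p
EI\,\frac{d^{4}w(x)}{dx^{4}} - G_{p}\,\frac{d^{2}w(x)}{dx^{2}} + k\,w(x) = q(x)
```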
218 Numerical Simulation of Natural Gas Dispersion from Low Pressure Pipelines
Authors: Omid Adibi, Nategheh Najafpour, Bijan Farhanieh, Hossein Afshin
Abstract:
Gas release from pipelines is one of the main factors in gas industry accidents. The released gas ejects from the pipeline as a free jet and, as the jet grows, the fuel mixes with the ambient air. Accordingly, an accidental spark will release the chemical energy of the mixture in an explosion. A gas explosion damages equipment and endangers the lives of staff. Thus, given the importance of safety in the gas industry, predicting such accidents can reduce the number of casualties. In this paper, natural gas leakages from low pressure pipelines are studied in two steps: 1) simulation of the mixing process and identification of the flammable zones, and 2) simulation of wind effects on the mixing process. The numerical simulations were performed using the finite volume method and a pressure-based algorithm, and a structured method was used for grid generation. The results show that, just 6.4 s after the accident, the released natural gas could penetrate 40 m in the vertical and 20 m in the horizontal direction. Moreover, the results show that the wind speed is a key factor in the dispersion process; in fact, the wind transports the flammable zones downstream. Hence, to improve the safety of people and property, it is preferable to construct gas facilities and buildings on the side opposite to the prevailing wind direction.
Keywords: flammable zones, gas pipelines, numerical simulation, wind effects
217 Rethinking Nigeria's Foreign Policy in the Age of Global Terrorism
Authors: Shuaibu Umar Abdul
Abstract:
This paper examines Nigeria's foreign policy in the age of global terrorism. It is worth noting that the threat of terrorism is not peculiar to Western and Middle Eastern countries alone; its tentacles are now spreading everywhere, Africa included. The issue of domestic terrorism in Nigeria has become pervasive since the return of democratic rule in 1999. This development had never been witnessed in any form throughout Nigeria's years of statehood; the issues of banditry, armed robbery, ritual killing, criminal activities such as kidnapping and pipeline vandalization, the breakdown of law and order, poorly managed infrastructural facilities and corruption remain synonymous with Nigeria. These acts of terrorism have no doubt constituted a challenge that necessitates a paradigm shift in Nigeria's foreign policy. The study employed a conceptual framework of analysis to guide the interrogation; secondary sources were used to generate data, while descriptive and content analysis were used for data presentation and interpretation. In view of the interrogation and discussion of the subject matter, the paper revealed that the Nigerian government underrated and underestimated the strength of terrorism within and outside its policy; hence, it has become difficult to address. In response to the findings and conclusions of the study, the paper recommends, among other things, that Nigeria's foreign policy be rethought, reshaped and remodeled in cognizance of rising global terrorism, for peace, growth and development in the country.
Keywords: foreign policy, globe, Nigeria, rethinking, terrorism
216 Results of the Field-and-Scientific Study in the Water Area of the Estuaries of the Major Rivers of the Black Sea and Sea Ports on the Territory of Georgia
Authors: Ana Gavardashvili
Abstract:
Field and scientific studies to evaluate the current ecological state of the water area of the estuaries of the major water-abundant rivers along the Black Sea coast (Chorokhi, Kintrishi, Natanebi, Supsa, Khobistskali, Rioni and Enguri), the sea ports (Batumi, Poti) and the sea terminals of the oil pipelines (Baku-Tbilisi-Supsa, Kulevi) were carried out in June and July 2015. GPS coordinates and GIS programs were used to fix the areas of the estuaries of the above-listed rivers on a digital map, with values varying between 0.861 and 20.390 km². Water samples from the Black Sea were taken from the river estuaries and sea ports during the field work, forming a statistical series of 125 points. The temperatures of the air (t2) and of the Black Sea water (t1) were measured locally, and their ratio is t1/t2 = 0.69 - 0.92. The 125 water samples taken along the Black Sea coastline were subjected to laboratory analysis, and it was established that the acidity (pH) of the Black Sea varies between 7.71 and 8.22 in the river estuaries and between 8.42 and 8.65 in the port water areas and at the oil terminals. The sea water salinity index (TDS) varies between 6.15 and 12.67 in the river estuaries and between 11.80 and 13.67 in the port water areas and at the oil terminals. At the following stage, by taking the collected data and climatic changes into account and using the theories of reliability and risk, the nature of the changes in the Black Sea's ecological parameters will be established.
Keywords: acidity, estuary, salinity, sea
215 Development of Numerical Model to Compute Water Hammer Transients in Pipe Flow
Authors: Jae-Young Lee, Woo-Young Jung, Myeong-Jun Nam
Abstract:
Water hammer is a hydraulic transient problem that is commonly encountered in the penstocks of hydropower plants. A numerical model was developed to estimate the transient behavior of pressure waves in pipe systems. A computational algorithm is proposed to model the water hammer phenomenon in a pipe system with a pump shutdown at midstream and a sudden valve closure downstream. To predict the pressure head and flow velocity as functions of time resulting from rapid valve closure and pump shutdown, the two boundary conditions at the ends, representing pump operation and valve control, are implemented as specified equations for the pressure head and flow velocity based on the method of characteristics. It was shown that transient flow effects determine the need for protection devices, such as surge tanks, surge relief valves, or air valves, at various points in the system to guard against overpressure and low pressure. The proposed transient model produced reasonably good performance for pipeline systems and can be used as an efficient tool for the safety assessment of hydropower plants subject to water hammer.
Keywords: water hammer, hydraulic transient, pipe systems, characteristics method
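For reference, the method of characteristics reduces the water hammer continuity and momentum equations to two compatibility equations integrated along the C+ and C- characteristic lines dx/dt = ±a. In a common textbook form (not the paper's specific discretization), with H the head, Q the discharge, a the wave speed, A the pipe area, D the diameter, f the friction factor and Δx the reach length:

```latex
% C+ characteristic (along dx/dt = +a), from the upstream node A to point P
H_P = H_A - \frac{a}{gA}\,(Q_P - Q_A) - \frac{f\,\Delta x}{2\,g\,D\,A^{2}}\,Q_A\,|Q_A|

% C- characteristic (along dx/dt = -a), from the downstream node B to point P
H_P = H_B + \frac{a}{gA}\,(Q_P - Q_B) + \frac{f\,\Delta x}{2\,g\,D\,A^{2}}\,Q_B\,|Q_B|
```

The pump and valve boundary conditions mentioned in the abstract replace one of these two equations at each end of the pipeline.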
214 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices
Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner
Abstract:
Biometric tools such as fingerprint and iris recognition are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than the signals extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG signals extracted from an off-person device, i.e. a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and different situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In preliminary results, the binary classifier obtained 90% accuracy on balanced data, and the multi-class approach reported a log loss of 0.05.
Keywords: biometrics, electrocardiographic, machine learning, signals processing
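A minimal sketch of the two classification set-ups on pre-extracted ECG feature vectors, using scikit-learn; the synthetic features, number of users and model choice below are placeholders, since the paper does not specify them here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, log_loss
from sklearn.model_selection import train_test_split

# Placeholder data: 600 heartbeat feature vectors (e.g. per-beat morphology
# features) for 6 hypothetical users labelled 0..5.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 40))
y = np.repeat(np.arange(6), 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)

# Approach 1: per-person binary classifier (user 0 vs. everyone else).
binary = RandomForestClassifier(random_state=0).fit(X_tr, y_tr == 0)
print("binary accuracy:", accuracy_score(y_te == 0, binary.predict(X_te)))

# Approach 2: one-for-all multi-class classifier over the selected users.
multi = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("multi-class log loss:", log_loss(y_te, multi.predict_proba(X_te)))
```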