Search results for: data recovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26254

25474 Data Integration with Geographic Information System Tools for Rural Environmental Monitoring

Authors: Tamas Jancso, Andrea Podor, Eva Nagyne Hajnal, Peter Udvardy, Gabor Nagy, Attila Varga, Meng Qingyan

Abstract:

The paper deals with the conditions and circumstances of integrating remotely sensed data for rural environmental monitoring purposes. The main task is to make decisions during the integration process when the data sources differ in resolution, location, spectral channels, and dimension. In order to have exact knowledge about the integration and data fusion possibilities, it is necessary to know the properties (metadata) that characterize the data. The paper explains the joining of these data sources using their attribute data through a sample project. The resulting product will be used for rural environmental analysis.

Keywords: remote sensing, GIS, metadata, integration, environmental analysis

Procedia PDF Downloads 113
25473 Exploring Salient Shifts and Transdiagnostic Factors in Eating Disordered Women

Authors: Francesca Favero, Despina Learmonth

Abstract:

Carbohydrate addiction is said to be the sustained dependence on hyperpalatable foods rich in carbohydrates and sugar. This addiction manifests in increased consumption of carbohydrates through binging: a behaviour typically associated with eating disorders. There is a lack of consensus amongst relevant experts as to whether carbohydrates are physiologically or psychologically addictive. With an increased focus on carbohydrate addiction, an outpatient treatment programme, HELP, has been established in Cape Town, South Africa, to specifically address this issue. This research aimed to explore, pre- and post-intervention, the possible presence of, and subsequent shifts in, the maintaining mechanisms identified in the transdiagnostic model for eating disorders. However, the potential for the emergence of other perpetuating factors was not discounted, and the nature of the analysis allowed for this possibility. Eight women between the ages of twenty-two and fifty, who had completed the outpatient treatment programme in the last six months, were interviewed. They were asked to speak retrospectively about their personal difficulties, eating and food, and their experience of the treatment. Thematic analysis was employed to identify themes arising from the data. Five themes congruent with the transdiagnostic model's factors emerged: over-evaluation of weight and shape, core low self-esteem, interpersonal difficulties, clinical perfectionism, and mood intolerance. A variety of sub-themes, elaborating upon the various ways in which the disordered eating was maintained, also emerged from the data. Shifts in these maintaining mechanisms were identified. Although not necessarily indicative of recovery, the results suggest that the outpatient HELP programme had a positive overall influence on the participants, and that the transdiagnostic model may be useful in understanding and guiding the treatment of clients who engage in this type of treatment programme.

Keywords: eating disorders, binge eating disorder, carbohydrate addiction, transdiagnostic model, maintaining mechanisms, thematic analysis, outpatient treatment

Procedia PDF Downloads 315
25472 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data are produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year by 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data: storing the data, searching for information, and uncovering hidden information. It is therefore necessary to develop an analysis platform for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and forms the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
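As an illustration of the MapReduce-plus-fuzzy-logic idea in this abstract (a minimal single-machine sketch, not the authors' implementation; the reads and thresholds are invented), k-mer counting can be expressed as map and reduce steps, with a fuzzy membership function grading each k-mer's abundance:

```python
from collections import defaultdict

def map_kmers(read, k=3):
    """Map step: emit (k-mer, 1) pairs from one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reduce_counts(pairs):
    """Reduce step: sum the counts emitted for each k-mer."""
    counts = defaultdict(int)
    for kmer, n in pairs:
        counts[kmer] += n
    return dict(counts)

def fuzzy_abundant(count, low=1, high=4):
    """Fuzzy membership degree for 'abundant': 0 below low, 1 above high,
    linear ramp in between (instead of a crisp threshold)."""
    if count <= low:
        return 0.0
    if count >= high:
        return 1.0
    return (count - low) / (high - low)

reads = ["ACGTAC", "CGTACG"]   # toy reads standing in for NGS output
counts = reduce_counts(p for r in reads for p in map_kmers(r))
membership = {kmer: fuzzy_abundant(n) for kmer, n in counts.items()}
```

In a real Hadoop deployment the map and reduce functions would run as distributed tasks over sharded read files; the fuzzy grading replaces hard cutoffs when classifying k-mers for assembly or similarity search.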

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 296
25471 Forthcoming Big Data on Smart Buildings and Cities: An Experimental Study on Correlations among Urban Data

Authors: Yu-Mi Song, Sung-Ah Kim, Dongyoun Shin

Abstract:

Cities are complex systems of diverse and intertangled activities. These activities and their complex interrelationships create diverse urban phenomena, which in turn have considerable influence on the lives of citizens. This research aimed to develop a method to reveal the causes and effects among diverse urban elements in order to enable a better understanding of urban activities and, from that, better urban planning strategies. Specifically, this study was conducted to solve a data-recommendation problem found on a Korean public data homepage. First, a correlation analysis was conducted to find the correlations among random urban datasets. Then, based on the results of that correlation analysis, a weighted data network for each urban dataset was provided to users. It is expected that the weights of urban data thereby obtained will provide insights into cities and show how diverse urban activities influence each other and induce feedback.
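The two steps described here, pairwise correlation followed by a weighted data network, can be sketched as follows (the dataset names and values are invented placeholders, not the Korean public data used in the study):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical district-level urban series (illustrative only).
urban_data = {
    "population":  [10, 20, 30, 40, 50],
    "traffic":     [12, 19, 33, 41, 48],
    "green_space": [50, 42, 31, 22, 11],
}

# Weighted network: one edge per dataset pair, weight = |correlation|,
# so strongly related datasets (positively or negatively) rank first.
names = list(urban_data)
edges = {
    (a, b): abs(pearson(urban_data[a], urban_data[b]))
    for i, a in enumerate(names) for b in names[i + 1:]
}
ranked = sorted(edges, key=edges.get, reverse=True)
```

The `ranked` edge list is what a data-recommendation feature could surface: given one dataset, suggest its most strongly correlated neighbours in the network.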

Keywords: big data, machine learning, ontology model, urban data model

Procedia PDF Downloads 412
25470 CFD Simulation of the Urban Environment for Evaluating the Wind Energy Potential of a Building or a New Urban Plan

Authors: David Serero, Loic Couton, Jean-Denis Parisse, Robert Leroy

Abstract:

This paper presents a method for analysing airflow at the periphery of several typologies of architectural volumes. To understand the influence of the complex urban environment on airflows in the city, we compared three sites at different architectural scales. The research sets out a method to identify the optimal locations for installing wind turbines on the edges of a building and to improve the energy extracted through the precise localization of an accelerating wing called an 'aerofoil'. The objective is to define principles for the installation of wind turbines and for the natural ventilation design of buildings. Instead of a theoretical wind analysis, we combined numerical airflow simulations, using STAR-CCM+ software, with wind data recorded over long periods of time (greater than one year). While computational fluid dynamics (CFD) simulations of airflow around buildings are now common, we calibrated a virtual wind tunnel against in situ anemometer data to establish a localized cartography of urban winds. We could then develop a complete volumetric model of the behaviour of the wind over a roof area, or over an entire urban block. With this method, we can characterize: the different types of wind in urban areas, including the minimum and maximum wind spectrum; the type of harvesting device and its fixing to the roof of a building; the altimetry of the device in relation to the roof levels; and the potential nuisances in the surroundings. The study is carried out from the recovery of a geolocated data flow and the connection of this information with the technical specifications of wind turbines, their energy performance, and their cut-in speed. Thanks to this method, we can define the characteristics of wind turbines that maximize their performance on urban sites in a turbulent airflow regime. We also study the installation of a wind accelerator associated with buildings. The integrated 'aerofoils' are designed to control the speed of the air, orientate it onto the wind turbine, accelerate it, and, thanks to their profile, conceal the device on the roof of the building.

Keywords: wind energy harvesting, wind turbine selection, urban wind potential analysis, CFD simulation for architectural design

Procedia PDF Downloads 145
25469 Data-Driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in encompassing startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship

Procedia PDF Downloads 321
25468 Valorization of Sargassum: Use of Twin-Screw Extrusion to Produce Biomolecules and Biomaterials

Authors: Bauta J., Raynaud C., Vaca-Medina G., Simon V., Roully A., Vandenbossche V.

Abstract:

Sargassum is a brown alga originally found in the Sargasso Sea, in the Caribbean region and the Gulf of Mexico. The influx of Sargassum is becoming a critical environmental problem, particularly across the Caribbean islands. In Guadeloupe alone, around 80,000 tons of seaweed are stranded during the season. Since the appearance of the first waves of Sargassum, several measures have been taken to collect the algae and keep the beaches clean. Nevertheless, 90% of the collected algae are currently stored without recovery. The lack of research initiative demands a more in-depth exploration of Sargassum chemistry, targeted towards added-value applications and their development. In this context, the aim of the study was to develop a biorefinery process that simultaneously valorizes Sargassum as a source of bioactive natural substances and as a raw material for producing biomaterials. The technology used was twin-screw extrusion, which allows different unit fractionation operations to be carried out continuously in the same machine. After identification of the molecules of interest in the Sargassum algae, different operating conditions of thermo-mechanical treatment were applied in a twin-screw extruder. The nature of the solvent, the configuration of the extruder, the screw profile, and the temperature profile were studied in order to fractionate the algal biomass and to allow the recovery of a bioactive liquid fraction of interest and a solid residue suitable for the production of biomaterials. Each bioactive liquid fraction was characterized, and strategic routes for adding value were proposed. In parallel, the possibility of using the solid residue to produce biomaterials was studied through Dynamic Vapour Sorption (DVS) and basic Pressure-Volume-Temperature (PVT) analyses. The solid residue was moulded by compression cooking, and the obtained materials were characterized mechanically. The results were encouraging and open up perspectives for a worthwhile valorization of the Sargassum algae.

Keywords: seaweeds, twin-screw extrusion, fractionation, bioactive compounds, biomaterials, biomass

Procedia PDF Downloads 122
25467 Gas Lift Optimization to Improve Well Performance

Authors: Mohamed A. G. H. Abdalsadig, Amir Nourian, G. G. Nasr, Meisam Babaie

Abstract:

Gas lift optimization is becoming increasingly important in the petroleum industry nowadays. Proper lift optimization can reduce the operating cost, increase the net present value (NPV), and maximize recovery from the asset. A widely accepted definition of gas lift optimization is obtaining the maximum output under specified operating conditions. Gas lift, a costly but indispensable means of recovering oil from deep reservoirs, entails solving gas lift optimization problems. Gas lift optimization is a continuous process, and there are two levels of production optimization. Total field optimization involves optimizing the surface facilities and the injection rate, which can be achieved with standard software tools. Well-level optimization can be achieved by optimizing well parameters such as the point of injection, injection rate, and injection pressure. All these aspects have been investigated and are presented in this study using experimental data and the PROSPER simulation program. The results show that the wellhead pressure has a large influence on gas lift performance, and they also prove that a smart gas lift valve can improve gas lift performance by controlling gas injection downhole. Obtaining the optimum gas injection rate is important because excessive gas injection reduces the production rate and consequently increases the operating cost.
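The trade-off in the final sentence, where additional injection gas first lifts more oil and then hurts production, can be sketched with a hypothetical gas lift performance curve (the coefficients below are invented for illustration and are not taken from the PROSPER study):

```python
def production_rate(qg):
    """Hypothetical gas lift performance curve: oil rate (STB/d) versus
    gas injection rate qg (MMscf/d). Production rises with injection,
    then declines as excess gas adds friction back-pressure."""
    return 200.0 + 80.0 * qg - 10.0 * qg ** 2

# Grid-search the optimum injection rate over 0-8 MMscf/d in 0.01 steps,
# mimicking how a nodal-analysis tool scans the lift curve.
candidates = [i / 100 for i in range(801)]
best_qg = max(candidates, key=production_rate)
best_rate = production_rate(best_qg)
```

For this illustrative curve the optimum sits where the marginal gain of extra gas reaches zero; injecting beyond it only adds compression cost while reducing the oil rate, which is exactly the behaviour the abstract warns about.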

Keywords: optimization, production rate, reservoir pressure effect, gas injection rate effect, gas injection pressure

Procedia PDF Downloads 408
25466 Reverse Logistics End of Life Products Acquisition and Sorting

Authors: Badli Shah Mohd Yusoff, Khairur Rijal Jamaludin, Rozetta Dollah

Abstract:

The emergence of reverse logistics and product recovery management is an important concept in reconciling economic and environmental objectives by recapturing the value of end-of-life product returns. End-of-life products contain valuable modules, parts, residues, and materials that can create value if recovered efficiently. The main objective of this study is to explore and develop a model that recovers as much of the economic value as reasonably possible, finding the optimal return acquisition and sorting policy to meet demand and maximize profits over time. A benefit for the remanufacturer in this study is the development of demand forecasting for used products under uncertainty in return volumes and product quality. Formulated on the basis of a generic disassembly tree, the proposed model focuses on three reverse logistics activities, namely refurbishing, remanufacturing, and disposal, incorporating all plausible quality levels of the returns. A stricter sorting policy decreases the quantity of products to be refurbished or remanufactured and increases the quantity of discarded products. Numerical experiments were carried out to investigate the characteristics and behaviour of the proposed model, using a mathematical programming model in Lingo 16.0 for the medium-term planning of return acquisition, disassembly (refurbishing or remanufacturing), and disposal activities. Moreover, the model analyses a number of decisions relating to a trade-off management system that maximizes revenue from the collection of used products in reverse logistics services through the refurbish and remanufacture recovery options. The results showed that full utilization of the sorting process leads the system to acquire a smaller quantity of returns at minimal overall cost. Further, a sensitivity analysis provides a range of possible scenarios to consider in optimizing the overall cost of refurbished and remanufactured products.
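The core allocation decision, assigning returns of each quality grade to refurbishing, remanufacturing, or disposal, can be sketched with a simple greedy heuristic standing in for the paper's Lingo mathematical program (all margins, return volumes, and demand caps below are invented; a greedy rule is not guaranteed optimal in general):

```python
# Hypothetical per-unit margins by return quality and recovery option.
margins = {
    "high":   {"refurbish": 40, "remanufacture": 25, "dispose": -5},
    "medium": {"refurbish": 15, "remanufacture": 20, "dispose": -5},
    "low":    {"refurbish": -10, "remanufacture": 5,  "dispose": -5},
}
returns = {"high": 30, "medium": 50, "low": 20}   # acquired cores per grade
demand = {"refurbish": 40, "remanufacture": 60}   # market caps; disposal unlimited

def allocate(margins, returns, demand):
    """Greedy allocation: serve the most profitable (quality, option)
    pairs first, respecting return volumes and demand caps; anything
    left over is disposed of at a cost."""
    remaining, caps = dict(returns), dict(demand)
    plan, profit = {}, 0
    pairs = sorted(((m, q, o) for q, opts in margins.items()
                    for o, m in opts.items()), reverse=True)
    for m, q, o in pairs:
        qty = remaining[q] if o == "dispose" else min(remaining[q], caps.get(o, 0))
        if qty <= 0:
            continue
        plan[(q, o)] = qty
        profit += m * qty
        remaining[q] -= qty
        if o != "dispose":
            caps[o] -= qty
    return plan, profit

plan, profit = allocate(margins, returns, demand)
```

Tightening the sorting policy corresponds here to shrinking the refurbish/remanufacture-eligible quantities per grade, which pushes more units into the negative-margin dispose row, mirroring the trade-off the abstract reports.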

Keywords: core acquisition, end of life, reverse logistics, quality uncertainty

Procedia PDF Downloads 299
25465 Study Mercapto-Nanoscavenger as a Promising Analytical Tool

Authors: Mohammed M. Algaradah

Abstract:

A chelating mercapto-nanoscavenger has been developed, exploiting the high surface area of monodisperse nano-sized mesoporous silica. The nanoscavenger acts as a solid-phase trace-metal extractant while suspended as a quasi-stable sol in aqueous samples. This mode of extraction requires no external agitation, as the particles move naturally through the sample by Brownian motion, convection, and slow sedimentation. Careful size selection enables the nanoscavenger to be easily recovered, together with the extracted analyte, by conventional filtration or centrifugation. The research describes the successful attachment of mercapto chelating groups to ca. 136 ± 15 nm, high-surface-area (BET surface area = 1006 m2 g-1) mesoporous silica particles. The resulting material had a copper capacity of ca. 1.34 ± 0.10 mmol g-1 and was successfully applied to the collection of a trace element from water. Essentially complete recovery of Cu(II) has been achieved from freshwater samples, giving typical preconcentration factors of 100 from 50 µg/l samples. Data obtained from a nanoscavenger-based extraction of copper from samples were not significantly different from those obtained using a conventional colorimetric procedure employing complexation/solvent extraction.

Keywords: nano scavenger, mesoporous silica, trace metal, preconcentration

Procedia PDF Downloads 81
25464 Paleopalynology as an Analysis Tool to Measure the Resilience of the Ecosystems of the Western Mediterranean and Their Adaptation to Climate Change

Authors: F. Ismael Roman Moreno, Francisca Alba Sanchez

Abstract:

Over time, the plant landscape has changed as a result of numerous events on global and local scales. This is the case for Mediterranean ecosystems, among the most complex and endemism-rich on the planet, which have been subjected to anthropic pressures since the beginning of civilization. Such intervention, together with climate changes, has led to changes in diversity, tree and shrub cover, and ultimately in the structure and functioning of these ecosystems. Paleopalynology is used as a tool for the analysis of pollen and non-pollen microfossils preserved in the flooded grasslands of the Middle Atlas (Morocco). This allows the evolution of vegetation and climate to be reconstructed, as well as providing data and reasoning for different ecological, cultural, and historical processes. Although climatic and anthropic events are well documented in Europe, they are less so in North Africa, which gives added value to the study area. The results obtained serve to predict the behaviour and evolution of Mediterranean mountain ecosystems during the Holocene, their response to future changes, and their resilience and recovery from climatic and anthropic disturbances. In the stratigraphic series analysed, nine major events were detected: eight appeared to be of climatic or anthropic origin, and one, unexpectedly, was related to volcanic activity.

Keywords: anthropic, Holocene, Morocco, paleopalynology, resilience

Procedia PDF Downloads 156
25463 Municipal Action Against Urbanisation-Induced Warming: Case Studies from Jordan, Zambia, and Germany

Authors: Muna Shalan

Abstract:

Climate change is a systemic challenge for cities, with its impacts not happening in isolation but rather intertwined, thus increasing hazards and the vulnerability of the exposed population. The increase in the frequency and intensity of heat waves, for example, is associated with multiple repercussions on the quality of life of city inhabitants, including health discomfort, a rise in mortality and morbidity, increasing energy demand for cooling, and shrinking of green areas due to drought. To address the multi-faceted impact of urbanisation-induced warming, municipalities and local governments are challenged with devising strategies and implementing effective response measures. Municipalities are recognising the importance of guiding urban concepts to drive climate action in the urban environment. An example is climate proofing, which refers to a process of mainstreaming climate change into development strategies and programs, i.e., urban planning is viewed through a climate change lens. There is a multitude of interconnected aspects that are critical to paving the path toward climate-proofing of urban areas and avoiding poor planning of layouts and spatial arrangements. Navigating these aspects through an analysis of the overarching practices governing municipal planning processes, which is the focus of this research, will highlight entry points to improve procedures, methods, and data availability for optimising planning processes and municipal actions. By employing a case study approach, the research investigates how municipalities in different contexts, namely in the city of Sahab in Jordan, Chililabombwe in Zambia, and the city of Dortmund in Germany, are integrating guiding urban concepts to shrink the deficit in adaptation and mitigation and achieve climate proofing goals in their respective local contexts. 
The analysis revealed municipal strategies and measures undertaken to optimize existing building and urban design regulations by introducing key performance indicators and improving in-house capacity. Furthermore, the analysis revealed that establishing or optimising interdepartmental communication frameworks or platforms is key to strengthening the steering structures governing local climate action. The most common challenge faced by municipalities relates to their dual role as regulator and implementer, particularly in budget analysis and in instruments for cost recovery of climate action measures. By leading organisational changes related to improving procedures and methods, municipalities can mitigate the various challenges that may emanate from uncoordinated planning and thus promote action against urbanisation-induced warming.

Keywords: urbanisation-induced warming, response measures, municipal planning processes, key performance indicators, interdepartmental communication frameworks, cost recovery

Procedia PDF Downloads 65
25462 Cryptographic Protocol for Secure Cloud Storage

Authors: Luvisa Kusuma, Panji Yudha Prakasa

Abstract:

Cloud storage, a subservice of Infrastructure as a Service (IaaS) in cloud computing, is the model of networked storage in which data can be stored on servers. In this paper, we propose a secure cloud storage system consisting of two main components: the client, a user who uses the cloud storage service, and the server, which provides the cloud storage service. In this system, we propose protocol schemes to guard against security attacks during data transmission. The protocols are a login protocol, an upload-data protocol, a download protocol, and a push-data protocol, which implement a hybrid cryptographic mechanism based on encrypting data before it is sent to the cloud, so that the cloud storage provider does not know and cannot analyse the user's data, because there is no correspondence between data and user.
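The abstract does not specify its cipher suite; purely as an illustration of the client-side encrypt-before-upload idea, here is a toy encrypt-then-MAC scheme built from standard-library hashing. This is NOT production cryptography (a real system would use a vetted AEAD such as AES-GCM); it only shows the structure: the key never leaves the client, so the provider stores ciphertext it cannot read, and tampering is detected on download.

```python
import hashlib
import hmac
import secrets

def _keystream(key, nonce, length):
    """Toy keystream: SHA-256 of key || nonce || counter (illustration only)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    """Client-side encryption before upload: XOR stream + HMAC integrity tag."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def decrypt(key, nonce, ct, tag):
    """Verify integrity first, then recover the plaintext after download."""
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("authentication failed: data was modified")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)          # stays on the client, never uploaded
nonce, ct, tag = encrypt(key, b"user file contents")
```

The server only ever handles `(nonce, ct, tag)`, which carries no linkable plaintext, matching the paper's goal of "no correspondence between data and user".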

Keywords: cloud storage, security, cryptographic protocol, artificial intelligence

Procedia PDF Downloads 351
25461 Decentralized Data Marketplace Framework Using Blockchain-Based Smart Contract

Authors: Meshari Aljohani, Stephan Olariu, Ravi Mukkamala

Abstract:

Data is essential for enhancing the quality of life. Its value creates opportunities for users to profit from data sales and purchases. Users in data marketplaces, however, must share and trade data in a secure and trusted environment while maintaining their privacy. The first main contribution of this paper is to identify the enabling technologies and the challenges facing the development of decentralized data marketplaces. The second main contribution is to propose a decentralized data marketplace framework based on blockchain technology. The proposed framework enables sellers and buyers to transact with more confidence. Using a security deposit, the system implements a unique approach to enforcing honesty in data exchange among anonymous individuals. The system imposes a time window before a transaction is considered complete; within it, users can submit disputes to the arbitrators, who review them and respond with their decision. Use cases are presented to demonstrate how these technologies help data marketplaces handle issues and challenges.
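The deposit-plus-dispute-window mechanism can be sketched off-chain as a small state machine (the field names, payout strings, and amounts below are illustrative assumptions, not the paper's smart contract):

```python
class DataEscrow:
    """Toy escrow for one data sale: both parties post a deposit, and funds
    release to the seller only after the dispute window closes undisputed."""

    def __init__(self, price, deposit, window):
        self.price, self.deposit, self.window = price, deposit, window
        self.state, self.delivered_at = "open", None

    def deliver(self, now):
        """Seller hands over the data; the dispute window starts now."""
        self.state, self.delivered_at = "delivered", now

    def file_dispute(self, now):
        """Buyer may dispute only inside the window after delivery."""
        if self.state == "delivered" and now - self.delivered_at <= self.window:
            self.state = "disputed"

    def settle(self, now, arbiter_favors_buyer=False):
        """Pay out: arbitrators decide disputes; the loser forfeits its deposit."""
        if self.state == "disputed":
            return ("buyer refunded + seller deposit" if arbiter_favors_buyer
                    else "seller paid + buyer deposit")
        if self.state == "delivered" and now - self.delivered_at > self.window:
            return "seller paid, deposits returned"
        return "pending"

sale = DataEscrow(price=100, deposit=10, window=60)
sale.deliver(now=0)
```

Because a dishonest party loses its deposit when an arbitrator rules against it, both anonymous sides have a financial incentive to behave honestly, which is the enforcement idea the abstract describes.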

Keywords: blockchain, data, data marketplace, smart contract, reputation system

Procedia PDF Downloads 154
25460 Effects of Using Clinical Guidelines for Feeding through a Gastrostomy Tube in Critically Ill Surgical Patients, Songkla Hospital, Thailand

Authors: Siriporn Sikkaphun

Abstract:

Food is essential for living, and receiving correct, suitable, and adequate food is advantageous to the body, especially for patients, as it enables a good recovery. Feeding through a gastrostomy tube is one useful method that is widely used because it is easy, convenient, and economical. The objective was to compare the effectiveness of using clinical guidelines for feeding through a gastrostomy tube in critically ill surgical patients. This is a pre-post quasi-experimental study of 15 critically ill surgical or accident patients who needed intubation and a gastrostomy tube, conducted from August 2011 to November 2012. The data were collected using the guidelines and an evaluation form for the effectiveness of the guidelines for feeding through a gastrostomy tube in critically ill surgical patients. After using the guidelines, it was found that the average number of days from the admission date to the day the patients received food through the G-tube was significantly reduced at the .05 level; the number of personnel who practiced nursing activities correctly and suitably for patients with complications during feeding significantly increased at the .05 level; and the number of patients receiving energy to the target level significantly increased at the .05 level. The results of this study indicate that the use of the guidelines for feeding through a gastrostomy tube in critically ill surgical patients is feasible in practice, and the outcomes are beneficial to the patients.

Keywords: clinical guidelines, feeding, gastrostomy tube, critically ill, surgical patients

Procedia PDF Downloads 320
25459 Internal Power Recovery in Cryogenic Cooling Plants, Part II: Compressor Development

Authors: Ambra Giovannelli, Erika Maria Archilei

Abstract:

The electrical power consumption related to refrigeration systems is estimated to be on the order of 15% of total electricity consumption worldwide. For this reason, in recent years several energy-saving techniques have been suggested to reduce the power demand of refrigeration and air conditioning plants. This research work deals with the development of an innovative internal power recovery system for industrial cryogenic cooling plants. The system is based on a Compressor-Expander Group (CEG). Both the expander and the compressor have been designed starting from automotive turbocharging components, strongly modified to take the refrigerant fluid properties and the specific system requirements into consideration. A preliminary choice of the machines (radial compressors and expanders) among existing components available on the market was made according to the rules of similarity theory. Once the expander was selected, it was strongly modified and its performance verified by means of steady-state 3D CFD simulations. This paper focuses on the development of the second main CEG component: the compressor. Once the preliminary selection was done, the compressor geometry was modified to take the new boundary conditions into account. In particular, the impeller was machined to provide the required total enthalpy increase; this evaluation was carried out by means of a simplified 1D model. Moreover, a vaneless diffuser was added, modifying the shape of the casing rear and front disks. To verify the performance of the modified compressor geometry and suggest improvements, a numerical fluid dynamic model was set up, and the commercial Ansys CFX software was used to perform steady-state 3D simulations. In this work, all the numerical results will be shown, highlighting critical aspects and suggesting further developments to increase compressor performance and flexibility.
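The "required total enthalpy increase" mentioned above is typically estimated in 1D via the Euler turbomachinery work equation. A sketch with invented rotor figures (not the paper's data), assuming radial blades and zero inlet swirl so that the work reduces to the slip factor times the tip speed squared:

```python
import math

def euler_enthalpy_rise(rpm, tip_diameter_m, slip_factor=0.85):
    """1D Euler work for a centrifugal impeller with radial blades and no
    inlet swirl: dh0 = slip_factor * U2^2, with U2 the blade tip speed."""
    u2 = math.pi * tip_diameter_m * rpm / 60.0   # tip speed, m/s
    return slip_factor * u2 ** 2                 # specific work, J/kg

# Invented example figures for a small turbocharger-derived impeller.
dh0 = euler_enthalpy_rise(rpm=60000, tip_diameter_m=0.05)
```

A 1D check like this tells the designer whether a given impeller trim and speed can deliver the target stage enthalpy rise before committing to full 3D CFD runs.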

Keywords: vapour compression systems, energy saving, refrigeration plant, organic fluids, centrifugal compressor

Procedia PDF Downloads 213
25458 Preparedness is Overrated: Community Responses to Floods in a Context of (Perceived) Low Probability

Authors: Kim Anema, Matthias Max, Chris Zevenbergen

Abstract:

For any flood risk manager, the 'safety paradox' is a familiar concept: low probability leads to a sense of safety, which leads to more investment in the area, which leads to higher potential consequences, keeping the aggregated risk (probability*consequences) at the same level. It is therefore important to mitigate potential consequences apart from probability. However, when the (perceived) probability is so low that there is no recognizable trend for society to adapt to, addressing the potential consequences will always be the lagging point on the agenda. Preparedness programs fail for lack of interest and urgency, policy makers are distracted by their day-to-day business, and there is always a more urgent issue to spend the taxpayer's money on. The leading question in this study was how to address the social consequences of flooding in a context of (perceived) low probability. Disruptions of everyday urban life, large or small, can be caused by a variety of (un)expected events, of which flooding is only one. Variability like this is typically addressed with resilience, and we used the concept of community resilience as the framework for this study. Drawing on face-to-face interviews, an extensive questionnaire, and publicly available statistical data, we explored the 'whole society' response to two recent urban flood events: the Brisbane floods (Australia) in 2011 and the Dresden floods (Germany) in 2013. In Brisbane, we studied how the societal impacts of the floods were counteracted by both the authorities and the public, and in Dresden we were able to validate our findings. A large part of the reactions to these two urban flood events, both public and institutional, was not fuelled by preparedness or proper planning. Instead, the more important success factors in counteracting social impacts, such as demographic changes in neighbourhoods and (non-)economic losses, were dynamics like community action, flexibility and creativity from the authorities, leadership, informal connections, and a shared narrative. These proved to be the determining factors for the quality and speed of recovery in both cities. The resilience of the community in Brisbane was good, due to (i) the approachability of (local) authorities, (ii) a large group of 'secondary victims', and (iii) clear leadership. All three of these elements were amplified by the use of social media and/or Web 2.0 by both the communities and the authorities involved. The numerous contacts and social connections made through the web were fast, need-driven, and, in their own way, orderly. Similarly, in Dresden, large groups of 'unprepared', ad hoc organized citizens managed to work together with the authorities in a way that was effective and sped up recovery. The concept of community resilience is better suited than 'social adaptation' to dealing with the potential consequences of an (im)probable flood. Community resilience is built on capacities and dynamics that are part of everyday life and can be invested in pre-event to minimize the social impact of urban flooding. Investing in them might even have beneficial trade-offs in other policy fields.

Keywords: community resilience, disaster response, social consequences, preparedness

Procedia PDF Downloads 350
25457 Saving Energy through Scalable Architecture

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

In this paper, we focus on the importance of scalable architecture for data centers and buildings in general to help an enterprise achieve environmental sustainability. Scalable architecture helps in many ways: it adapts to business and user requirements and promotes high-availability and disaster-recovery solutions that are cost-effective and low-maintenance. Scalable architecture also plays a vital role in the three core areas of sustainability, namely the economic, environmental, and social areas, which are also known as the 3 pillars of a sustainability model. If the architecture is scalable, it has many advantages. For example, scalable architecture helps businesses and industries adapt to changing technology, drives innovation, promotes platform independence, and builds resilience against natural disasters. Most importantly, having a scalable architecture helps industries introduce cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps reduce carbon emissions through advanced monitoring and metering capabilities. Scalable architectures help reduce waste by optimizing designs to utilize materials efficiently, minimize resource use, and decrease carbon footprints by using environmentally friendly, low-impact materials. In this paper, we also emphasize the importance of a cultural shift towards the reuse and recycling of natural resources to sustain a balanced ecosystem and maintain a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied relates to data centers.

Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change

Procedia PDF Downloads 97
25456 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data-reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and, hence, its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% when using a variety of traces.
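The promote-hot-data idea above can be sketched in a few lines. Note that this is only an illustration: the paper classifies accesses with data mining models (recurrent neural networks and support vector machines, per the keywords), whereas this sketch substitutes a simple recency/frequency heuristic, and the class and function names are hypothetical.

```python
from collections import defaultdict, deque


class HotDataClassifier:
    """Toy stand-in for the paper's data-mining classifier: a block is
    predicted 'hot' (likely accessed in the near future) if it was accessed
    at least hot_threshold times within a sliding window of trace events."""

    def __init__(self, window=100, hot_threshold=3):
        self.window = window
        self.hot_threshold = hot_threshold
        self.recent = deque()           # sliding window of accessed block ids
        self.counts = defaultdict(int)  # per-block count within the window

    def record_access(self, block_id):
        self.recent.append(block_id)
        self.counts[block_id] += 1
        if len(self.recent) > self.window:
            old = self.recent.popleft()
            self.counts[old] -= 1

    def is_hot(self, block_id):
        return self.counts[block_id] >= self.hot_threshold


def migrate(trace, ssd, hdd, clf):
    """Replay a block-access trace; promote blocks predicted hot from the
    HDD tier to the SSD tier (the uppermost level of the hierarchy)."""
    for block in trace:
        clf.record_access(block)
        if clf.is_hot(block) and block in hdd:
            hdd.remove(block)
            ssd.add(block)
```

Replaying the trace `[1, 2, 1, 1, 3, 1]` with `hdd = {1, 2, 3}` promotes block 1 to the SSD set once its in-window count reaches the threshold.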

Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine

Procedia PDF Downloads 304
25455 Discussion on Big Data and One of Its Early Training Application

Authors: Fulya Gokalp Yavuz, Mark Daniel Ward

Abstract:

This study focuses on a contemporary and inevitable topic of Data Science and its exemplary application for early career building: Big Data and the Living Learning Community (LLC). Academia and industry share a common understanding of the importance of Big Data; however, both are at risk of missing out on training in this interdisciplinary area. Some traditional teaching doctrines are far from effective for Data Science. Practitioners need intuition and real-life examples of how to apply new methods to data of terabyte size. We outline the scope of Data Science training and exemplify its early-stage application with the LLC, a National Science Foundation (NSF) funded project under the supervision of Prof. Ward since 2014. Essentially, we aim to give professors, researchers and practitioners some intuition for combining data science tools into comprehensive real-life examples, guided by mentees' feedback. By discussing mentoring methods and the computational challenges of Big Data, we intend to underline its potential with some further realizations.

Keywords: Big Data, computation, mentoring, training

Procedia PDF Downloads 354
25454 Platform Development for Vero Cell Culture on Microcarriers Using Dissociation-Reassociation Method

Authors: Thanunthon Bowornsakulwong, Charukorn Charukarn, Franck Courtes, Panit Kitsubun, Lalintip Horcharoen

Abstract:

The Vero cell is a continuous cell line that is widely used for the production of viral vaccines. However, because of its adherent characteristic, the scale-up strategy in large-scale production remains complicated and thus limited. Consequently, suspension-like Vero cell culture processes based on microcarriers have been introduced and employed, which also provide increased surface area per unit volume. However, harvesting Vero cells from microcarriers is a huge challenge because of difficulties in cell detachment, lower recovery yield, time consumption and dissociation-agent carry-over. To overcome these problems, we developed a dissociation-reassociation platform technology for detaching and re-attaching cells during subculturing from microcarriers to microcarriers, which can be conveniently applied to seed train strategies in large-scale bioreactors. Herein, Hillex-2 was used to culture Vero cells in serum-containing media using spinner flasks as a scale-down model. The overall confluency of cells on the microcarriers was observed using an inverted microscope, and sample cells were detached daily in order to obtain kinetics data. Metabolite consumption and by-product formation were determined with a Nova Biomedical BioProfile FLEX.

Keywords: dissociation-reassociation, microcarrier, scale up, Vero cell

Procedia PDF Downloads 131
25453 Towards a Secure Storage in Cloud Computing

Authors: Mohamed Elkholy, Ahmed Elfatatry

Abstract:

Cloud computing has emerged as a flexible computing paradigm that has reshaped the Information Technology map. However, cloud computing has brought about a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification made to his data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. It has been evaluated against different types of attacks and proved effective in protecting cloud data storage from malicious attacks.
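The core integrity idea, letting an owner detect any modification to data held on untrusted storage, can be illustrated with a keyed message authentication code. This is a generic sketch of the integrity-tag principle only; the paper's actual mechanism is integrated with extended Kerberos authentication, which is not reproduced here.

```python
import hashlib
import hmac


def make_tag(key: bytes, data: bytes) -> bytes:
    """Owner computes an HMAC-SHA256 tag before uploading data to the
    untrusted cloud store; the key never leaves the owner's side."""
    return hmac.new(key, data, hashlib.sha256).digest()


def verify(key: bytes, data: bytes, tag: bytes) -> bool:
    """On retrieval, recompute the tag; any mismatch reveals that the
    stored data was modified. compare_digest avoids timing leaks."""
    return hmac.compare_digest(make_tag(key, data), tag)
```

In use, the owner keeps `key` (e.g. 32 random bytes) and stores `(data, tag)` in the cloud; a tampered record fails verification even though the storage provider is never trusted.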

Keywords: access control, data integrity, data confidentiality, Kerberos authentication, cloud security

Procedia PDF Downloads 330
25452 Organisational Culture and the Role of the Mental Health Nurse: An Ethnography of the New Graduate Nurse Experience

Authors: Mary-Ellen Hooper, Graeme Browne, Anthony Paul O'Brien

Abstract:

Background: It has been reported that the experience of the organisational workplace culture plays an important role in attracting new graduate mental health nurses to the discipline and retaining them. Additionally, other research indicates that a negative workplace culture contributes to their dissatisfaction and attrition rate. Method: An ethnographic research design was applied to explore the subcultural experiences of new graduate nurses as they encounter mental health nursing. Data were collected between April and September 2017 across 6 separate Australian (NSW) mental health units. Data comprised semi-structured interviews (n=24) and 31 episodes of field observation (62 hours). A total of 26 new and recently graduated nurses participated in the study: 14 new graduate nurses and 12 recently graduated nurses. Results: A key finding from this study was the new graduates' difficulty in articulating the role of the mental health nurse. Participants described a dichotomy between their ideological view of the mental health nurse and the reality of clinical practice. The participants' ideological view of the mental health nurse involved providing holistic and individualised care within a flexible framework. Participants, however, described feeling powerless to change the recovery practices within the mental health service(s) because of their low status within the hierarchy, resulting in participants choosing to fit into the existing culture or considering leaving the field altogether. Conclusion: An incongruence between the values and ideals of an organisational culture and the reality shock of practice is shown to contribute to role ambiguity among its members. New graduate nurses entering the culture of mental health nursing describe role ambiguity resulting in dissatisfaction with practice. The culture and philosophy inherent to a service are posited to be crucial in creating positive experiences for graduate nurses.

Keywords: culture, mental health nurse, mental health nursing role, new graduate nurse

Procedia PDF Downloads 150
25451 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

National Statistical Institutes hold large volumes of data, generally in formats that constrain how the information they contain can be published. Each household or business data collection project includes its own dissemination platform. These previously used dissemination methods do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with other external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty and child labor, and the general census of the population of Senegal.
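The modeling step, flattening a statistical table cell into linkable Semantic Web triples, can be sketched as follows. The namespace `http://example.org/stats/` and the dimension/measure names are hypothetical placeholders, not the authors' actual ontology; only the `rdf:type` and RDF Data Cube `qb:Observation` URIs are standard vocabulary.

```python
EX = "http://example.org/stats/"  # hypothetical namespace for illustration
RDF_TYPE = "<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>"
QB_OBSERVATION = "<http://purl.org/linked-data/cube#Observation>"


def observation_to_ntriples(obs_id, dimensions, measure, value):
    """Turn one statistical observation (e.g. one cell of an employment
    survey table) into N-Triples lines that any triple store can load
    and that other linked-data sources can point at."""
    subj = f"<{EX}obs/{obs_id}>"
    lines = [f"{subj} {RDF_TYPE} {QB_OBSERVATION} ."]
    for dim, dim_value in dimensions.items():
        # Each dimension (region, year, ...) becomes a resource-valued triple,
        # which is what makes the data linkable across datasets.
        lines.append(f"{subj} <{EX}dim/{dim}> <{EX}{dim}/{dim_value}> .")
    lines.append(f'{subj} <{EX}measure/{measure}> "{value}" .')
    return lines
```

For example, `observation_to_ntriples("1", {"region": "Dakar", "year": "2013"}, "employmentRate", 42.1)` yields one typed observation plus one triple per dimension and one for the measured value.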

Keywords: Semantic Web, linked open data, database, statistics

Procedia PDF Downloads 173
25450 Reconstruction Post-mastectomy: A Literature Review on Its Indications and Techniques

Authors: Layaly Ayoub, Mariana Ribeiro

Abstract:

Introduction: Breast cancer is currently considered the leading cause of cancer-related deaths among women in Brazil. Mastectomy, essential in this treatment, often necessitates subsequent breast reconstruction to restore physical appearance and aid in the emotional and psychological recovery of patients. The choice between immediate and delayed reconstruction is influenced by factors such as the type and stage of cancer, as well as the patient's overall health. The decision between autologous breast reconstruction and implant-based reconstruction requires a detailed analysis of individual conditions and needs. Objectives: This study analyzes the techniques and indications used in post-mastectomy breast reconstruction. Methodology: A literature review was conducted in the PubMed and SciELO databases, focusing on articles that met the inclusion and exclusion criteria and descriptors. Results: Breast reconstruction is commonly performed after mastectomy, and the technique must be chosen case by case according to each patient's specific characteristics. The tissue-expander technique is indicated for patients with sufficient skin and tissue post-mastectomy who do not require additional radiotherapy and who opt for a less complex surgery with a shorter recovery time. This procedure promotes the gradual expansion of soft tissues where the definitive implant will be placed. Both temporary and permanent expanders offer flexibility, allowing the expander size to be adjusted until the desired volume is reached and enabling the skin and tissues to adapt to the breast implant area. Conversely, autologous reconstruction is indicated for patients who will undergo radiotherapy, have insufficient tissue, or prefer a more natural solution. This technique uses the transverse rectus abdominis muscle (TRAM) flap, the latissimus dorsi muscle flap, the gluteal flap, or local muscle flaps to shape a new breast, potentially combined with a breast implant. Conclusion: In this context, it is essential to conduct a thorough evaluation of the technique to be applied, as both have their benefits and challenges.

Keywords: indications, post-mastectomy, breast reconstruction, techniques

Procedia PDF Downloads 25
25449 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia accepted responsibility by implementing the Malaysian Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the data user's appointment of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that a number of additional concerns will be associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that DPOs will encounter and how the Personal Data Protection Department should respond. The results were produced using a qualitative method based on an examination of the current literature. This research reveals that DPOs are likely to face obstacles, and that a clear, definitive guideline should therefore be in place to aid them in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 75
25448 Simulation of Hydraulic Fracturing Fluid Cleanup for Partially Degraded Fracturing Fluids in Unconventional Gas Reservoirs

Authors: Regina A. Tayong, Reza Barati

Abstract:

A stable, fast and robust three-phase, 2D IMPES simulator has been developed for assessing the influence of breaker concentration on the yield stress of the filter cake and broken-gel viscosity, of varying polymer concentration/yield stress along the fracture face, and of fracture conductivity, fracture length, capillary pressure changes and formation damage on fracturing fluid cleanup in tight gas reservoirs. This model has been validated against field data reported in the literature for the same reservoir. A 2D, two-phase (gas/water) fracture propagation model is used to model the invasion zone and create the initial conditions for the clean-up model by distributing 200 bbls of water around the fracture. A 2D, three-phase IMPES simulator incorporating a yield-power-law rheology has been developed in MATLAB to characterize fluid flow through a hydraulically fractured grid. The variation in polymer concentration along the fracture is computed from a material balance equation relating the initial polymer concentration to the total volume of injected fluid and the fracture volume. All governing equations and the methods employed have been reported in sufficient detail to permit easy replication of the results. Increasing capillary pressure in the formation simulated in this study resulted in a 10.4% decrease in cumulative production after 100 days of fluid recovery. Increasing the breaker concentration from 5 to 15 gal/Mgal, through its effect on the yield stress and fluid viscosity of a 200 lb/Mgal guar fluid, resulted in a 10.83% increase in cumulative gas production. For tight gas formations (k = 0.05 md), fluid recovery increases with increasing shut-in time, fracture conductivity and fracture length, irrespective of the yield stress of the fracturing fluid. Mechanically induced formation damage combined with hydraulic damage tends to be the most significant.
Several correlations have been developed relating the pressure distribution and polymer concentration to the distance along the fracture face, and the average polymer concentration to the injection time. The gradient in the yield stress distribution along the fracture face becomes steeper with increasing polymer concentration. The rate at which the yield stress (τ_o) increases is found to be proportional to the square of the volume of fluid lost to the formation. Finally, an improvement on previous results was achieved by simulating the yield stress variation along the fracture face rather than assuming constant values, because fluid loss to the formation and the polymer concentration distribution along the fracture face decrease with distance from the injection well. The novelty of this three-phase flow model lies in its ability to (i) simulate yield stress variation with fluid-loss volume along the fracture face for different initial guar concentrations and (ii) simulate the effect of increasing breaker activity on yield stress and broken-gel viscosity, and the effect of (i) and (ii) on cumulative gas production, within reasonable computational time.
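The two rheology relations named above can be written out as a minimal numeric sketch: the yield-power-law (Herschel-Bulkley) stress τ = τ_o + K·γ̇ⁿ, and the abstract's empirical finding that the yield-stress growth rate scales with the square of the fluid-loss volume. The proportionality constant `c` below is a hypothetical fitting parameter; this is an illustration of the functional forms, not the authors' MATLAB simulator.

```python
def yield_power_law_stress(tau_y, K, n, shear_rate):
    """Herschel-Bulkley (yield-power-law) shear stress:
    tau = tau_y + K * gamma_dot**n, where tau_y is the yield stress,
    K the consistency index and n the flow-behavior index."""
    return tau_y + K * shear_rate ** n


def yield_stress_growth_rate(c, fluid_loss_volume):
    """Sketch of the reported empirical correlation: the rate at which
    the yield stress increases, d(tau_y)/dt, is proportional to the
    square of the volume of fluid lost to the formation."""
    return c * fluid_loss_volume ** 2
```

For instance, with τ_y = 5, K = 2 and n = 0.5, a shear rate of 4 s⁻¹ gives a shear stress of 9 in consistent units, and doubling the fluid-loss volume quadruples the yield-stress growth rate.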

Keywords: formation damage, hydraulic fracturing, polymer cleanup, multiphase flow numerical simulation

Procedia PDF Downloads 128
25447 Data Collection Based on the Questionnaire Survey In-Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods used for data collection are diverse: electronic media, focus group interviews and short-answer questionnaires [1]. Collecting poor-quality data, resulting, for example, from poorly designed questionnaires, the absence of good translators or interpreters, or the incorrect recording of data, leads to conclusions that are not supported by the data, or to a focus only on the average effect of a program or policy. There are several ways to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better "anonymity" in the responses [2]. In this context, we opted to collect good-quality data by conducting a sizeable questionnaire-based survey on hospital emergencies to improve emergency services and alleviate the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 103
25446 Effects of Tramadol Administration on the Ovary of Adult Rats and the Possible Recovery after Tramadol Withdrawal: A Light and Electron Microscopic Study

Authors: Heba Kamal Mohamed

Abstract:

Introduction: Tramadol is a weak μ-opioid receptor agonist whose analgesic effect also derives from inhibition of norepinephrine and serotonin reuptake. Nowadays, tramadol hydrochloride is frequently used as a pain reliever. Tramadol is recommended for the management of acute and chronic pain of moderate to severe intensity associated with a variety of diseases and problems, including osteoarthritis, diabetic neuropathy, neuropathic pain, and even perioperative pain in human patients. In obstetrics and gynecology, tramadol is used extensively to treat postoperative pain. Aim of the study: This study was undertaken to investigate the histological (light and electron microscopic) and immunohistochemical effects of long-term tramadol treatment on the ovary of adult rats and the possible recovery after tramadol withdrawal. Design: Experimental study. Materials and methods: Thirty adult female albino rats were used in this study. They were classified into three main groups (10 rats each). Group I served as the control group. In group II, rats were subcutaneously injected with tramadol 40 mg/kg three times per week for 8 weeks. In group III, rats were subcutaneously injected with tramadol 40 mg/kg three times per week for 8 weeks and were then kept for another 8 weeks without treatment for recovery. At the end of the experiment, the rats were sacrificed and bilateral oophorectomy was carried out; the ovaries were processed for histological study (light and electron microscopic) and for immunohistochemical reaction for caspase-3 (an apoptotic protein). Results: Examination of the ovaries of tramadol-treated rats (group II) revealed many atretic ovarian follicles; some follicles showed detachment of the oocyte from the surrounding granulosa cells, and others showed loss of the oocyte. Many follicles revealed degenerated, vacuolated oocytes and vacuolated theca folliculi cells. Granulosa cells appeared shrunken, disrupted and loosely attached, with vacuolated cytoplasm and pyknotic nuclei.
Some follicles showed separation of the granulosa cells from the theca folliculi layer. The ultrastructural study revealed granulosa cells with electron-dense indented nuclei, damaged mitochondria and granular, vacuolated cytoplasm. Other cells showed accumulation of a large amount of lipid droplets in their cytoplasm. Some follicles revealed rarefaction of the cytoplasm of oocytes and an absent zona pellucida. Moreover, apoptotic changes were detected by immunohistochemical staining in the form of increased staining intensity for caspase-3. With Masson's trichrome stain, there was increased collagen fibre deposition in the ovarian cortical stroma. The walls of blood vessels appeared thickened. In the withdrawal group (group III), there was a slight improvement in the histological and immunohistochemical changes. Conclusion: Tramadol had serious deleterious effects on ovarian structure. Thus, it should be used with caution, especially when long-term treatment is indicated. Withdrawal of tramadol led to a slight improvement in the structural impairment of the ovary.

Keywords: tramadol, ovary, withdrawal, rats

Procedia PDF Downloads 292
25445 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional Neural Network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulation around data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (like smartphones) with feeble wireless connectivity in rural/remote areas. In this paper, I highlight the Federated Learning approach as a way to mitigate data privacy and security issues.
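The key mechanism, training without moving patient data off the device, comes down to the server aggregating only model parameters. A minimal sketch of one federated-averaging round (in the spirit of FedAvg, using plain lists in place of real CNN tensors; the function name is illustrative, not a library API):

```python
def federated_average(client_weights, client_sizes):
    """One aggregation round: the server averages client model parameters
    weighted by each client's local sample count. Only the parameter
    vectors travel to the server; the raw health records never do."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * size for w, size in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]
```

For example, two hospitals holding 1 and 3 local samples contribute their locally trained weights, and the global model lands proportionally closer to the larger client's parameters.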

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 138