Search results for: data stream
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25136

24926 Numerical Study of Laminar Separation Bubble Over an Airfoil Using γ-ReθT SST Turbulence Model at Moderate Reynolds Number

Authors: Younes El Khchine, Mohammed Sriti

Abstract:

A parametric study has been conducted to analyse the flow around the S809 wind turbine airfoil in order to better understand the characteristics and effects of the laminar separation bubble (LSB) on aerodynamic design for maximizing wind turbine efficiency. Numerical simulations were performed at low Reynolds number by solving the Unsteady Reynolds Averaged Navier-Stokes (URANS) equations on a C-type structured mesh using the γ-Reθt turbulence model. A two-dimensional study was conducted for a chord Reynolds number of 1×10⁵ and angles of attack (AoA) between 0 and 20.15 degrees. The simulation results obtained for the aerodynamic coefficients at various AoA were compared with XFoil results. A sensitivity study was performed to examine the effects of Reynolds number and free-stream turbulence intensity on the location and length of the laminar separation bubble and on the aerodynamic performance of the wind turbine. The results show that increasing the Reynolds number delays laminar separation on the upper surface of the airfoil. The increase in Reynolds number also accelerates the transition process, and the turbulent reattachment point moves closer to the leading edge owing to an earlier reattachment of the turbulent shear layer. This leads to a considerable reduction in the length of the separation bubble as the Reynolds number is increased. An increase in the level of free-stream turbulence intensity leads to a decrease in separation bubble length and an increase in the lift coefficient, while having negligible effects on the stall angle. When the AoA is increased, the bubble on the suction surface of the airfoil was found to move upstream toward the leading edge, causing earlier laminar separation.

Keywords: laminar separation bubble, turbulence intensity, S809 airfoil, transition model, Reynolds number

Procedia PDF Downloads 71
24925 Subband Coding and Glottal Closure Instant (GCI) Using SEDREAMS Algorithm

Authors: Harisudha Kuresan, Dhanalakshmi Samiappan, T. Rama Rao

Abstract:

In modern telecommunication applications, locating Glottal Closure Instants (GCIs) is important, and they are evaluated directly from the speech waveform. Here, we study GCI detection using the Speech Event Detection using Residual Excitation and the Mean Based Signal (SEDREAMS) algorithm. Speech coding uses parameter estimation based on audio signal processing techniques to model the speech signal, combined with generic data compression algorithms to represent the resulting model in a compact bit stream. This paper proposes a sub-band coder (SBC), a type of transform coding, and evaluates its performance for GCI detection using SEDREAMS. In SBC, the speech signal is divided into two or more frequency bands, and each of these sub-band signals is coded individually. After being processed, the sub-bands are recombined to form the output signal, whose bandwidth covers the whole frequency spectrum. The signal is thus decomposed into low- and high-frequency components, and decimation and interpolation are performed in the frequency domain. The proposed structure significantly reduces error, and precise locations of GCIs are found using the SEDREAMS algorithm.
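
The sub-band principle described here can be illustrated with a minimal, hedged sketch: the signal is split into decimated low- and high-frequency sub-bands, each of which could then be coded individually, and the decoder interpolates and recombines them. The averaging/differencing (Haar) filter pair below is an assumption chosen for brevity; it is not the authors' SBC filters or the SEDREAMS detector.

```python
import numpy as np

def analysis(x):
    """Split a signal into decimated low- and high-frequency sub-bands (Haar pair)."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                               # pad to an even length
        x = np.append(x, x[-1])
    low = (x[0::2] + x[1::2]) / np.sqrt(2)       # smooth branch (low-pass + decimate by 2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail branch (high-pass + decimate by 2)
    return low, high

def synthesis(low, high):
    """Interpolate and recombine the sub-bands so the output covers the full spectrum."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

# toy "speech" frame: each sub-band could now be quantised/coded on its own
t = np.arange(256)
frame = np.sin(2 * np.pi * 0.01 * t) + 0.2 * np.sin(2 * np.pi * 0.3 * t)
lo, hi = analysis(frame)
print(np.allclose(synthesis(lo, hi), frame))     # True: perfect reconstruction
```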

Keywords: SEDREAMS, GCI, SBC, GOI

Procedia PDF Downloads 347
24924 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach

Authors: Theertha Chandroth

Abstract:

This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its own syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
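
A minimal sketch of the kind of check discussed, using only the Python standard library: the same record is received once as JSON and once as XML, both are flattened to dictionaries, and a test reports the fields whose values differ. The record layout and field names are hypothetical, not taken from the paper.

```python
import json
import xml.etree.ElementTree as ET

def xml_record_to_dict(xml_text):
    """Flatten a simple one-level XML record into a dict of tag -> text."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

def compare(json_text, xml_text):
    """Return the set of fields whose values differ between the two representations."""
    json_data = json.loads(json_text)
    xml_data = xml_record_to_dict(xml_text)
    keys = set(json_data) | set(xml_data)
    return {k for k in keys if str(json_data.get(k)) != xml_data.get(k)}

# hypothetical customer record exchanged in both formats
json_payload = '{"id": "42", "name": "Ada", "city": "Paris"}'
xml_payload = "<customer><id>42</id><name>Ada</name><city>Lyon</city></customer>"
print(compare(json_payload, xml_payload))   # {'city'} -> a mismatch for the test to report
```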

Keywords: XML, JSON, data comparison, integration testing, Python, SQL

Procedia PDF Downloads 128
24923 A Smart Sensor Network Approach Using Affordable River Water Level Sensors

Authors: Dian Zhang, Brendan Heery, Maria O’Neill, Ciprian Briciu-Burghina, Noel E. O’Connor, Fiona Regan

Abstract:

Recent developments in sensors, wireless data communication, and cloud computing have brought the sensor web to a whole new generation. The introduction of the 'Internet of Things (IoT)' concept has brought sensor research to a new level, which involves the development of long-lasting, low-cost, environmentally friendly, and smart sensors; new wireless data communication technologies; big data analytics algorithms; and cloud-based solutions tailored to large-scale smart sensor networks. The next generation of smart sensor networks consists of several layers: the physical layer, where all the smart sensors reside and data pre-processing occurs, either on the sensor itself or on a field gateway; the data transmission layer, where data and instruction exchanges happen; and the data processing layer, where meaningful information is extracted and organized from the pre-processed data stream. There are many definitions of a smart sensor; however, to summarize them, a smart sensor must be intelligent and adaptable. In future large-scale sensor networks, the collected data are far too large for traditional applications to send, store, or process, so the sensor unit must be intelligent enough to pre-process the collected data locally on board (this process may occur on the field gateway, depending on the sensor network structure). In this case study, three smart sensing methods, corresponding to simple thresholding, a statistical model, and the machine-learning-based MoPBAS method, are introduced, and their strengths and weaknesses are discussed as an introduction to the smart sensing concept. Data fusion, the integration of data and knowledge from multiple sources, is a key component of the next generation smart sensor network. For example, in a water level monitoring system, a weather forecast can be obtained from external sources, and if heavy rainfall is expected, the server can send instructions to the sensor nodes to, for instance, increase the sampling rate or, conversely, switch on sleep mode. In this paper, we describe the deployment of 11 affordable water level sensors in the Dodder catchment in Dublin, Ireland. The objective is to use this deployed river level sensor network as a case study to give a vision of the next generation of smart sensor networks for flood monitoring, assisting agencies in making decisions about deploying resources in the case of a severe flood event. Some of the deployed sensors are located alongside traditional water level sensors for validation purposes. Each key component of the smart sensor network is discussed, which will hopefully inspire researchers working in the sensor research domain.
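
The simple-thresholding idea and the forecast-driven data-fusion rule described above can be sketched in a few lines. This is a hedged illustration only: the threshold, sampling intervals, and class names are assumptions, not the deployed MoPBAS method or the actual Dodder network firmware.

```python
from dataclasses import dataclass

@dataclass
class LevelSensorNode:
    alert_level_m: float = 1.5          # assumed flood threshold (metres)
    sample_interval_s: int = 900        # default: one reading every 15 minutes

    def on_reading(self, level_m: float) -> str:
        """On-board simple thresholding: only 'interesting' events leave the node."""
        return "ALERT" if level_m >= self.alert_level_m else "OK"

    def on_forecast(self, rain_mm_per_h: float) -> None:
        """Data-fusion rule: a heavy-rain forecast pushed from an external source
        increases the sampling rate; otherwise the node falls back to sleep mode."""
        self.sample_interval_s = 60 if rain_mm_per_h > 10 else 900

node = LevelSensorNode()
node.on_forecast(rain_mm_per_h=14.2)        # server pushes a weather forecast
print(node.sample_interval_s)               # 60 -> sampling every minute
print(node.on_reading(1.72))                # ALERT -> only this event is transmitted
```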

Keywords: smart sensing, internet of things, water level sensor, flooding

Procedia PDF Downloads 374
24922 Multi-Source Data Fusion for Urban Comprehensive Management

Authors: Bolin Hua

Abstract:

In city governance, various data are involved, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form and source because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues that need to be resolved. Problems of data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata needs to be referred to and read whenever an application accesses, manipulates, or displays the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.
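
A minimal, hedged sketch of metadata-driven fusion as described above: records from two source systems are merged into one central record keyed by a shared entity identifier, with a small metadata catalogue consulted to map each source field onto the central schema. Source names, keys, and fields are invented for illustration.

```python
# Metadata catalogue: how each source maps onto the central schema (illustrative names).
metadata = {
    "housing_db": {"key": "parcel_id", "fields": {"addr": "address"}},
    "census_db":  {"key": "parcel_id", "fields": {"pop":  "population"}},
}

housing_rows = [{"parcel_id": "P-001", "addr": "12 Elm St"}]
census_rows  = [{"parcel_id": "P-001", "pop": 37}]

def fuse(sources):
    """Merge rows from several source systems into a uniform central store."""
    central = {}
    for name, rows in sources.items():
        meta = metadata[name]
        for row in rows:
            rec = central.setdefault(row[meta["key"]], {})
            for src_field, central_field in meta["fields"].items():
                rec[central_field] = row[src_field]      # rename via the metadata
    return central

print(fuse({"housing_db": housing_rows, "census_db": census_rows}))
# {'P-001': {'address': '12 Elm St', 'population': 37}}
```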

Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data

Procedia PDF Downloads 384
24921 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Nowadays, given the ever-increasing volume of data produced by human activity, methods such as data mining for extracting knowledge are unavoidable. One issue in data mining is the inherent distribution of the data: the databases creating or receiving such data usually belong to corporate or private parties who do not give their information freely to others. Yet there is no guarantee that particular data can be mined without intruding on the owner's privacy. Sending data and then gathering them, whether by vertically or horizontally partitioned approaches, depends on the type of privacy preservation used and is carried out to improve data privacy. In this study, privacy-preserving data mining methods are compared comprehensively; general methods such as data randomization and encoding are also examined, along with the strong and weak points of each one.
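
One of the general methods mentioned, random-data perturbation, can be shown with a hedged sketch: each data owner adds noise before sharing, so individual records are masked while aggregate statistics remain approximately usable by the miner. The noise scale and values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# each data owner holds sensitive values (e.g., salaries) it will not share in the clear
owner_a = np.array([3100.0, 2800.0, 4500.0])
owner_b = np.array([5200.0, 3900.0])

def randomize(values, noise_scale=500.0):
    """Additive random perturbation: individual records are masked, while the
    aggregate statistics of a large enough sample are approximately preserved."""
    return values + rng.normal(0.0, noise_scale, size=values.shape)

shared = np.concatenate([randomize(owner_a), randomize(owner_b)])
true_mean = np.concatenate([owner_a, owner_b]).mean()
print(round(true_mean, 1), round(shared.mean(), 1))   # aggregate survives, raw rows do not
```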

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 517
24920 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit these data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing the provider of services (e.g., changing a bank or a cloud computing service provider). Therefore, it will contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 468
24919 Knowledge Management and Motivation Management: Important Constituents of Firm Performance

Authors: Yassir Mahmood, Nadia Ehsan

Abstract:

In the current research stream, empirical work regarding knowledge and motivation management along their dimensions is sparse. This study partially fills this void by investigating the influence of knowledge management (tacit and explicit) and motivation management (intrinsic and extrinsic) on firm performance, with the mediating effect of innovative performance. Based on a quantitative research method, data were collected through a questionnaire from 284 employees working in 18 different firms across the citrus industry located in the Sargodha region (Pakistan). The proposed relationships were tested through regression analysis, while the mediation relations were analyzed through the Baron and Kenny (1986) technique. The results suggested that knowledge management (KM) and motivation management (MM) have significant positive impacts on innovative performance (IP). In addition, the role of IP as a full mediator between KM and firm performance (FP) is confirmed. IP also proved to be a partial mediator between MM and FP. From the managerial perspective, the findings of the study are vital, as some of the important constituents of FP have been highlighted and important underpinnings for managers are produced. Finally, implications for policymakers along with future research directions are discussed.
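
The Baron and Kenny (1986) steps used here can be illustrated with a hedged, self-contained sketch on synthetic data (not the study's survey data): regress the outcome on the predictor, the mediator on the predictor, and the outcome on both, then compare the direct effect with the total effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 284                                   # sample size as in the study
km = rng.normal(size=n)                   # knowledge management (predictor), synthetic
ip = 0.6 * km + rng.normal(scale=0.5, size=n)               # innovative performance (mediator)
fp = 0.7 * ip + 0.02 * km + rng.normal(scale=0.5, size=n)   # firm performance (outcome)

def slopes(y, *xs):
    """OLS coefficients of y on the given regressors (intercept included, then dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

# Baron & Kenny: (1) X -> Y, (2) X -> M, (3) Y on both M and X
c = slopes(fp, km)[0]            # total effect of KM on FP
a = slopes(ip, km)[0]            # effect of KM on IP
b, c_prime = slopes(fp, ip, km)  # effect of IP on FP, and of KM on FP controlling for IP
print(f"c={c:.2f}  a={a:.2f}  b={b:.2f}  c'={c_prime:.2f}")
# c' shrinking toward zero relative to c indicates (full) mediation through IP
```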

Keywords: innovative performance, firm performance, knowledge management, motivation management, Sargodha

Procedia PDF Downloads 153
24918 Flow Reproduction Using Vortex Particle Methods for Wake Buffeting Analysis of Bluff Structures

Authors: Samir Chawdhury, Guido Morgenthal

Abstract:

The paper presents a novel extension of Vortex Particle Methods (VPM) in which the study aims to reproduce a template simulation of the complex flow field generated by an impulsively started flow past an upstream bluff body at a certain Reynolds number Re. Vibration of a structural system under upstream wake flow is often considered its governing design criterion; therefore, particular attention is given in this study to the reproduction of the wake flow simulation. The basic methodology for the implementation of the flow reproduction requires downstream velocity sampling from the template flow simulation: at particular distances from the upstream section, the instantaneous velocity components are sampled using a series of square sampling cells arranged vertically, where each cell contains four velocity sampling points at its corners. Since the grid-free Lagrangian VPM algorithm discretises vorticity on particle elements, the method requires the transformation of the velocity components into vortex circulation, and finally the reproduction of the template flow field by seeding these vortex circulations, or particles, into a free-stream flow. It is noteworthy that the vortex particles have to be released into the free stream at exactly the same rate as the velocity sampling. Studies have been carried out, specifically, on different sampling rates and velocity sampling positions to find their effects on flow reproduction quality. The quality assessments are mainly done, using a downstream flow monitoring profile, by comparing the characteristic wind flow profiles using several statistical turbulence measures. Additionally, the comparisons are performed using velocity time histories, snapshots of the flow fields, and the vibration of a downstream bluff section, by performing wake buffeting analyses of the section under the original and reproduced wake flows. A convergence study is performed to validate the method. The study also describes how flow reproductions could be achieved with less computational effort.
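
The transformation of sampled velocities into a vortex circulation can be illustrated with a hedged sketch that evaluates Kelvin's definition Γ = ∮ u·dl around one square sampling cell with the trapezoidal rule over its four corner samples. This is a generic textbook illustration, not the authors' VPM implementation; the cell size and swirl field are assumptions.

```python
import numpy as np

def cell_circulation(u, v, h):
    """Circulation around a square sampling cell of side h from the velocity components
    (u, v) at its four corners, ordered counter-clockwise from the bottom-left corner:
    trapezoidal line integral of u·dl along each edge."""
    (u1, u2, u3, u4), (v1, v2, v3, v4) = u, v
    gamma  = 0.5 * (u1 + u2) * h      # bottom edge, +x direction
    gamma += 0.5 * (v2 + v3) * h      # right edge,  +y direction
    gamma -= 0.5 * (u3 + u4) * h      # top edge,    -x direction
    gamma -= 0.5 * (v4 + v1) * h      # left edge,   -y direction
    return gamma

# corners of a 0.1 m cell immersed in a solid-body-like swirl (for illustration)
h = 0.1
corners = np.array([[0, 0], [h, 0], [h, h], [0, h]])
omega = 2.0                                         # assumed angular velocity, rad/s
u = -omega * (corners[:, 1] - h / 2)                # u = -omega * (y - y_centre)
v =  omega * (corners[:, 0] - h / 2)                # v =  omega * (x - x_centre)
print(cell_circulation(u, v, h), 2 * omega * h**2)  # both equal vorticity * cell area
```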

Keywords: vortex particle method, wake flow, flow reproduction, wake buffeting analysis

Procedia PDF Downloads 307
24917 Spatial Rank-Based High-Dimensional Monitoring through Random Projection

Authors: Chen Zhang, Nan Chen

Abstract:

High-dimensional process monitoring has become increasingly important in many application domains, where usually the process distribution is unknown and much more complicated than the normal distribution, and the between-stream correlation cannot be neglected. However, since the process dimension is generally much bigger than the reference sample size, most traditional nonparametric multivariate control charts fail in high-dimensional cases due to the curse of dimensionality. Furthermore, when the process goes out of control, the influenced variables are quite sparse compared with the whole dimension, which increases the detection difficulty. Targeting these issues, this paper proposes a new nonparametric monitoring scheme for high-dimensional processes. This scheme first projects the high-dimensional process into several subprocesses using random projections for dimension reduction. Then, for every subprocess, whose dimension is much smaller than the reference sample size, a local nonparametric control chart is constructed based on the spatial rank test to detect changes in this subprocess. Finally, the results of all the local charts are fused together for decision making. Furthermore, after an out-of-control (OC) alarm is triggered, a diagnostic framework using the square-root LASSO is proposed. Numerical studies demonstrate that the chart has satisfactory detection power for sparse OC changes and robust performance for non-normally distributed data. The diagnostic framework is also effective in identifying the truly changed variables. Finally, a real-data example is presented to demonstrate the application of the proposed method.
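
A hedged sketch of the dimension-reduction step: the p-dimensional observation is projected onto several random low-dimensional subprocesses, a simple spatial-rank-type statistic is computed for each against the projected reference sample, and the local results are fused. The Gaussian projection, the statistic, and the max-fusion rule below are simplifications for illustration, not the paper's exact chart or control limits.

```python
import numpy as np

rng = np.random.default_rng(2)
p, m, k = 500, 200, 10            # process dimension, reference sample size, subspace dim

reference = rng.normal(size=(m, p))              # in-control reference sample
x = rng.normal(size=p)
x[:5] += 3.0                                     # sparse out-of-control shift on 5 variables

def spatial_rank_stat(proj_ref, proj_x):
    """Norm of the spatial rank of proj_x w.r.t. the projected reference sample,
    i.e. the average of unit vectors pointing from each reference point to proj_x."""
    diffs = proj_x - proj_ref
    units = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)
    return np.linalg.norm(units.mean(axis=0))

# several random projections -> several low-dimensional local charts, then fuse (max)
stats = []
for _ in range(20):
    R = rng.normal(size=(p, k)) / np.sqrt(k)     # random projection matrix
    stats.append(spatial_rank_stat(reference @ R, x @ R))
print(max(stats))   # fused monitoring statistic; large values would trigger an alarm
```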

Keywords: random projection, high-dimensional process control, spatial rank, sequential change detection

Procedia PDF Downloads 294
24916 A High Compression Ratio for a Lossless Image Compression Based on the Arithmetic Coding with the Sorted Run Length Coding: Meteosat Second Generation Image Compression

Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane

Abstract:

Image compression is at the heart of several multimedia techniques. It is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite allows the acquisition of 12 image files every 15 minutes, which results in large database sizes. In this paper, a novel image compression method based on arithmetic coding with Sorted Run Length Coding (SRLC) for MSG images is proposed. The SRLC finds the occurrences of consecutive pixels of the original image to create a sorted run. Arithmetic coding then encodes the sorted data of the previous stage to produce a unique code word that represents a binary code stream in the sorted order, boosting the compression ratio. Through this article, we show that our method achieves better results in terms of compression ratio and bit rate than the method based on Run Length Coding (RLC) and arithmetic coding. Evaluation criteria such as the compression ratio and the bit rate confirm the efficiency of our image compression method.
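
A hedged sketch of the run-length stage: runs of consecutive identical pixels are collected as (value, length) pairs and then sorted, so the downstream entropy coder sees a more regular stream. This is one plausible reading of the SRLC idea for illustration only; the arithmetic coder itself is omitted.

```python
from itertools import groupby

def sorted_rlc(pixels):
    """Run Length Coding followed by sorting of the (value, run length) pairs.
    The sorted run stream is what would be fed to the arithmetic coder."""
    runs = [(value, len(list(group))) for value, group in groupby(pixels)]
    return sorted(runs)

# one illustrative scan line of an 8-bit MSG-like image
line = [12, 12, 12, 200, 200, 12, 12, 255, 255, 255, 255]
print(sorted_rlc(line))
# [(12, 2), (12, 3), (200, 2), (255, 4)] -- runs grouped by value, ready for entropy coding
```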

Keywords: image compression, arithmetic coding, Run Length Coding, RLC, Sorted Run Length Coding, SRLC, Meteosat Second Generation, MSG

Procedia PDF Downloads 346
24915 Predicting the Turbulence Intensity, Excess Energy Available and Potential Power Generated by Building Mounted Wind Turbines over Four Major UK Cities

Authors: Emejeamara Francis

Abstract:

The future of potential wind energy applications within suburban/urban areas currently faces various problems. These include insufficient assessment of the urban wind resource and of the effectiveness of commercial gust control solutions, as well as the unavailability of effective and affordable tools for scoping the potential of urban wind applications within built-up environments. In order to achieve an effective assessment of the potential of urban wind installations, an estimation of the total energy that would be available to them, were effective control systems to be used, and an evaluation of the potential power to be generated by the wind system are required. This paper presents a methodology for predicting the power generated by a wind system operating within an urban wind resource. The method was developed by using high temporal resolution wind measurements from eight potential sites within the urban and suburban environment as inputs to a vertical axis wind turbine multiple stream tube model. A relationship between the unsteady performance coefficient obtained from the stream tube model results and turbulence intensity is demonstrated. Hence, an analytical methodology for estimating the unsteady power coefficient at a potential turbine site is proposed. This is combined with analytical models developed to predict the wind speed and the excess energy content (EEC) available in order to estimate the potential power generated by wind systems at different heights within a built environment. Estimates of turbulence intensity, wind speed, EEC, and turbine performance based on the current methodology allow a more complete assessment of the available wind resource and of potential urban wind projects. The methodology is applied to four major UK cities, namely Leeds, Manchester, London, and Edinburgh, and the potential to map turbine performance at different heights within a typical urban city is demonstrated.
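
A hedged sketch of the final power estimate: a mean wind speed, a turbulence intensity, and an excess-energy correction are combined with an unsteady power coefficient in the standard wind power equation. The EEC ≈ 3·Iu² correction (from the mean of the cubed gusty wind speed) and all numerical values are assumptions for illustration, not the paper's fitted analytical models.

```python
import math

def turbine_power_w(u_mean, turbulence_intensity, cp_unsteady, rotor_area_m2,
                    air_density=1.225):
    """Potential power of a small turbine in gusty urban wind:
    P = 0.5 * rho * A * Cp * U^3 * (1 + EEC), where the excess energy content (EEC)
    accounts for the extra kinetic energy carried by the wind speed fluctuations."""
    eec = 3.0 * turbulence_intensity ** 2        # assumed: EEC ~ 3*Iu^2 for modest Iu
    return 0.5 * air_density * rotor_area_m2 * cp_unsteady * u_mean ** 3 * (1.0 + eec)

# illustrative rooftop site: 4.5 m/s mean wind, 25% turbulence intensity, 2 m rotor
area = math.pi * 1.0 ** 2
print(round(turbine_power_w(4.5, 0.25, cp_unsteady=0.30, rotor_area_m2=area), 1), "W")
```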

Keywords: small-scale wind, turbine power, urban wind energy, turbulence intensity, excess energy content

Procedia PDF Downloads 270
24914 Factors Promoting French-English Tweets in France

Authors: Taoues Hadour

Abstract:

Twitter has become a popular means of communication used in a variety of fields, such as politics, journalism, and academia. This widely used online platform has an impact on the way people express themselves and is changing language usage worldwide at an unprecedented pace. The language used online reflects the linguistic battle that has been going on for several decades in French society. This study enables a deeper understanding of users' linguistic behavior online. The implications are important and allow for a rise in awareness of intercultural and cross-language exchanges. This project investigates the mixing of French-English language usage among French users of Twitter using a topic analysis approach. This analysis draws on Gumperz's theory of conversational switching. In order to collect tweets at a large scale, the data were collected in R using the rtweet package to access and retrieve French tweet data through Twitter's REST and streaming APIs (Application Programming Interfaces), within RStudio, the integrated development environment for R. The dataset was filtered manually, and certain repetitions of themes were observed. A total of nine topic categories were identified and analyzed in this study: entertainment, internet/social media, events/community, politics/news, sports, sex/pornography, innovation/technology, fashion/makeup, and business. The study reveals that entertainment is the most frequent topic discussed on Twitter. Entertainment includes movies, music, games, and books. Anglicisms such as trailer, spoil, and live are identified in the data. Change in language usage is inevitable and is a natural result of linguistic interactions. The use of different languages online is just an example of what the real world would look like without linguistic regulations. Social media reveals a multicultural and multilinguistic richness which can deepen and expand our understanding of contemporary human attitudes.

Keywords: code-switching, French, sociolinguistics, Twitter

Procedia PDF Downloads 130
24913 Linearization and Process Standardization of Construction Design Engineering Workflows

Authors: T. R. Sreeram, S. Natarajan, C. Jena

Abstract:

Civil engineering construction is a network of tasks involving varying degrees of complexity, and streamlining and standardization are the only way to establish a systemic approach to design. While there are off-the-shelf tools such as AutoCAD that play a role in the realization of a design, the repeatable process in which these tools are deployed is often ignored. The present paper addresses this challenge through a sustainable design process and effective standardization at all stages of the design workflow. The approach is demonstrated through a case study in the context of construction, and further improvement points are highlighted.

Keywords: syste, lean, value stream, process improvement

Procedia PDF Downloads 118
24912 Recent Advances in Data Warehouse

Authors: Fahad Hanash Alzahrani

Abstract:

This paper describes some recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining techniques, covering the software, hardware, data mining algorithms, and visualisation techniques that share common features across the specific problems and tasks of their implementation.

Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing

Procedia PDF Downloads 395
24911 A Systematic Review of Street-Level Policy Entrepreneurship Strategies in Different Political Contexts

Authors: Hui Wang, Huan Zhang

Abstract:

This study uses systematic review and qualitative comparative analysis methods to comprehensively inquire about the recent street-level policy entrepreneurship research, to identify the characteristics and lessons we can learn from 20 years of street-level policy entrepreneurship literature, and the relations between political contexts and street-level policy entrepreneurs’ strategies. Using data from a systematic review of street-level policy entrepreneurship literature, we identify the sub-components of different political contexts and core strategies of street-level policy entrepreneurs and estimate the configurational relations between different political settings and street-level policy entrepreneurs’ strategies. Our results show that street-level policy entrepreneurs display social acuity, define the problem, and build team strategies when policy or political streams dominate. Street-level policy entrepreneurs will use lead-by-example strategies when both policy and political streams dominate. Furthermore, street-level policy entrepreneurs will use bureaucratic strategies, even if no stream dominates in the political context.

Keywords: policy entrepreneurs, qualitative comparative analysis, street-level bureaucracy, systematic review

Procedia PDF Downloads 100
24910 Linear Evolution of Compressible Görtler Vortices Subject to Free-Stream Vortical Disturbances

Authors: Samuele Viaro, Pierre Ricco

Abstract:

Görtler instabilities develop in boundary layers from an imbalance between pressure and centrifugal forces caused by concave surfaces. Their spatial streamwise evolution influences transition to turbulence. It is therefore important to understand even the early stages, where the perturbations, still small, grow linearly and can be controlled more easily. This work presents a rigorous theoretical framework for compressible flows using the linearized unsteady boundary region equations, where only the streamwise pressure gradient and streamwise diffusion terms are neglected from the full governing equations of fluid motion. Boundary and initial conditions are imposed through an asymptotic analysis in order to account for the interaction of the boundary layer with free-stream turbulence. The resulting parabolic system is discretized with a second-order finite difference scheme. Realistic flow parameters are chosen from wind tunnel studies performed at supersonic and subsonic conditions. The Mach number ranges from 0.5 to 8, with two different radii of curvature, 5 m and 10 m, frequencies up to 2000 Hz, and vortex spanwise wavelengths from 5 mm to 20 mm. The evolution of the perturbation flow is shown through velocity, temperature, and pressure profiles relatively close to the leading edge, where non-linear effects can still be neglected, and through the growth rate. Results show that a global stabilizing effect exists with increasing Mach number, frequency, spanwise wavenumber, and radius of curvature. In particular, at high Mach numbers curvature effects are less pronounced and thermal streaks become stronger than velocity streaks. This increase in temperature perturbations saturates at approximately Mach 4 and is limited to the early stage of growth, near the leading edge. In general, Görtler vortices evolve closer to the surface with respect to a flat plate scenario, but their location shifts toward the edge of the boundary layer as the Mach number increases. In fact, a jet-like behavior appears for steady vortices having small spanwise wavelengths (less than 10 mm) at Mach 8, creating a region of unperturbed flow close to the wall. A similar response is also found at the highest frequency considered for a Mach 3 flow. Larger vortices are found to have a higher growth rate but are less influenced by the Mach number. An eigenvalue approach is also employed to study the amplification of the perturbations sufficiently far downstream from the leading edge. These eigenvalue results are compared with the ones obtained through the initial value approach with inhomogeneous free-stream boundary conditions. All of the parameters studied here have a significant influence on the evolution of the instabilities for the Görtler problem, which is indeed highly dependent on initial conditions.

Keywords: compressible boundary layers, Görtler instabilities, receptivity, turbulence transition

Procedia PDF Downloads 247
24909 How to Use Big Data in Logistics Issues

Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy

Abstract:

Big Data stands for today's cutting-edge technology. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Out of the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what big data is and how it is used in both military and commercial logistics.

Keywords: big data, logistics, operational efficiency, risk management

Procedia PDF Downloads 637
24908 Phytoremediation of Cr from Tannery Effluent by Vetiver Grass

Authors: Mingizem Gashaw Seid

Abstract:

Phytoremediation of chromium by vetiver grass was investigated in a hydroponic system. The removal efficiencies for organic load, nutrients, and chromium were evaluated as a function of the concentration of the waste effluent (40 and 50% dilution with distilled water). Under these conditions, 64.49-94.06% of the chromium was removed. This shows that vetiver grass has potential for the accumulation of chromium from a tannery wastewater stream.

Keywords: chromium, phytoremediation, tannery effluent, vetiver grass

Procedia PDF Downloads 410
24907 Hydrographic Mapping Based on the Concept of Fluvial-Geomorphological Auto-Classification

Authors: Jesús Horacio, Alfredo Ollero, Víctor Bouzas-Blanco, Augusto Pérez-Alberti

Abstract:

Rivers have traditionally been classified, assessed and managed in terms of hydrological, chemical and/or biological criteria. Geomorphological classifications played a secondary role in the past, although proposals like the River Styles Framework, the Catchment Baseline Survey or the Stroud Rural Sustainable Drainage Project did incorporate geomorphology for management decision-making. In recent years, many studies have turned their attention to the geomorphological component. The geomorphological processes and their associated forms determine the structure of a river system, and understanding these processes and forms is a critical component of the sustainable rehabilitation of aquatic ecosystems. The fluvial auto-classification approach suggests that a river is a self-built natural system, with processes and forms designed to effectively preserve its ecological function (hydrologic, sedimentological and biological regime). Fluvial systems are formed by a wide range of elements with multiple non-linear interactions on different spatial and temporal scales. Besides, the fluvial auto-classification concept is built using data from the river itself, so that each classification developed is peculiar to the river studied. The variables used in the classification are specific stream power and mean grain size; a discriminant analysis showed that these variables best characterize the processes and forms. The statistical technique applied yields an individual discriminant equation for each geomorphological type. The geomorphological classification was developed using sites with high naturalness, each site being a control point of high ecological and geomorphological quality. Changes in the conditions of the control points will be quickly recognizable, making it easy to apply the right management measures to recover the geomorphological type. The study focused on Galicia (NW Spain), and the mapping was made by analyzing 122 control points (sites) distributed over eight river basins. In sum, this study provides a method for fluvial geomorphological classification that works as an open and flexible tool underlying the fluvial auto-classification concept. The hydrographic mapping is the visual expression of the results, such that each river has a particular map according to its geomorphological characteristics. Each geomorphological type is represented by a particular type of hydraulic geometry (channel width, width-depth ratio, hydraulic radius, etc.). An alteration of this geometry is indicative of a geomorphological disturbance (whether natural or anthropogenic). Hydrographic mapping is also dynamic, because its meaning changes if there is a modification in the specific stream power and/or the mean grain size, that is, in the value of their equations. The researcher has to check some of the control points annually. This procedure allows the geomorphological quality of the rivers to be monitored and any alterations to be detected. The maps are useful to researchers and managers, especially for conservation work and river restoration.
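
The classification step on the two variables named above can be shown with a hedged sketch: a discriminant analysis is fitted on (specific stream power, mean grain size) so each geomorphological type receives its own discriminant function, and a re-surveyed control point that drifts to another type would flag a disturbance. The control-point values and type labels are invented for illustration, not the Galician data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# hypothetical control points: [specific stream power (W/m2), mean grain size (mm)]
X = np.array([[15,  2], [22,  4], [18,  3],      # type A: low-energy, fine material
              [80, 45], [95, 60], [110, 70],     # type B: high-energy, coarse material
              [45, 15], [52, 20], [40, 12]])     # type C: intermediate
y = np.array(["A", "A", "A", "B", "B", "B", "C", "C", "C"])

lda = LinearDiscriminantAnalysis().fit(X, y)     # one discriminant function per type

# a re-surveyed control point: a shift away from its original type would indicate
# a geomorphological disturbance of the reach (natural or anthropogenic)
print(lda.predict([[60, 25]]))                   # e.g. ['C']
print(lda.predict_proba([[60, 25]]).round(2))    # membership probabilities per type
```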

Keywords: fluvial auto-classification concept, mapping, geomorphology, river

Procedia PDF Downloads 362
24906 Temperature Effects on CO₂ Intake of MIL-101 and ZIF-301

Authors: M. Ba-Shammakh

Abstract:

Metal-organic frameworks (MOFs) are promising materials for CO₂ capture, as they have a high adsorption capacity towards CO₂. In this study, two different metal-organic frameworks (MIL-101 and ZIF-301) were tested for different flue gases with different CO₂ fractions. In addition, the effect of temperature was investigated for MIL-101 and ZIF-301. The results show that MIL-101 performs well for a pure CO₂ stream, while its intake decreases dramatically for flue gases with CO₂ fractions ranging from 5 to 15%. The second material (ZIF-301) showed better results for all flue gases and a higher CO₂ intake compared to MIL-101, even at high temperature.

Keywords: CO₂ capture, Metal Organic Frameworks (MOFs), MIL-101, ZIF-301

Procedia PDF Downloads 185
24905 Evaluating the Effect of Climate Change and Land Use/Cover Change on Catchment Hydrology of Gumara Watershed, Upper Blue Nile Basin, Ethiopia

Authors: Gashaw Gismu Chakilu

Abstract:

Climate and land cover change are very important issues in the global context, together with their responses to environmental and socio-economic drivers. The dynamics of these two factors are currently affecting the environment, including watershed hydrology, in an unbalanced way. In this paper, the individual and combined impacts of climate change and land use land cover change on hydrological processes were evaluated by applying the Soil and Water Assessment Tool (SWAT) model in the Gumara watershed, Upper Blue Nile basin, Ethiopia. The regional climate data (temperature and rainfall) of the past 40 years in the study area were prepared, and changes were detected by trend analysis applying the Mann-Kendall trend test. The land use land cover data were obtained from Landsat images and processed with ERDAS IMAGINE 2010 software. Three land use land cover datasets (1973, 1986, and 2013) were prepared and used for the baseline, model calibration, and change study, respectively. The effects of these changes on the high flow and low flow of the catchment were also evaluated separately. The high flow of the catchment for these two decades was analyzed using the Annual Maximum (AM) model, and the low flow was evaluated using the seven-day sustained low flow model. Both temperature and rainfall showed increasing trends; the extent of the changes was then evaluated on a monthly basis using two decadal time periods, with 1973-1982 taken as the baseline and 2004-2013 used for the change study. The efficiency of the model was determined by the Nash-Sutcliffe (NS) coefficient and the Relative Volume error (RVe); their values were 0.65 and 0.032 for calibration and 0.62 and 0.0051 for validation, respectively. The impact of climate change on the stream flow of the catchment was higher than that of land use land cover change; the flow has been increasing by 16.86% and 7.25% due to climate and LULC change, respectively, and the combined change effect accounted for a 22.13% flow increment. The overall results of the study indicated that climate change is more responsible for high flow than for low flow, whereas land use land cover change showed a more significant effect on low flow than on high flow of the catchment. From these results, we conclude that the hydrology of the catchment has been altered because of changes in the climate and land cover of the study area.
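
Two of the evaluation steps mentioned can be sketched in a few lines on synthetic series (not the Gumara data): a Kendall's-tau trend test against time, which is the correlation form of the Mann-Kendall test, and the Nash-Sutcliffe efficiency and relative volume error used to judge the SWAT calibration.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(3)

# --- Mann-Kendall-type trend detection on 40 years of annual rainfall (synthetic) ---
years = np.arange(1974, 2014)
rainfall = 1200 + 2.5 * (years - years[0]) + rng.normal(0, 60, size=years.size)
tau, p_value = kendalltau(years, rainfall)     # monotonic-trend test against time
print(f"tau={tau:.2f}, p={p_value:.4f}")       # small p -> significant increasing trend

# --- model-efficiency criteria for simulated vs. observed stream flow ---
def nash_sutcliffe(obs, sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_volume_error(obs, sim):
    return (np.sum(sim) - np.sum(obs)) / np.sum(obs)

observed = rng.gamma(2.0, 15.0, size=365)                  # daily flows, m3/s (synthetic)
simulated = observed * 1.03 + rng.normal(0, 3, size=365)   # a hypothetical model run
print(round(nash_sutcliffe(observed, simulated), 2),
      round(relative_volume_error(observed, simulated), 3))
```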

Keywords: climate, LULC, SWAT, Ethiopia

Procedia PDF Downloads 372
24904 Recovery of Draw Solution in Forward Osmosis by Direct Contact Membrane Distillation

Authors: Su-Thing Ho, Shiao-Shing Chen, Hung-Te Hsu, Saikat Sinha Ray

Abstract:

Forward osmosis (FO) is an emerging technology for direct and indirect potable water reuse applications. However, successful implementation of FO is still hindered by the lack of highly efficient draw solution recovery. Membrane distillation (MD) is a thermal separation process using a hydrophobic microporous membrane kept in sandwich mode between a warm feed stream and a cold permeate stream. Typically, the temperature difference is the driving force of MD, which is attributed to the partial vapor pressure difference across the membrane. In this study, a direct contact membrane distillation (DCMD) system was used to recover the diluted draw solution of FO. Na3PO4 at pH 9 and EDTA-2Na at pH 8 were used as the feed solutions for MD, since they produce high water flux and minimal salt leakage in the FO process. At high pH, trivalent and tetravalent ions are much more likely to remain on the draw solution side in the FO process. The results demonstrated that PTFE with a pore size of 1 μm achieved the highest water flux (12.02 L/m²h), followed by PTFE 0.45 μm (10.05 L/m²h), PTFE 0.1 μm (7.38 L/m²h) and then PP (7.17 L/m²h) while using 0.1 M Na3PO4 draw solute. The phosphate concentration and conductivity in the PTFE (0.45 μm) permeate were as low as 1.05 mg/L and 2.89 μS/cm, respectively. Although PTFE with a pore size of 1 μm obtained the highest water flux, the phosphate concentration in its permeate was higher than for the other MD membranes. This study indicated that all four kinds of MD membranes performed well, and PTFE with a pore size of 0.45 μm was the best among the tested membranes for achieving high water flux and high rejection of phosphate (99.99%) in the recovery of the diluted draw solution. The results also demonstrate that high water flux and high rejection of phosphate can be obtained when operating with a cross-flow velocity of 0.103 m/s, a feed temperature of 60 ℃ and a distillate temperature of 20 ℃. In addition, the results show that Na3PO4 is more suitable for recovery than EDTA-2Na, and that high-purity permeate water is obtained while recovering the diluted Na3PO4. The overall performance indicates that DCMD is a promising technology for recovering the diluted draw solution in the FO process.

Keywords: membrane distillation, forward osmosis, draw solution, recovery

Procedia PDF Downloads 180
24903 Development of an Asset Database to Enhance the Circular Business Models for the European Solar Industry: A Design Science Research Approach

Authors: Ässia Boukhatmi, Roger Nyffenegger

Abstract:

The expansion of solar energy as a means to address the climate crisis is undisputed, but the increasing number of new photovoltaic (PV) modules being put on the market is simultaneously leading to increased challenges in terms of managing the growing waste stream. Many of the discarded modules are still fully functional but are often damaged by improper handling after disassembly or not properly tested to be considered for a second life. In addition, the collection rate for dismantled PV modules in several European countries is only a fraction of previous projections, partly due to the increased number of illegal exports. The underlying problem for those market imperfections is an insufficient data exchange between the different actors along the PV value chain, as well as the limited traceability of PV panels during their lifetime. As part of the Horizon 2020 project CIRCUSOL, an asset database prototype was developed to tackle the described problems. In an iterative process applying the design science research methodology, different business models, as well as the technical implementation of the database, were established and evaluated. To explore the requirements of different stakeholders for the development of the database, surveys and in-depth interviews were conducted with various representatives of the solar industry. The proposed database prototype maps the entire value chain of PV modules, beginning with the digital product passport, which provides information about materials and components contained in every module. Product-related information can then be expanded with performance data of existing installations. This information forms the basis for the application of data analysis methods to forecast the appropriate end-of-life strategy, as well as the circular economy potential of PV modules, already before they arrive at the recycling facility. The database prototype could already be enriched with data from different data sources along the value chain. From a business model perspective, the database offers opportunities both in the area of reuse as well as with regard to the certification of sustainable modules. Here, participating actors have the opportunity to differentiate their business and exploit new revenue streams. Future research can apply this approach to further industry and product sectors, validate the database prototype in a practical context, and can serve as a basis for standardization efforts to strengthen the circular economy.

Keywords: business model, circular economy, database, design science research, solar industry

Procedia PDF Downloads 114
24902 Implementation of an IoT Sensor Data Collection and Analysis Library

Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee

Abstract:

Due to the development of information technology and wireless Internet technology, various data are being generated in various fields. These data are advantageous in that they provide real-time information to the users themselves. However, when the data are accumulated and analyzed, more diverse information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and sensor data can be collected directly using database application tools such as MySQL. These directly collected data can be used for various research purposes and can be useful for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
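
A hedged sketch of what such a library can wrap for a non-programmer: incoming sensor readings are stored in a local SQLite database (standing in for MySQL here) and then clustered with DBSCAN, one of the algorithms listed in the keywords. The table layout, column names, and sample values are invented for illustration and are not the authors' library API.

```python
import sqlite3
import numpy as np
from sklearn.cluster import DBSCAN

# --- collection: a tiny store for readings (SQLite stands in for MySQL here) ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (ts INTEGER, temperature REAL, humidity REAL)")
samples = [(1, 21.3, 40.1), (2, 21.5, 41.0), (3, 21.4, 40.6),
           (4, 35.2, 18.3), (5, 35.0, 19.1), (6, 22.0, 80.0)]   # last row: an outlier
db.executemany("INSERT INTO readings VALUES (?, ?, ?)", samples)

# --- analysis: pull the readings back and cluster them with DBSCAN ---
rows = db.execute("SELECT temperature, humidity FROM readings").fetchall()
X = np.array(rows)
labels = DBSCAN(eps=3.0, min_samples=2).fit_predict(X)
print(labels)    # two clusters of normal readings; label -1 marks the noisy outlier
```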

Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data

Procedia PDF Downloads 370
24901 Investigating the Efficiency of Granular Sludge for Recovery of Phosphate from Wastewater

Authors: Sara Salehi, Ka Yu Cheng, Anna Heitz, Maneesha Ginige

Abstract:

This study investigated the efficiency of granular sludge for phosphorus (P) recovery from wastewater. A laboratory-scale sequencing batch reactor (SBR) was operated under alternating aerobic/anaerobic conditions to enrich a P-accumulating granular biomass. The study showed that an overall 45-fold increase in P concentration could be achieved by reducing the volume of the P-capturing liquor 5-fold in the anaerobic P release phase. Moreover, different fractions of the granular biomass made different individual contributions towards generating a concentrated stream of P.

Keywords: granular sludge, PAOs, P recovery, SBR

Procedia PDF Downloads 474
24900 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting services to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we drew ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 154
24899 Application of Microbially Induced Calcite Precipitation Technology in Construction Materials: A Comprehensive Review of Waste Stream Contributions

Authors: Amir Sina Fouladi, Arul Arulrajah, Jian Chu, Suksun Horpibulsuk

Abstract:

Waste generation is a growing concern in many countries across the world, particularly in urban areas with high rates of population growth and industrialization. The increasing amount of waste generated from human activities has led to environmental, economic, and health issues. Improper disposal of waste can result in air and water pollution, land degradation, and the spread of diseases. Waste generation also consumes large amounts of natural resources and energy, leading to the depletion of valuable resources and contributing to greenhouse gas emissions. To address these concerns, there is a need for sustainable waste management practices that reduce waste generation and promote resource recovery and recycling. Among these, developing innovative technologies such as Microbially Induced Calcite Precipitation (MICP) in construction materials is an effective approach to transforming waste into valuable and sustainable applications. MICP is an environmentally friendly microbial-chemical technology that applies microorganisms and chemical reagents in biological processes to produce carbonate minerals. This substance can be an energy-efficient, cost-effective, and sustainable solution to environmental and engineering challenges. Recent research has shown that waste streams can replace several MICP chemical components in the cultivation media of the microorganisms and in the cementation reagents (calcium sources and urea). In addition to its effectiveness in treating hazardous waste streams, MICP has been found to be a cost-effective and sustainable solution applicable to various waste media. This comprehensive review paper aims to provide a thorough understanding of the environmental advantages and engineering applications of MICP technology, with a focus on the contribution of waste streams. It also provides researchers with guidance on how to identify and overcome the challenges that may arise when applying MICP technology using waste streams.

Keywords: waste stream, microbially induced calcite precipitation, construction materials, sustainability

Procedia PDF Downloads 74
24898 In-Situ Synthesis of Zinc-Containing MCM-41 and Investigation of Its Capacity for Removal of Hydrogen Sulfide from Crude Oil

Authors: Nastaran Hazrati, Ali Akbar Miran Beigi, Majid Abdouss, Amir Vahid

Abstract:

Hydrogen sulfide is the most toxic gas in crude oil. Adsorption is an energy-efficient process used to remove undesirable compounds such as H2S from gas or liquid streams by passing the stream through a media bed composed of an adsorbent. In this study, H2S from Iranian crude oil was separated via cold stripping, and zinc-incorporated MCM-41 was then synthesized via an in-situ method. The ZnO-functionalized mesoporous silica samples were characterized by XRD, N2 adsorption and TEM. The H2S adsorption results showed the superior ability of all the materials, and adsorption increased with increasing ZnO content.

Keywords: MCM-41, ZnO, H2S removal, adsorption

Procedia PDF Downloads 456
24897 Bio-Electro Chemical Catalysis: Redox Interactions, Storm and Waste Water Treatment

Authors: Michael Radwan Omary

Abstract:

Context: This scientific innovation demonstrates effective desalination of surface water and groundwater using organic-catalysis engineered media. The author has developed a technology called “Storm-Water Ions Filtration Treatment” (SWIFT™): cold reactor modules designed to retrofit typical urban street storm drains or catch basins. SWIFT triggers biochemical redox reactions with the toxic total dissolved solids (TDS) and electrical conductivity (EC) embedded in the water stream. SWIFT™ catalyst media unlock the sub-molecular bond energy, break down toxic chemical bonds, and neutralize toxic molecules, bacteria and pathogens. Research Aim: This research aims to develop and design lower O&M cost, zero-brine discharge, energy-input-free, chemical-free water desalination and disinfection systems. The objective is to provide an effective, resilient and sustainable solution to urban storm-water and groundwater decontamination and disinfection. Methodology: We focused on the development of organic, non-chemical, no-plug, no-pumping, non-polymer and non-allergenic approaches for water and waste water desalination and disinfection. SWIFT modules operate by directing the water stream to flow freely through the electrically charged media cold reactor, generating weak interactions with water-dissolved electrically conductive molecules and resulting in the neutralization of toxic molecules. The system is powered by harvesting the energy embedded in sub-molecular bonds. Findings: The SWIFT™ technology case studies at CSU-CI and the CSU-Fresno Water Institute demonstrated consistently high reductions of all 40 detected waste-water pollutants, including pathogens, to levels below the State of California Department of Water Resources “Drinking Water Maximum Contaminant Levels”. The technology has proved effective in reducing pollutants such as arsenic, beryllium, mercury, selenium, glyphosate, benzene, and E. coli bacteria. The technology has also been successfully applied to the decontamination of dissolved chemicals, water pathogens, organic compounds and radiological agents. Theoretical Importance: SWIFT technology development, design, engineering, and manufacturing offer a cutting-edge advancement towards a clean-energy, biocatalysis-media solution for energy-input-free water and waste water desalination and disinfection, and a significant contribution to institutions and municipalities seeking sustainable, lower-cost, zero-brine, zero-CO2-discharge, clean-energy water desalination. Data Collection and Analysis Procedures: The researchers collected data on the performance of the SWIFT™ technology in reducing the levels of various pollutants in water. The data were analyzed by comparing the reductions achieved by the SWIFT™ technology to the Drinking Water Maximum Contaminant Levels set by the State of California. The researchers also conducted live oral presentations to showcase the applications of SWIFT™ technology in storm-water capture and decontamination, as well as in providing clean drinking water during emergencies. Conclusion: The SWIFT™ technology has demonstrated its capability to effectively reduce pollutants in water and waste water to levels below regulatory standards. The technology offers a sustainable solution to groundwater and storm-water treatment. Further development and implementation of the SWIFT™ technology have the potential to treat storm water for reuse as a new source of drinking water and as an ambient source of clean and healthy local water for groundwater recharge.

Keywords: catalysis, bio electro interactions, water desalination, weak-interactions

Procedia PDF Downloads 60