Search results for: Building energy efficiency
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5691


2391 Remediation of Petroleum Hydrocarbon-contaminated Soil Slurry by Fenton Oxidation

Authors: C. Pongcharoen, K. Kaiyavong and T. Satapanajaru

Abstract:

The objective of this study was to evaluate the optimal treatment conditions of the Fenton oxidation process for removing contaminants from soil slurry contaminated by petroleum hydrocarbons. This research studied several factors that affect the removal efficiency of petroleum hydrocarbons in soil slurry, including the molar ratio of hydrogen peroxide (H2O2) to ferrous ion (Fe2+), pH and reaction time. The results demonstrated that the optimum conditions were a H2O2:Fe2+ molar ratio of 200:1 and a pH of 4.0; the reaction rate increased rapidly from the starting point to the 7th hour, with a destruction kinetic rate (k) of 0.24 h-1. Approximately 96% removal of petroleum hydrocarbon was observed (initial total petroleum hydrocarbon (TPH) concentration = 70±7 g kg-1).
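The abstract reports first-order destruction kinetics with k = 0.24 h-1. As a rough illustration of what that rate constant implies, here is a sketch using the standard first-order decay law C(t) = C0·exp(-k·t); it is not the authors' model or data pipeline:

```python
import math

def remaining_fraction(k, t):
    """Fraction of contaminant remaining after t hours under
    first-order decay, C(t) = C0 * exp(-k * t)."""
    return math.exp(-k * t)

k = 0.24   # destruction rate constant from the abstract, h^-1
c0 = 70.0  # initial TPH concentration from the abstract, g/kg

for t in (1, 7, 14):
    removed = 1 - remaining_fraction(k, t)
    print(f"t = {t:2d} h: {100 * removed:5.1f}% removed, "
          f"{c0 * (1 - removed):5.1f} g/kg remaining")
```

Under pure first-order decay at this rate, 96% removal corresponds to t = ln(0.04)/(-k), roughly 13.4 h.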

Keywords: Contaminated soil, Fenton oxidation, Petroleum hydrocarbon, Remediation.

2390 Thermal Properties of the Ground in Cyprus and Their Correlations and Effect on the Efficiency of Ground Heat Exchangers

Authors: G. A. Florides, E. Theofanous, I. Iosif-Stylianou, P. Christodoulides, S. Kalogirou, V. Messarites, Z. Zomeni, E. Tsiolakis, P. D. Pouloupatis, G. P. Panayiotou

Abstract:

Ground Coupled Heat Pumps (GCHPs) exploit effectively the heat capacity of the ground, with the use of Ground Heat Exchangers (GHE). Depending on the mode of operation of the GCHPs, GHEs dissipate or absorb heat from the ground. For sizing the GHE the thermal properties of the ground need to be known. This paper gives information about the density, thermal conductivity, specific heat and thermal diffusivity of various lithologies encountered in Cyprus with various relations between these properties being examined through comparison and modeling. The results show that the most important correlation is the one encountered between thermal conductivity and thermal diffusivity with both properties showing similar response to the inlet and outlet flow temperature of vertical and horizontal heat exchangers.
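The correlation the authors highlight between thermal conductivity and thermal diffusivity follows from the defining relation α = k/(ρ·cp). A minimal sketch with assumed illustrative values, not measurements from the paper:

```python
def thermal_diffusivity(k, rho, cp):
    """Thermal diffusivity alpha = k / (rho * cp), in m^2/s, from
    conductivity k [W/(m K)], density rho [kg/m^3] and specific
    heat cp [J/(kg K)]."""
    return k / (rho * cp)

# Illustrative values for a generic limestone-like rock (assumed,
# not data from the paper):
k, rho, cp = 2.5, 2600.0, 900.0
print(f"alpha = {thermal_diffusivity(k, rho, cp):.2e} m^2/s")
```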

Keywords: Ground heat exchangers, ground thermal conductivity, ground thermal diffusivity, ground thermal properties.

2389 Structural Reliability of Existing Structures: A Case Study

Authors: Z. Sakka, I. Assakkaf, T. Al-Yaqoub, J. Parol

Abstract:

A reliability-based methodology for the assessment and evaluation of reinforced concrete (R/C) structural elements of concrete structures is presented herein. The results of the reliability analysis and assessment of the R/C structural elements were verified against results obtained through deterministic methods. The outcomes of the reliability-based analysis were compared against the currently adopted safety limits, expressed through reliability indices β, according to international standards and codes. The methodology is based on probabilistic analysis using reliability concepts and the statistics of the main random variables relevant to the subject matter, which are used in the performance-function equation(s) associated with the structural elements under study. These techniques yield the reliability index β, a measure that can be used to assess and evaluate the safety, human risk, and functionality of a structural component. They can also produce revised partial safety factors for certain target reliability indices, which can be used for redesigning the R/C elements of the building and can assist in considering other remedial actions to improve the safety and functionality of the member.
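The reliability index can be illustrated with a crude Monte Carlo sketch for a toy limit state g = R - S (resistance minus load); the distributions below are assumed for illustration only and are not the paper's data:

```python
import random
from statistics import NormalDist

def reliability_index(n=200_000, seed=1):
    """Monte Carlo estimate of the reliability index beta for an
    illustrative limit state g = R - S, with R ~ N(30, 4) and
    S ~ N(20, 3) (assumed values, not the paper's data).
    beta = -inv_cdf(Pf) for the standard normal distribution."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(30, 4) - rng.gauss(20, 3) < 0
    )
    pf = failures / n
    return -NormalDist().inv_cdf(pf), pf

beta, pf = reliability_index()
print(f"estimated Pf = {pf:.4f}, beta = {beta:.2f}")
```

For these assumed normals, g ~ N(10, 5), so the analytic value is beta = 2.0; the simulation should land close to it.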

Keywords: Concrete Structures, FORM, Monte Carlo Simulation, Structural Reliability.

2388 Effect of Influent COD on Biological Ammonia Removal Efficiency

Authors: S. H. Mirhossaini, H. Godini, A. Jafari

Abstract:

Biological ammonia removal (nitrification), the oxidation of ammonia to nitrate catalyzed by bacteria, is a key part of global nitrogen cycling. In the first step of nitrification, chemolithoautotrophic ammonia oxidizers transform ammonia to nitrite, which is subsequently oxidized to nitrate by nitrite-oxidizing bacteria. This process can be affected by several factors. In this study, the effect of influent COD on biological ammonia removal in a bench-scale biological reactor was investigated. Experiments were carried out using synthetic wastewater. The initial ammonium concentration was 25 mg NH4+-N L-1. The effect of COD between 247.55±1.8 and 601.08±3.24 mg L-1 on biological ammonia removal was investigated by varying the COD loading supplied to the reactor. From the results obtained in this study, it can be concluded that in the range of 247.55±1.8 to 351.35±2.05 mg L-1 there is a direct relationship between the amount of COD and ammonia removal, whereas from 351.35±2.05 up to 601.08±3.24 mg L-1 an inverse relationship was found.

Keywords: Ammonia biological removal, Nitrification, Influent COD.

2387 Modular Workflow System for HPC Applications

Authors: Y. Yudin, T. Krasikova, Y. Dorozhko, N. Currle-Linde

Abstract:

Nowadays, HPC, Grid and Cloud systems are evolving very rapidly. However, the development of infrastructure solutions related to HPC is lagging behind. While the existing infrastructure is sufficient for simple cases, many computational problems have more complex requirements. Such computational experiments use different resources simultaneously to start a large number of computational jobs. These resources are heterogeneous: they have different purposes, architectures, performance and installed software. Users need a convenient tool that allows them to describe and run complex computational experiments in an HPC environment. This paper introduces a modular workflow system called SEGL, which makes it possible to run complex computational experiments in a real HPC organization. The system can be used in a great number of organizations which provide HPC power. Significant requirements for this system are high efficiency and interoperability with the organization's existing HPC infrastructure without any changes.

Keywords: HPC, Molecular Dynamics, Workflow Languages, Workflow Management.

2386 Comparison between Beta Wavelets Neural Networks, RBF Neural Networks and Polynomial Approximation for 1D, 2D Functions Approximation

Authors: Wajdi Bellil, Chokri Ben Amar, Adel M. Alimi

Abstract:

This paper proposes a comparison between wavelet neural networks (WNN), RBF neural networks and polynomial approximation in terms of 1-D and 2-D function approximation. We present a novel wavelet neural network, based on Beta wavelets, for 1-D and 2-D function approximation. Our purpose is to approximate an unknown function f: Rn → R from scattered samples (xi, yi = f(xi)), i = 1, ..., n, where, first, we have little a priori knowledge of the unknown function f: it lives in some infinite-dimensional smooth function space; and second, the function approximation process is performed iteratively: each new measurement of the function (xi, f(xi)) is used to compute a new estimate of f as an approximation of the function f. Simulation results are presented to validate the generalization ability and efficiency of the proposed Beta wavelet network.
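As a point of reference for the comparison, a plain Gaussian RBF interpolant of a 1-D function can be built in a few lines. This is a generic RBF baseline sketch, not the Beta wavelet network proposed in the paper; the sample points and test function are invented for illustration:

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, width=1.0):
    """Interpolate scattered samples (x_i, y_i) with Gaussian RBFs
    centred on the samples; returns a callable approximant."""
    phi = lambda r: math.exp(-(r / width) ** 2)
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    w = gauss_solve(A, ys)
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))

# Approximate f(x) = sin(x) from 7 scattered samples.
xs = [0.0, 0.7, 1.5, 2.3, 3.1, 4.2, 5.0]
ys = [math.sin(x) for x in xs]
f_hat = rbf_fit(xs, ys)
print(f"f_hat(2.0) = {f_hat(2.0):.3f}  (sin(2.0) = {math.sin(2.0):.3f})")
```

The interpolant reproduces the samples exactly and approximates the function between them; wavelet networks target the same task with multi-resolution basis functions.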

Keywords: Beta wavelet networks, RBF neural network, training algorithms, MSE, 1-D, 2-D function approximation.

2385 The Optimal Design for Grip Force of Material Handling

Authors: V. Tawiwat, S. Sarawut

Abstract:

A mouse's roller is combined with a gripper to increase efficiency, so that the gripper can handle material without slipping. Following an optimization principle, the gripper uses a signal to check whether the roller rotates: if the roller rotates, the material is slipping. The gripper then adjusts its grip on the material until the roller stops rotating. The experiment compares the grip force used in material handling by 10 human subjects with that of the proposed gripper. We conclude that humans exert more force on the material than the proposed gripper, since the gripper can apply a force closer to the minimum needed to lift the material without slipping.

Keywords: Optimization, Gripper, Mouse's Roller, Minimum Force.

2384 Estimating Regression Effects in Com Poisson Generalized Linear Model

Authors: Vandna Jowaheer, Naushad A. Mamode Khan

Abstract:

The Com-Poisson distribution is capable of modeling count responses irrespective of their mean-variance relation, and when it is fitted to simple cross-sectional data its parameters can be efficiently estimated using the maximum likelihood (ML) method. In the regression setup, however, ML estimation of the parameters of the Com-Poisson based generalized linear model is computationally intensive. In this paper, we propose to use the quasi-likelihood (QL) approach to estimate the effect of the covariates on the Com-Poisson counts and investigate the performance of this method with respect to the ML method. QL estimates are consistent and almost as efficient as ML estimates. The simulation studies show that the efficiency loss in the estimation of all the parameters using the QL approach as compared to the ML approach is quite negligible, whereas the QL approach is less involved than the ML approach.
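For reference, the Com-Poisson probability mass function underlying the model can be evaluated directly. This is a sketch of the pmf only, with illustrative parameter values; the QL estimation procedure itself is not reproduced here:

```python
import math

def com_poisson_pmf(y, lam, nu, terms=50):
    """P(Y = y) for the Com-Poisson distribution:
    P(Y = y) = lam**y / (y!)**nu / Z(lam, nu),
    where the normalising series Z is truncated at `terms`."""
    z = sum(lam ** j / math.factorial(j) ** nu for j in range(terms))
    return lam ** y / math.factorial(y) ** nu / z

# nu = 1 recovers the ordinary Poisson distribution;
# nu > 1 gives under-dispersion, nu < 1 over-dispersion.
lam = 3.0
print(f"Com-Poisson (nu=1) P(Y=2) = {com_poisson_pmf(2, lam, 1.0):.6f}")
print(f"Poisson            P(Y=2) = {math.exp(-lam) * lam**2 / 2:.6f}")
```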

Keywords: Com-Poisson, Cross-sectional, Maximum likelihood, Quasi-likelihood.

2383 Automatically Building an Extensive Arabic FA Terms Dictionary

Authors: El-Sayed Atlam, Masao Fuketa, Kazuhiro Morita, Jun-ichi Aoe

Abstract:

Field Association (FA) terms are a limited set of discriminating terms that give us the knowledge to identify document fields, which is effective in document classification, similar-file retrieval and passage retrieval. However, there is no effective method to automatically extract relevant Arabic FA terms to build a comprehensive dictionary. Moreover, all previous studies are based on FA terms in English and Japanese, and the extension of FA terms to other languages such as Arabic could definitely strengthen further research. This paper presents a new method to extract Arabic FA terms from domain-specific corpora using part-of-speech (POS) pattern rules and corpora comparison. Experimental evaluation was carried out for 14 different fields using 251 MB of domain-specific corpora obtained from Arabic Wikipedia dumps and Alhyah news, selecting an average of 2,825 FA terms (single and compound) per field. From the experimental results, recall and precision are 84% and 79%, respectively. Therefore, this method selects a higher number of relevant Arabic FA terms at high precision and recall.
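The reported recall and precision follow the standard set-based definitions. A toy sketch with hypothetical term sets, not the study's data:

```python
def precision_recall(retrieved, relevant):
    """Precision and recall for a set of extracted terms against
    a gold-standard set of relevant terms."""
    tp = len(retrieved & relevant)
    return tp / len(retrieved), tp / len(relevant)

# Hypothetical candidate FA terms for an "economics" field
# (invented for illustration, not the paper's data):
retrieved = {"economy", "bank", "loan", "market", "goal"}
relevant = {"economy", "bank", "loan", "market", "stock", "trade"}
p, r = precision_recall(retrieved, relevant)
print(f"precision = {p:.2f}, recall = {r:.2f}")
```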

Keywords: Arabic Field Association Terms, information extraction, document classification, information retrieval.

2382 An In-Depth Analysis of Open Data Portals as an Emerging Public E-Service

Authors: Martin Lnenicka

Abstract:

Governments collect and produce large amounts of data. Increasingly, governments worldwide have started to implement open data initiatives and to launch open data portals to enable the release of these data in open and reusable formats. Therefore, a large number of open data repositories, catalogues and portals have been emerging around the world. The greater availability of interoperable and linkable open government data catalyzes secondary use of such data, so they can be used to build useful applications which leverage their value, allow insight, provide access to government services, and support transparency. The efficient development of successful open data portals makes it necessary to evaluate them systematically, in order to understand them better, assess the various types of value they generate, and identify the improvements required to increase this value. Thus, the attention of this paper is directed particularly to the field of open data portals. The main aim of this paper is to compare selected national open data portals using content analysis and to propose a new evaluation framework, which further improves the quality of these portals. It also establishes a set of considerations for involving businesses and citizens in creating e-services and applications that leverage the datasets available from these portals.

Keywords: Big data, content analysis, criteria comparison, data quality, open data, open data portals, public sector.

2381 Reduction of Impulsive Noise in OFDM System Using Adaptive Algorithm

Authors: Alina Mirza, Sumrin M. Kabir, Shahzad A. Sheikh

Abstract:

Orthogonal Frequency Division Multiplexing (OFDM), with its high data rate, high spectral efficiency and ability to mitigate the effects of multipath, is most suitable for wireless applications. Impulsive noise distorts OFDM transmission, and therefore methods must be investigated to suppress this noise. In this paper, a State Space Recursive Least Squares (SSRLS) algorithm based adaptive impulsive noise suppressor for OFDM communication systems is proposed, and a comparison with another adaptive algorithm is conducted. The state-space model-dependent recursive parameters of the proposed scheme enable it to achieve a low steady-state mean squared error (MSE), a low bit error rate (BER), and faster convergence than some existing algorithms.

Keywords: OFDM, Impulsive Noise, SSRLS, BER.

2380 Current Developments in Flat-Plate Vacuum Solar Thermal Collectors

Authors: Farid Arya, Trevor Hyde, Paul Henshall, Phillip Eames, Roger Moss, Stan Shire

Abstract:

Vacuum flat plate solar thermal collectors offer several advantages over other collectors, namely the excellent optical and thermal characteristics they exhibit due to a combination of their wide surface area and high-vacuum thermal insulation. These characteristics suit a variety of industrial process heat applications as well as building integration, as the collectors are much thinner than conventional ones, making installation possible in limited spaces. However, many technical challenges still remain which need to be addressed to enable wide-scale adoption of the technology. This paper will discuss the challenges, expectations and requirements for flat-plate vacuum solar collector development. In addition, it will provide an overview of work undertaken at Ulster University, Loughborough University, and the University of Warwick on flat-plate vacuum solar thermal collectors. Finally, this paper will present a detailed experimental investigation of the development of a vacuum panel with a novel sealing method, which will be used to accommodate a novel slim hydroformed solar absorber.

Keywords: Hot box calorimeter, infrared thermography, solar thermal collector, vacuum insulation.

2379 Test Data Compression Using a Hybrid of Bitmask Dictionary and 2^n Pattern Run-Length Coding Methods

Authors: C. Kalamani, K. Paramasivam

Abstract:

In VLSI, testing plays an important role. The major problems in testing are test data volume and test power. An important solution to reduce test data volume and test time is test data compression. The proposed technique combines the bitmask-dictionary and 2^n pattern run-length coding methods and provides a substantial improvement in compression efficiency without introducing any additional decompression penalty. This method has been implemented using MATLAB and an HDL language to reduce test data volume and memory requirements. It was applied to various benchmark test sets and the results were compared with other existing methods. The proposed technique can achieve a compression ratio of up to 86%.
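As background, plain run-length coding of a test-data bit stream looks like this. This is a toy sketch of the run-length half only, with an invented bit string and an assumed fixed-width run-length field; the paper's bitmask-dictionary hybrid is more involved:

```python
from itertools import groupby

def run_length_encode(bits):
    """Encode a bit string as (symbol, run length) pairs."""
    return [(b, len(list(g))) for b, g in groupby(bits)]

def compression_ratio(bits, bits_per_run=6):
    """Saving when each run is stored as 1 symbol bit plus a
    fixed-width run-length field of `bits_per_run` bits."""
    encoded = len(run_length_encode(bits)) * (1 + bits_per_run)
    return 1 - encoded / len(bits)

test_data = "0" * 40 + "1" * 12 + "0" * 28
print(run_length_encode(test_data))  # [('0', 40), ('1', 12), ('0', 28)]
print(f"compression ratio = {compression_ratio(test_data):.0%}")
```

Long runs of identical bits, common in test cubes with don't-care filling, are exactly what makes run-length-based schemes effective.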

Keywords: Bitmask dictionary, 2^n pattern run-length code, system-on-chip (SoC), test data compression.

2378 Optimization of Slider Crank Mechanism Using Design of Experiments and Multi-Linear Regression

Authors: Galal Elkobrosy, Amr M. Abdelrazek, Bassuny M. Elsouhily, Mohamed E. Khidr

Abstract:

Crankshaft length, connecting rod length, crank angle, engine rpm, cylinder bore, piston mass and compression ratio are the inputs that control the performance of the slider crank mechanism and hence its efficiency. Several combinations of these seven inputs are used and compared. The engine torque predicted by the simulation is analyzed through two different regression models, with and without interaction terms, developed according to multi-linear regression using LU decomposition to solve the system of algebraic equations. These models are validated. A regression model in the seven inputs including their interaction terms lowered the polynomial degree from 3rd to 1st and produced valid predictions and stable explanations.
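The fitting step described above, multi-linear regression with the normal equations solved by LU decomposition, can be sketched as follows. The data are synthetic and illustrative, not the engine measurements used in the paper:

```python
def lu_solve(A, b):
    """Solve A x = b via Doolittle LU decomposition (no pivoting;
    adequate for the well-conditioned normal equations used here)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        L[i][i] = 1.0
        for j in range(i + 1, n):
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    # forward substitution L y = b, then back substitution U x = y
    y = []
    for i in range(n):
        y.append(b[i] - sum(L[i][k] * y[k] for k in range(i)))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x

def linear_regression(X, y):
    """Least-squares coefficients via the normal equations X'X b = X'y."""
    Xt = list(zip(*X))
    XtX = [[sum(a * b for a, b in zip(ri, rj)) for rj in Xt] for ri in Xt]
    Xty = [sum(a * b for a, b in zip(ri, y)) for ri in Xt]
    return lu_solve(XtX, Xty)

# Synthetic data generated exactly from y = 2 + 3*x1 - x2,
# so the fit recovers the coefficients.
X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 3]]
y = [2 + 3 * x1 - x2 for _, x1, x2 in X]
print(linear_regression(X, y))  # ~[2.0, 3.0, -1.0]
```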

Keywords: Design of experiments, regression analysis, SI Engine, statistical modeling.

2377 Designing an Irregular Tensegrity as a Monumental Object

Authors: Buntara Sthenly Gan

Abstract:

A novel and versatile numerical technique to solve a self-stress equilibrium state is adopted herein as a form-finding procedure for an irregular tensegrity structure. The numerical form-finding scheme uses only the connectivity matrix and a prototype tension coefficient vector as the initial guess solution. No information on symmetrical geometry or other predefined initial structural conditions is necessary to obtain a solution in the form-finding process. An eight-node initial condition example is presented to demonstrate the efficiency and robustness of the proposed method in the form-finding of an irregular tensegrity structure. Based on the form-finding of this eight-node irregular tensegrity structure, a monumental object is designed considering real-world conditions such as self-weight, wind and earthquake loadings.

Keywords: Tensegrity, Form-finding, Design, Irregular, Self-stress, Force density method.

2376 Prediction Compressive Strength of Self-Compacting Concrete Containing Fly Ash Using Fuzzy Logic Inference System

Authors: O. Belalia Douma, B. Boukhatem, M. Ghrici

Abstract:

Self-compacting concrete (SCC), developed in Japan in the late 1980s, has enabled the construction industry to reduce demand on resources, improve working conditions and also reduce environmental impact by eliminating the need for compaction. Fuzzy logic (FL) approaches have recently been used to model human activities in many areas of civil engineering, and these systems have produced very good results in modeling experimental studies. In the present study, a model for predicting the compressive strength of SCC containing various proportions of fly ash as partial replacement of cement has been developed using a Fuzzy Inference System (FIS). For the purpose of building this model, a database of experimental data was gathered from the literature and used for training and testing the model. The data used as inputs of the fuzzy logic model are arranged in a format of five parameters covering total binder content, fly ash replacement percentage, water content, superplasticizer and age of specimens. The training and testing results of the fuzzy logic model have shown a strong potential for predicting the compressive strength of SCC containing fly ash in the considered range.
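A single fuzzy inference step of the kind described can be sketched as follows. The membership functions, rules and consequent values below are invented for illustration (a zero-order Sugeno scheme with one input) and are not the paper's five-input rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_strength(fly_ash_pct):
    """Zero-order Sugeno inference with two illustrative rules
    (assumed memberships and consequents, not the paper's model):
      R1: IF fly ash replacement is LOW  THEN strength = 60 MPa
      R2: IF fly ash replacement is HIGH THEN strength = 40 MPa
    Output is the firing-strength-weighted average of the consequents."""
    w_low = tri(fly_ash_pct, -1, 0, 40)
    w_high = tri(fly_ash_pct, 10, 50, 51)
    return (w_low * 60 + w_high * 40) / (w_low + w_high)

for pct in (5, 20, 45):
    print(f"{pct}% fly ash -> {predict_strength(pct):.1f} MPa")
```

Predicted strength decreases smoothly as the fly ash replacement grows, blending the two rules in the overlap region.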

Keywords: Self-compacting concrete, fly ash, strength prediction, fuzzy logic.

2375 Design Calculation and Performance Testing of Heating Coil in Induction Surface Hardening Machine

Authors: Soe Sandar Aung, Han Phyo Wai, Nyein Nyein Soe

Abstract:

Induction hardening machines are utilized in industries which modify machine parts and tools that need to achieve high wear resistance. This paper describes the modeling of the induction heating process, the design of the inverter circuit and the results of induction surface hardening with the heating coil. In the design of the heating coil, the shape and the number of turns of the coil are very important design factors because they decide the overall operating performance of the induction heater, including resonant frequency, Q factor, efficiency and power factor. The performance is tested by experiments on a high-frequency induction hardening machine.
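The coil design quantities named above follow standard resonant-circuit relations. A minimal sketch with assumed tank values, not the paper's actual design:

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency of an LC tank: f0 = 1 / (2*pi*sqrt(L*C)), Hz."""
    return 1 / (2 * math.pi * math.sqrt(L * C))

def q_factor(L, C, R):
    """Quality factor of a series RLC circuit: Q = sqrt(L/C) / R."""
    return math.sqrt(L / C) / R

# Assumed illustrative tank values (not from the paper):
L, C, R = 20e-6, 2e-6, 0.05  # H, F, ohm
print(f"f0 = {resonant_frequency(L, C) / 1e3:.1f} kHz, "
      f"Q = {q_factor(L, C, R):.0f}")
```

Changing the coil turns changes L, which shifts both f0 and Q, which is why coil shape and turn count dominate the design.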

Keywords: Induction Heating, Resonant Circuit, Inverter Circuit, Coil Design, Induction Hardening Machine.

2374 An Improved Data Mining Method Applied to the Search of Relationship between Metabolic Syndrome and Lifestyles

Authors: Yi Chao Huang, Yu Ling Liao, Chiu Shuang Lin

Abstract:

A data cutting and sorting method (DCSM) is proposed to optimize the performance of data mining. DCSM reduces the calculation time by getting rid of redundant data during the data mining process. In addition, DCSM minimizes the computational units by splitting the database and by sorting data with support counts. In the process of searching for the relationship between metabolic syndrome and lifestyles with the health examination database of an electronics manufacturing company, DCSM demonstrates higher search efficiency than the traditional Apriori algorithm in tests with different support counts.
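The support-count step that DCSM optimizes can be sketched as follows; this is the candidate-counting core of Apriori-style mining, with hypothetical lifestyle records rather than the company's health examination data:

```python
from itertools import combinations

def support_counts(transactions, k, min_support):
    """Count k-itemsets across transactions and keep those meeting
    the minimum support, as in Apriori's candidate-pruning step."""
    counts = {}
    for t in transactions:
        for itemset in combinations(sorted(t), k):
            counts[itemset] = counts.get(itemset, 0) + 1
    return {s: c for s, c in counts.items() if c >= min_support}

# Toy lifestyle records (hypothetical, not the study's data):
records = [
    {"smoking", "low_exercise", "high_bmi"},
    {"smoking", "high_bmi"},
    {"low_exercise", "high_bmi"},
    {"smoking", "low_exercise", "high_bmi"},
]
print(support_counts(records, 2, min_support=3))
```

DCSM's contribution, per the abstract, is pruning the database and sorting by support counts so that far fewer candidates reach this counting stage.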

Keywords: Data mining, Data cutting and sorting method, Apriori algorithm, Metabolic syndrome

2373 A New OvS Approach in an Assembly Line Balancing Problem

Authors: P. Azimi, B. Behtoiy

Abstract:

One of the best-known techniques affecting the efficiency of a production line is the assembly line balancing (ALB) technique. This paper examines the balancing of the whole production line of a real auto glass manufacturer in three steps. In the first step, the processing time of each activity in the workstations is generated according to a practical approach. In the second step, the whole production process is simulated and the bottleneck stations are identified. Finally, in the third step, several improvement scenarios are generated to optimize the system throughput, and the best one is proposed. The main contribution of the current research is the proposed framework, which combines two well-known approaches: Assembly Line Balancing and Optimization via Simulation (OvS). The results show that the proposed framework can easily be applied in practical environments.

Keywords: Assembly line balancing problem, optimization via simulation, production planning.

2372 On One Application of Hybrid Methods For Solving Volterra Integral Equations

Authors: G.Mehdiyeva, V.Ibrahimov, M.Imanova

Abstract:

As is known, one of the priority research directions in the natural sciences is bringing applied branches of contemporary mathematics, such as approximate and numerical methods for solving integral equations, into practice. We face the solution of integral equations when studying many natural phenomena, and quadrature methods are mainly applied to solve them numerically. Taking into account some deficiencies of quadrature methods for finding the solution of integral equations, some scientists have suggested multistep methods with constant coefficients. Unlike those papers, here we consider the application of hybrid methods to the numerical solution of the Volterra integral equation. The efficiency of the suggested method is proved and a concrete method with accuracy order p = 4 is constructed. This method is more precise than the corresponding known methods.
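The quadrature methods that the abstract contrasts with can be made concrete: below is a textbook trapezoidal-rule solver for a second-kind Volterra equation, given only as background (it is not the hybrid method of the paper, and the test kernel is an illustrative choice):

```python
import math

def volterra_trapezoid(f, K, x_max, n):
    """Solve the second-kind Volterra integral equation
        y(x) = f(x) + integral_0^x K(x, t) y(t) dt
    on [0, x_max] with the trapezoidal quadrature rule on n steps."""
    h = x_max / n
    xs = [i * h for i in range(n + 1)]
    ys = [f(xs[0])]
    for m in range(1, n + 1):
        s = 0.5 * K(xs[m], xs[0]) * ys[0]
        s += sum(K(xs[m], xs[i]) * ys[i] for i in range(1, m))
        rhs = f(xs[m]) + h * s
        # the unknown y_m also appears inside the quadrature; solve for it
        ys.append(rhs / (1 - 0.5 * h * K(xs[m], xs[m])))
    return xs, ys

# y(x) = x + int_0^x (x - t) y(t) dt has exact solution y = sinh(x)
xs, ys = volterra_trapezoid(lambda x: x, lambda x, t: x - t, 1.0, 100)
print(f"y(1) = {ys[-1]:.5f}, sinh(1) = {math.sinh(1):.5f}")
```

The trapezoidal rule converges at order 2; the hybrid method constructed in the paper reaches order p = 4.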

Keywords: Volterra integral equation, hybrid methods, stability and degree, methods of quadrature

2371 Artificial Intelligence Applications in Aggregate Quarries: A Reality

Authors: J. E. Ortiz, P. Plaza, J. Herrero, I. Cabria, J. L. Blanco, J. Gavilanes, J. I. Escavy, I. López-Cilla, V. Yagüe, C. Pérez, S. Rodríguez, J. Rico, C. Serrano, J. Bernat

Abstract:

The development of Artificial Intelligence services in mining processes, specifically in aggregate quarries, is facilitating automation and improving numerous aspects of operations. Ultimately, AI is transforming the mining industry by improving efficiency, safety and sustainability. With the ability to analyze large amounts of data and make autonomous decisions, AI offers great opportunities to optimize mining operations and maximize the economic and social benefits of this vital industry. Within the framework of the European DIGIECOQUARRY project, various services were developed for automatic identification of material quality, production estimation, anomaly detection, and prediction of consumption and production, with good results.

Keywords: Aggregates, artificial intelligence, automatization, mining operations.

2370 The Effect of Confinement Shapes on Over-Reinforced HSC Beams

Authors: Ross Jeffry, Muhammad N. S. Hadi

Abstract:

High strength concrete (HSC) provides high strength but lower ductility than normal strength concrete. This low ductility limits the benefit of using HSC in building safe structures. On the other hand, when designing reinforced concrete beams, designers have to limit the amount of tensile reinforcement to prevent the brittle failure of concrete, so the full potential of the steel reinforcement cannot be achieved. This paper presents the idea of confining concrete in the compression zone so that the HSC is in a state of triaxial compression, which leads to improvements in strength and ductility. Five beams made of HSC were cast and tested. The cross section of the beams was 200×300 mm, with a length of 4 m and a clear span of 3.6 m, subjected to four-point loading, with emphasis placed on the midspan deflection. The first beam served as a reference beam. The remaining beams had different tensile reinforcement, and the confinement shapes were changed to gauge their effectiveness in improving the strength and ductility of the beams. The compressive strength of the concrete was 85 MPa; the tensile strength of the steel was 500 MPa for the main reinforcement and 250 MPa for the stirrups and helixes. The results of testing the five beams proved that placing helixes with different diameters as a variable parameter in the compression zone of reinforced concrete beams improves their strength and ductility.

Keywords: Confinement, ductility, high strength concrete, reinforced concrete beam.

2369 An Implementation of EURORADIO Protocol for ERTMS Systems

Authors: Gabriele Cecchetti, Anna Lina Ruscelli, Filippo Cugini, Piero Castoldi

Abstract:

The European Rail Traffic Management System (ERTMS) is the European reference for interoperable and safer signaling systems to efficiently manage running trains. If implemented, it allows trains to cross intra-European national borders seamlessly. ERTMS has defined a secure communication protocol, EURORADIO, based on open communication networks. Its Radio Infill function can improve the reaction of the signaling system to changes in line conditions, avoiding unnecessary braking; its advantages in terms of power saving and travel time have been analyzed. In this paper, a software implementation of the EURORADIO protocol with Radio Infill for ERTMS Level 1 using GSM-R is illustrated as part of the SR-Secure Italian project. In this building-block architecture the EURORADIO layers communicate through modular Application Programming Interfaces. Security coding rules and railway industry requirements specified by the EN 50128 standard have been respected. The proposed implementation has successfully passed conformity tests and has been tested on a computer-based simulator.

Keywords: ERTMS, ETCS signalling, EURORADIO protocol, radio infill function.

2368 Microscopic Analysis of Interfacial Transition Zone of Cementitious Composites Prepared by Various Mixing Procedures

Authors: Josef Fládr, Jiří Němeček, Veronika Koudelková, Petr Bílý

Abstract:

Mechanical parameters of cementitious composites differ quite significantly based on the composition of the cement matrix. They are also influenced by mixing times and procedure. The research presented in this paper was aimed at identifying differences in the microstructure of normal strength (NSC) and differently mixed high strength (HSC) cementitious composites. Scanning electron microscopy (SEM) investigation together with energy dispersive X-ray spectroscopy (EDX) phase analysis of NSC and HSC samples was conducted. Evaluation of the interfacial transition zone (ITZ) between the aggregate and cement matrix was performed. Volume share, thickness, porosity and composition of the ITZ were studied. In the case of HSC, samples obtained by several different mixing procedures were compared in order to find the most suitable procedure. In the case of NSC, ITZ was identified around 40-50% of aggregate grains and its thickness typically ranged between 10 and 40 µm. Higher porosity and a lower share of clinker were observed in this area as a result of the increased water-to-cement ratio (w/c) and the lack of fine particles improving the grading curve of the aggregate. A typical ITZ with lower Ca content was observed in only one HSC sample, where it developed around less than 15% of aggregate grains. The typical thickness of ITZ in this sample was similar to that in NSC (between 5 and 40 µm). In the remaining four HSC samples, no ITZ was observed. In general, the share of ITZ in HSC samples was found to be significantly smaller than in NSC samples. As the ITZ is the weakest part of the material, this result explains to a large extent the improved mechanical properties of HSC compared to NSC. Based on the comparison of ITZ characteristics in HSC samples prepared by different mixing procedures, the most suitable mixing procedure from the point of view of ITZ properties was identified.

Keywords: Energy dispersive X-ray spectroscopy, high strength concrete, interfacial transition zone, mixing procedure, normal strength concrete, scanning electron microscopy.

2367 Choosing Search Algorithms in Bayesian Optimization Algorithm

Authors: Hao Wu, Jonathan L. Shapiro

Abstract:

The Bayesian Optimization Algorithm (BOA) is an algorithm based on the estimation of distributions. It uses techniques for modeling data with Bayesian networks to estimate the joint distribution of promising solutions. Different search algorithms can be used to obtain the structure of the Bayesian network. The key point that BOA addresses is whether the constructed Bayesian network can generate new and useful solutions (strings) that lead the algorithm in the right direction to solve the problem. Undoubtedly, this ability is a crucial factor in the efficiency of BOA. Various search algorithms can be used in BOA, but their performance differs, so a suitable method for quantifying this difference is needed when choosing among them. In this paper, a greedy search algorithm and a stochastic search algorithm are used in BOA to solve a given optimization problem, and a method using the Kullback-Leibler (KL) divergence to reflect their difference is described.
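The comparison idea can be sketched as follows: compute the KL divergence between a target distribution over promising solutions and the distribution each learned network generates, and prefer the search algorithm whose network diverges less. The probabilities below are illustrative only, not the paper's data.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    p, q: probabilities over the same support.
    A small eps guards against log(0) and division by zero.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Toy comparison: target distribution of promising solutions vs. the
# distributions generated by two hypothetical Bayesian networks, one
# built with greedy search and one with stochastic search.
target     = [0.50, 0.30, 0.15, 0.05]
greedy     = [0.45, 0.32, 0.17, 0.06]
stochastic = [0.30, 0.30, 0.25, 0.15]

d_greedy = kl_divergence(target, greedy)
d_stochastic = kl_divergence(target, stochastic)
# The network whose generated distribution has the smaller divergence
# from the target better captures the structure of promising solutions.
```

Here the greedy-built network is closer to the target, so its divergence is smaller; in practice the divergences would be estimated from the populations the networks actually sample.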

Keywords: Bayesian optimization algorithm, greedy search, KL divergence, stochastic search.

2366 Producing Sustained Renewable Energy and Removing Organic Pollutants from Distillery Wastewater using Consortium of Sludge Microbes

Authors: Anubha Kaushik, Raman Preet

Abstract:

Distillery wastewater in the form of spent wash is a complex and strong industrial effluent with a high load of organic pollutants that may deplete dissolved oxygen when discharged into aquatic systems and contaminate groundwater through leaching of pollutants, while untreated spent wash disposed on land acidifies the soil. Stringent legislative measures have therefore been framed in different countries for the discharge standards of distillery effluent. Utilising the organic pollutants present in various types of waste as food for mixed microbial populations has emerged in recent years as an eco-friendly approach in which complex organic matter is converted into simpler forms while useful gases are produced as renewable and clean energy sources. In the present study, wastewater from a rice-bran-based distillery was used as the substrate in a dark fermenter, and a native microbial consortium from the digester sludge was used as the inoculum to treat the wastewater and produce hydrogen. After optimising the operational conditions in batch reactors, sequential batch mode and continuous-flow stirred tank reactors were used to determine the best operational conditions for enhanced and sustained hydrogen production and removal of pollutants. Since the rate of hydrogen production by the microbial consortium during dark fermentation is influenced by the concentration of organic matter, pH and temperature, these operational conditions were optimised in batch-mode studies. The maximum hydrogen production rate (347.87 ml/L/d) was attained in 32 h of dark fermentation, while a good proportion of the COD was also removed from the wastewater. A slightly acidic initial pH seemed to favour biohydrogen production. In the continuous stirred tank reactor, high H2 production from distillery wastewater was obtained at a relatively short substrate retention time (SRT) of 48 h and a moderate organic loading rate (OLR) of 172 g/l/d COD.

Keywords: Distillery wastewater, hydrogen, microbial consortium, organic pollution, sludge.

2365 Optimal Controller with Backstepping and BELBIC for Single-Link Flexible Manipulator

Authors: Ali Reza Sahab, Amir Gholami Pastaki

Abstract:

In this paper, the backstepping method (BM) is proposed for a single-link flexible mechanical manipulator. In each step of this method a positive-definite function is obtained. Selection of the gain values is very important because the controller behaves differently for each set of values, and improper selection of these gains can lead to instability of the system. In order to choose proper values for the gains, the BELBIC (Brain Emotional Learning Based Intelligent Controller) method has been used in this work. Finally, to demonstrate the efficiency of this method, the results of the proposed model are compared with those of a robust controller. The results show that the combination of backstepping and BELBIC presented here can stabilize the system faster, with a shorter settling time and lower overshoot, than the robust controller.
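As a generic illustration of the stepwise construction (a textbook two-state example, not the paper's flexible-link dynamics), consider the strict-feedback system below; each step contributes a positive-definite Lyapunov term, and the gains $k_1, k_2$ are the kind of quantities BELBIC is used to select:

```latex
\dot{x}_1 = x_2, \qquad \dot{x}_2 = u

% Step 1: treat x_2 as a virtual control for the x_1 subsystem.
\alpha_1 = -k_1 x_1 \;(k_1 > 0), \qquad z_2 = x_2 - \alpha_1,
\qquad V_1 = \tfrac{1}{2} x_1^2

% Step 2: augment the Lyapunov function with the error z_2.
V_2 = V_1 + \tfrac{1}{2} z_2^2, \qquad
\dot{V}_2 = -k_1 x_1^2 + z_2\,(x_1 + u + k_1 x_2)

% Choosing the control law
u = -x_1 - k_2 z_2 - k_1 x_2 \;(k_2 > 0)
\quad\Longrightarrow\quad
\dot{V}_2 = -k_1 x_1^2 - k_2 z_2^2 < 0
```

The negative-definite $\dot{V}_2$ guarantees asymptotic stability, but the transient behavior (settling time, overshoot) depends directly on the chosen gains, which motivates an intelligent gain-selection scheme such as BELBIC.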

Keywords: Single-link flexible manipulator, backstepping, BELBIC.

2364 Starting Characteristic Analysis of LSPM for Pumping System Considering Demagnetization

Authors: Subrato Saha, Yun-Hyun Cho

Abstract:

This paper presents the design process of a high-performance 3-phase, 3.7 kW, 2-pole line-start permanent magnet synchronous motor for a pumping system. A method is proposed to study the starting torque characteristics considering line starting with a high-inertia load. A d-q model including the cage was built to study the synchronization capability. Time-stepping finite element analysis was utilized to accurately predict the dynamic and transient performance, efficiency, starting current, speed curve, etc. Considering the load torque of the pump during the starting stage, the rotor bar was designed to minimize demagnetization of the permanent magnet caused by the large starting current.
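For context, the standard d-q (rotor reference frame) equations of a permanent magnet synchronous machine, on which such a model is built, take the form below; this is the textbook stator model only, and the cage dynamics the authors include are omitted here:

```latex
v_d = R_s i_d + \frac{d\psi_d}{dt} - \omega_r \psi_q,
\qquad
v_q = R_s i_q + \frac{d\psi_q}{dt} + \omega_r \psi_d

\psi_d = L_d i_d + \psi_m, \qquad \psi_q = L_q i_q

T_e = \frac{3}{2}\, p \left[ \psi_m i_q + (L_d - L_q)\, i_d i_q \right]
```

where $R_s$ is the stator resistance, $L_d$ and $L_q$ the d- and q-axis inductances, $\psi_m$ the permanent magnet flux linkage, $\omega_r$ the electrical rotor speed and $p$ the number of pole pairs.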

Keywords: LSPM, starting analysis, demagnetization, FEA, pumping system.

2363 Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm

Authors: Jehad A. H. Hammad, Nur'Aini binti Abdul Rashid

Abstract:

With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One of the approaches investigated is indexing. Indexing methods have been categorized into three groups: length-based index algorithms, transformation-based algorithms and mixed-technique algorithms. This research focuses on the transformation-based methods. We embedded the N-gram method into the transformation-based method to build an inverted index table, and then applied parallel methods to speed up the index-building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution that saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results for the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
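The inverted-index idea can be sketched as follows: slide a window of length N over each sequence and map every N-gram to the set of sequences containing it; a query then intersects the posting sets of its own N-grams. This is a minimal serial illustration with hypothetical sequence fragments, not the authors' parallel implementation.

```python
from collections import defaultdict

def build_ngram_index(sequences, n=5):
    """Build an inverted index mapping each n-gram to the IDs of the
    sequences that contain it (transformation-based indexing)."""
    index = defaultdict(set)
    for seq_id, seq in sequences.items():
        for i in range(len(seq) - n + 1):
            index[seq[i:i + n]].add(seq_id)
    return index

def query(index, pattern, n=5):
    """Return candidate sequence IDs containing every n-gram of the
    pattern; candidates would then be verified by exact alignment."""
    grams = [pattern[i:i + n] for i in range(len(pattern) - n + 1)]
    if not grams:
        return set()
    result = index.get(grams[0], set()).copy()
    for g in grams[1:]:
        result &= index.get(g, set())
    return result

# Toy protein fragments (hypothetical data).
db = {
    "P1": "MKTAYIAKQR",
    "P2": "MKTAYQRLLE",
    "P3": "GGGSSMKTAY",
}
idx = build_ngram_index(db, n=5)
hits = query(idx, "MKTAY", n=5)  # all sequences containing "MKTAY"
```

The index stores only distinct N-grams with posting sets, which is why its size can fall below the raw dataset size for moderate N; the per-sequence loop is also embarrassingly parallel, matching the paper's parallelization strategy.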

Keywords: Biological sequence, Database index, N-gram indexing, Parallel computing, Sequence retrieval.

2362 Enzyme Involvement in the Biosynthesis of Selenium Nanoparticles by Geobacillus wiegelii Strain GWE1 Isolated from a Drying Oven

Authors: Daniela N. Correa-Llantén, Sebastián A. Muñoz-Ibacache, Mathilde Maire, Jenny M. Blamey

Abstract:

The biosynthesis of nanoparticles by microorganisms, in contrast to chemical synthesis, is an environmentally friendly process with low energy requirements. In this investigation, we used the microorganism Geobacillus wiegelii, strain GWE1, an aerobic thermophile belonging to the genus Geobacillus, isolated from a drying oven. This microorganism has the ability to reduce selenite, evidenced by a change of color in the culture from colorless to red. The elemental state and composition of the particles were verified using transmission electron microscopy and energy-dispersive X-ray analysis. The nanoparticles have a defined spherical shape and consist of selenium in the elemental state. Previous experiments showed that the presence of the whole microorganism was not necessary for the reduction of selenite. The results strongly suggested that an intracellular NADPH/NADH-dependent reductase mediates selenium nanoparticle synthesis under aerobic conditions. The enzyme was purified and identified by mass spectrometry (MALDI-TOF/TOF) as a 1-pyrroline-5-carboxylate dehydrogenase. Histograms of nanoparticle sizes were obtained; the size distribution ranged from 40 to 160 nm, with 70% of the nanoparticles below 100 nm in size. Spectroscopic analysis showed that the nanoparticles are composed of elemental selenium. To analyse the effect of pH on the size and morphology of the nanoparticles, they were synthesized at different pH values (4.0, 5.0, 6.0, 7.0, 8.0). For thermostability studies, samples were incubated at different temperatures (60, 80 and 100 ºC) for 1 h and 3 h. All nanoparticles were below 100 nm in size at pH 4.0; over 50% were below 100 nm at pH 5.0; and at pH 6.0 and 8.0 over 90% were below 100 nm. At neutral pH (7.0) the nanoparticles reached sizes of around 120 nm, and only 20% of them were below 100 nm.
Regarding the effect of temperature, the nanoparticles did not show a significant difference in size when incubated for 0 to 3 h at 60 ºC, whereas at 80 ºC the nanoparticle suspension lost its homogeneity. A change in size was observed from 0 h of incubation at 80 ºC, with a size range of 40-160 nm and 20% of the particles over 100 nm, while after 3 h of incubation the size range shifted to 60-180 nm with 50% over 100 nm. At 100 ºC the nanoparticles aggregated, forming nanorod structures. In conclusion, these results indicate that it is possible to modulate the size and shape of biologically synthesized nanoparticles by adjusting pH and temperature.

Keywords: Genus Geobacillus, NADPH/NADH-dependent reductase, Selenium nanoparticles.
