Search results for: network analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10737

7437 Future Logistics - Challenges, Requirements and Solutions for Logistics Networks

Authors: Martin Roth, Axel Klarmann, Bogdan Franczyk

Abstract:

The importance of logistics has changed enormously in the last few decades. While logistics was formerly one of the core functions of most companies, it is nowadays outsourced, at least in part, to external logistics service providers under contract. As a result of this shift, new business models such as the fourth party logistics provider have emerged, which design, plan and monitor the resulting logistics networks. This new business model and topics such as synchromodality or Big Data impose new requirements on the underlying IT, which cannot be met with conventional concepts and approaches. In this paper, the challenges of logistics network monitoring are outlined using a scenario. The most common layers in a logical multilayered architecture for an information system are used to point out the arising challenges for IT. In addition, initial solution approaches are introduced.

Keywords: Complex Event Processing, Fourth Party Logistics Service Provider, Logistics monitoring, Synchromodality.

7436 The Design of the Multi-Agent Classification System (MACS)

Authors: Mohamed R. Mhereeg

Abstract:

The paper discusses the design of a .NET Windows Service based agent system called MACS (Multi-Agent Classification System). MACS is a system that aims to accurately classify spreadsheet developers' competency over a network. It is designed to automatically and autonomously monitor spreadsheet users and gather their development activities based on software multi-agent technology (MAS). This is accomplished in a way that enables management to efficiently tailor training activities for future spreadsheet development. The monitoring agents of MACS are intended to be distributed over the WWW in order to monitor and classify multiple developers. The Prometheus methodology is used for the design of the agents of MACS. Prometheus has been used for this phase of the system design because it was developed specifically for specifying and designing agent-oriented systems. Additionally, Prometheus also specifies the communication needed between the agents in order to coordinate and achieve their delegated tasks.

Keywords: Classification, Design, MACS, MAS, Prometheus.

7435 Functional Near Infrared Spectroscope for Cognition Brain Tasks by Wavelets Analysis and Neural Networks

Authors: Truong Quang Dang Khoa, Masahiro Nakagawa

Abstract:

Research on Brain-Computer Interfaces (BCI) has increased recently. Functional Near Infrared Spectroscope (fNIRs) is one of the latest technologies, utilizing light in the near-infrared range to determine brain activity. Because near-infrared technology allows the design of safe, portable, wearable, non-invasive and wireless monitoring systems, fNIRs monitoring of brain hemodynamics can be valuable in helping to understand brain tasks. In this paper, we present results of fNIRs signal analysis indicating that there exist distinct patterns of hemodynamic responses that distinguish brain tasks toward developing a BCI. We applied two different mathematical tools separately: wavelet analysis for preprocessing, as signal filters and for feature extraction, and neural networks as a classification module for cognitive brain tasks. We also discuss and compare with other methods; our proposal performs better, with an average classification accuracy of 99.9%.

Keywords: functional near infrared spectroscope (fNIRs), brain-computer interface (BCI), wavelets, neural networks, brain activity, neuroimaging.
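
A minimal sketch of the wavelet-plus-neural-network pipeline described above, using synthetic two-class signals in place of real fNIRs recordings; the db4 wavelet, the decomposition level and the network size are illustrative assumptions, not parameters reported by the authors.

import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_signal(task, n=256):
    # Synthetic stand-in for one fNIRs hemodynamic trace of a single trial.
    t = np.linspace(0, 1, n)
    base = np.sin(2 * np.pi * (3 if task else 1) * t)
    return base + 0.3 * rng.standard_normal(n)

def wavelet_features(sig, wavelet="db4", level=4):
    # Discrete wavelet decomposition; keep the energy of each sub-band
    # as a compact feature vector (the filtering + feature extraction step).
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

X = np.array([wavelet_features(make_signal(task)) for task in (0, 1) for _ in range(100)])
y = np.array([task for task in (0, 1) for _ in range(100)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("classification accuracy:", clf.score(X_te, y_te))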

7434 Uncertainty Analysis of a Hardware in Loop Setup for Testing Products Related to Building Technology

Authors: Balasundaram Prasaant, Ploix Stephane, Delinchant Benoit, Muresan Cristian

Abstract:

Hardware in Loop (HIL) testing is used to test and validate products, especially in building technology, where it is particularly important to test products for their efficiency. The test rig in the HIL simulator may contribute some uncertainties to the measured efficiency. These include physical uncertainties and scenario-based uncertainties. In this paper, a simple uncertainty analysis framework for an HIL setup is shown, considering only the physical uncertainties. The entire HIL setup is modeled in Dymola. The uncertain sources are considered based on available knowledge of the components and on expert knowledge. Monte Carlo Simulation is used for the propagation of uncertainty, since it is reliable and easy to use. This article shows how an HIL setup can be modeled and how uncertainty propagation can be performed on it. Such an approach is not common in building energy analysis.

Keywords: Energy in Buildings, Hardware in Loop, Modelica (Dymola), Monte Carlo Simulation, Uncertainty Propagation.
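
A minimal sketch of Monte Carlo uncertainty propagation on a measured efficiency. The efficiency expression and the uncertain inputs (mass flow, temperature rise, electrical power) are hypothetical stand-ins for the Dymola model and sensor uncertainties used by the authors.

import numpy as np

rng = np.random.default_rng(1)
N = 100_000                      # number of Monte Carlo samples

# Hypothetical physical uncertainties of the test rig (illustrative values only):
# mass flow [kg/s], temperature rise [K] and electrical power input [W].
m_dot = rng.normal(0.050, 0.002, N)
dT    = rng.normal(10.0, 0.5, N)
P_el  = rng.normal(2200.0, 30.0, N)
cp    = 4186.0                   # specific heat of water [J/(kg K)]

# Simple surrogate for the quantity measured on the HIL rig:
# efficiency = useful heat delivered / electrical power consumed.
eta = m_dot * cp * dT / P_el

print(f"mean efficiency    : {eta.mean():.3f}")
print(f"standard deviation : {eta.std(ddof=1):.3f}")
print(f"95% interval       : {np.round(np.percentile(eta, [2.5, 97.5]), 3)}")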

7433 Optimal Combination for Modal Pushover Analysis by Using Genetic Algorithm

Authors: K. Shakeri, M. Mohebbi

Abstract:

In order to consider the effects of higher modes in pushover analysis, several multi-modal pushover procedures have been presented in recent years. In these methods the responses of the considered modes are combined by the square-root-of-sum-of-squares (SRSS) rule, although applying elastic modal combination rules in the inelastic phase is no longer strictly valid. In this research the feasibility of defining an efficient alternative combination method is investigated. Two steel moment-frame buildings denoted SAC-9 and SAC-20 under ten earthquake records are considered. The nonlinear responses of the structures are estimated by the direct algebraic combination of the weighted responses of the separate modes. The weight of each mode is defined so that the combined response has minimum error relative to the nonlinear time history analysis. The genetic algorithm (GA) is used to minimize the error and optimize the weight factors. The optimal factors obtained for each mode in the different cases are compared to find a unique set of appropriate weight factors for each mode in all cases.

Keywords: Genetic Algorithm, Modal Pushover, Optimal weight.
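
A minimal sketch of the weight-optimization idea: a small real-coded genetic algorithm searches modal weight factors so that the algebraic combination of modal responses best matches a target nonlinear time-history response. The modal responses, target, population size and GA operators below are illustrative assumptions, not the SAC building data.

import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins: peak responses of 3 modes at 10 storeys,
# and a "target" nonlinear time-history result to be matched.
modal = rng.uniform(0.2, 1.0, size=(3, 10))
true_w = np.array([1.0, 0.35, 0.15])
target = true_w @ modal + 0.02 * rng.standard_normal(10)

def error(w):
    # RMS error between the weighted modal combination and the target response.
    return np.sqrt(np.mean((w @ modal - target) ** 2))

# Simple real-coded GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform(0.0, 1.5, size=(60, 3))
for gen in range(200):
    fit = np.array([error(ind) for ind in pop])
    new = [pop[fit.argmin()].copy()]                  # elitism
    while len(new) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if fit[i] < fit[j] else pop[j]    # tournament parent 1
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if fit[i] < fit[j] else pop[j]    # tournament parent 2
        a = rng.random(3)
        child = a * p1 + (1 - a) * p2                 # blend crossover
        child += rng.normal(0, 0.05, 3) * (rng.random(3) < 0.2)  # per-gene mutation
        new.append(np.clip(child, 0.0, 2.0))
    pop = np.array(new)

best = pop[np.argmin([error(ind) for ind in pop])]
print("optimal weight factors:", np.round(best, 3), " error:", round(error(best), 4))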

7432 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, seepage analysis was performed for the water level difference between the upstream and downstream sides of a weir structure, in order to evaluate the safety of the weir against flooding. The Monte Carlo Simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, the weir structure was modeled using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level with consideration of the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function was constructed from the responses of the numerical analysis; these fragility results can be used to determine the weakness of a weir structure subjected to flooding disaster. They can also be used as reference data to comprehensively predict the probability of failure and the degree of damage of a weir structure.

Keywords: Weir structure, seepage, flood disaster, fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.
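
A minimal sketch of how a fragility function can be assembled from Monte Carlo samples of an uncertain permeability coefficient. The hydraulic-gradient "demand" expression, the seepage path length and the critical gradient below are deliberately simplified assumptions standing in for the ABAQUS seepage analysis used by the authors.

import math
import numpy as np

rng = np.random.default_rng(3)
N = 20_000                                   # Monte Carlo samples per water level

# Lognormally distributed permeability coefficient (illustrative median and scatter).
k = rng.lognormal(mean=math.log(1e-5), sigma=0.5, size=N)     # [m/s]

water_levels = np.linspace(0.5, 5.0, 10)     # upstream-downstream head difference [m]
L_seep = 8.0                                 # seepage path length [m] (assumed)
i_crit = 0.5                                 # critical hydraulic gradient (assumed)

print("head difference [m]  P(failure)")
for dH in water_levels:
    # Simplified demand model: exit gradient grows with head difference and with
    # the scatter of permeability around its median (stand-in for the FE seepage run).
    i_exit = (dH / L_seep) * (1.0 + 0.3 * np.log(k / np.median(k)))
    pf = np.mean(i_exit > i_crit)            # fragility ordinate at this water level
    print(f"{dH:18.1f}  {pf:10.3f}")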

7431 Genetic Programming Approach for Multi-Category Pattern Classification Applied to Network Intrusion Detection

Authors: K.M. Faraoun, A. Boukelif

Abstract:

This paper describes a new approach to classification using genetic programming. The proposed technique consists of genetically coevolving a population of non-linear transformations of the input data to be classified, mapping them to a new space of reduced dimension in order to obtain maximum inter-class discrimination. The classification of new samples is then performed on the transformed data, and so becomes much easier. Contrary to existing GP classification techniques, the proposed one uses a dynamic partition of the transformed data into separate intervals; the efficacy of a given interval partition is handled by the fitness criterion, which rewards maximum class discrimination. Experiments were first performed using Fisher's Iris dataset, and then the KDD-99 Cup dataset was used to study the intrusion detection and classification problem. The obtained results demonstrate that the proposed genetic approach outperforms the existing GP classification methods [1], [2] and [3], and gives very acceptable results compared to other existing techniques proposed in [4], [5], [6], [7] and [8].

Keywords: Genetic programming, pattern classification, intrusion detection.

7430 An Artificial Intelligent Technique for Robust Digital Watermarking in Multiwavelet Domain

Authors: P. Kumsawat, K. Pasitwilitham, K. Attakitmongcol, A. Srikaew

Abstract:

In this paper, an artificial intelligence technique for robust digital image watermarking in the multiwavelet domain is proposed. The embedding technique is based on quantization index modulation, and the watermark extraction process does not require the original image. We have developed an optimization technique using genetic algorithms to search for optimal quantization steps in order to improve the quality of the watermarked image and the robustness of the watermark. In addition, we construct a prediction model based on image moments and a back propagation neural network to correct an attacked image geometrically before the watermark extraction process begins. The experimental results show that the proposed watermarking algorithm yields watermarked images with good imperceptibility and a watermark that is very robust against various image processing attacks.

Keywords: Watermarking, Multiwavelet, Quantization index modulation, Genetic algorithms, Neural networks.
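
A minimal sketch of quantization index modulation (QIM) embedding and blind extraction on a block of transform coefficients. The multiwavelet transform, the GA search for optimal quantization steps and the moment-based geometric correction from the paper are not reproduced; the step size and noise level below are arbitrary assumptions.

import numpy as np

def qim_embed(coeffs, bits, delta=8.0):
    # Dither-modulation QIM: bit 0 uses the lattice {k*delta},
    # bit 1 uses the shifted lattice {k*delta + delta/2}.
    d = np.where(bits == 0, 0.0, delta / 2.0)
    return np.round((coeffs - d) / delta) * delta + d

def qim_extract(coeffs, delta=8.0):
    # Blind extraction: pick the lattice closest to each received coefficient.
    q0 = np.round(coeffs / delta) * delta
    q1 = np.round((coeffs - delta / 2) / delta) * delta + delta / 2
    return (np.abs(coeffs - q1) < np.abs(coeffs - q0)).astype(int)

rng = np.random.default_rng(4)
host = rng.normal(0.0, 20.0, 1024)           # stand-in for transform coefficients
wm = rng.integers(0, 2, 1024)                # watermark bits

marked = qim_embed(host, wm)
attacked = marked + rng.normal(0.0, 1.5, 1024)   # mild noise attack
recovered = qim_extract(attacked)
print("bit error rate after attack:", np.mean(recovered != wm))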

7429 Evaluation and Analysis of the Secure E-Voting Authentication Preparation Scheme

Authors: Nidal F. Shilbayeh, Reem A. Al-Saidi, Ahmed H. Alsswey

Abstract:

In this paper, we present an evaluation and analysis of the E-Voting Authentication Preparation Scheme (EV-APS). EV-APS applies several modified security mechanisms that enhance the security measures and add a strong wall of protection, meeting confidentiality, non-repudiation and authentication requirements. Among these modified security mechanisms are the Kerberos authentication protocol, the PVID scheme, responder certificate validation, and the converted Ferguson e-cash protocol. The authentication and privacy requirements have been evaluated and proved. Authentication guarantees that only eligible and authorized voters are permitted to vote, and privacy guarantees that all votes are kept secret. Evaluation and analysis of some of these security requirements are given. These modified mechanisms help filter unauthorized votes from the counter buffer by ensuring that only authorized voters are permitted to vote.

Keywords: E-Voting preparation stage, blind signature protocol, nonce based authentication scheme, Kerberos authentication protocol, pseudo voter identity scheme PVID.

7428 Analysis of Possible Causes of Fukushima Disaster

Authors: Abid Hossain Khan, Syam Hasan, M. A. R. Sarkar

Abstract:

The Fukushima disaster is one of the most publicly exposed accidents at a nuclear facility and has changed the public outlook on nuclear power. Some have used it as an example to portray nuclear energy as an unsafe source, while others have tried to find the real reasons behind the accident. Many papers have tried to shed light on the possible causes, some of which are purely based on assumptions while others rely on rigorous data analysis. To the best of our knowledge, none of these works can say with absolute certainty that there is a single prominent reason that paved the way to this unexpected incident. This paper attempts to compile the apparent reasons behind the Fukushima disaster and tries to analyze and identify the most likely one.

Keywords: Fuel meltdown, Fukushima disaster, manmade calamity, nuclear facility, tsunami.

7427 Multicast Optimization Techniques using Best Effort Genetic Algorithms

Authors: Dinesh Kumar, Y. S. Brar, V. K. Banga

Abstract:

Multicast network technology has pervaded our lives and underlies many of the networking techniques and routing devices we use. Multicast data delivery offers many applications to the user, such as high-speed voice and high-speed data services, areas presently dominated by conventional networking, cable systems and digital subscriber line (DSL) technologies, and it offers advantages over other routing techniques. QoS (Quality of Service) guarantees are required in most multicast applications. We address the bandwidth-delay constrained optimization and use a multi-objective model and routing approach based on a genetic algorithm that optimizes multiple QoS parameters simultaneously. The proposed approach yields non-dominated routes with the high efficiency of the GA, and its improvement and high degree of optimization have been verified. We have also correlated the results of the multicast GA with broadband wireless in order to minimize the delay in the path.

Keywords: GA (genetic Algorithms), Quality of Service, MOGA, Steiner Tree.

7426 Stability Analysis of Impulsive Stochastic Fuzzy Cellular Neural Networks with Time-varying Delays and Reaction-diffusion Terms

Authors: Xinhua Zhang, Kelin Li

Abstract:

In this paper, the problem of stability analysis for a class of impulsive stochastic fuzzy neural networks with time-varying delays and reaction-diffusion is considered. By utilizing a suitable Lyapunov-Krasovskii functional, the inequality technique and stochastic analysis techniques, some sufficient conditions ensuring global exponential stability of the equilibrium point for impulsive stochastic fuzzy cellular neural networks with time-varying delays and diffusion are obtained. In particular, an estimate of the exponential convergence rate is also provided, which depends on the system parameters, the diffusion effect and the impulsive disturbance intensity. It is believed that these results are significant and useful for the design and application of fuzzy neural networks. An example is given to show the effectiveness of the obtained results.

Keywords: Exponential stability, stochastic fuzzy cellular neural networks, time-varying delays, impulses, reaction-diffusion terms.

7425 Visualization of Quantitative Thresholds in Stocks

Authors: Siddhant Sahu, P. James Daniel Paul

Abstract:

Technical analysis, comprising various technical indicators, is a holistic way of representing the price movement of stocks in the market. Various forms of indicators have evolved from the primitive ones over the past decades. There have been many attempts to introduce volume as a major determinant of strong patterns in market forecasting. The law of demand defines the relationship between volume and price, and most traders are familiar with the volume game. Including the time dimension in the law of demand provides a different visualization of the theory. While attempting this, it was found that there are different thresholds in the market for different companies, and these thresholds have a significant influence on the price. This article is an attempt at determining these thresholds for companies using three-dimensional graphs for optimizing portfolios. It also emphasizes the importance of volume as a key factor in predicting strong price movements and bullish and bearish markets. It uses a comprehensive data set of major companies which form a major chunk of the Indian automotive sector and are thus used as an illustration.

Keywords: Technical Analysis, Expert System, Law of demand, Stocks, Portfolio Analysis, Indian Automotive Sector.

7424 Structural Characteristics of Batch Processed Agro-Waste Fibres

Authors: E. I. Akpan, S. O. Adeosun, G. I. Lawal, S. A. Balogun, X. D. Chen

Abstract:

Agro-waste fibres from Nigeria have been characterised for composite applications using X-ray diffraction (XRD) and scanning electron microscopy (SEM). Fibres extracted from groundnut shell, coconut husk, rice husk, palm fruit bunch and palm fruit stalk were processed using two novel cellulose fibre production methods developed by the authors. The apparent crystallinity of cellulose, calculated from the deconvolution of the diffractometer trace, shows that the amorphous portion of the cellulose was permeable to hydrolysis, yielding high crystallinity after treatment. All diffractograms show the typical cellulose structure with well-defined 110, 200 and 040 peaks. Palm fruit fibres had the highest 200 crystalline cellulose peaks compared to the others, an indication of rich cellulose content. Surface examination of the resulting fibres using SEM indicates the presence of a regular cellulose network structure with some agglomerated laminated layers of thin leaves of cellulose microfibrils. The surfaces were relatively smooth, indicating the removal of hemicellulose, lignin and pectin.

Keywords: X-ray diffraction, SEM, cellulose, deconvolution, crystallinity.

7423 Knowledge Based Wear Particle Analysis

Authors: Mohammad S. Laghari, Qurban A. Memon, Gulzar A. Khuwaja

Abstract:

The paper describes a knowledge based system for analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (Tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and by using this classification thereby predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to wear processes and modes that occur in machinery. This brings consistency in wear judgment prediction which leads to standardization and also less dependence on Tribologists.

Keywords: Computer vision, knowledge based systems, morphology, wear particles.

7422 Implementing an Adaptive Behavior for Spread Spectrum Watermarking Procedures

Authors: Franco Frattolillo

Abstract:

The advances in multimedia and networking technologies have created opportunities for Internet pirates, who can easily copy multimedia contents and illegally distribute them on the Internet, thus violating the legal rights of content owners. This paper describes how a simple and well-known watermarking procedure based on a spread spectrum method and a watermark recovery by correlation can be improved to effectively and adaptively protect MPEG-2 videos distributed on the Internet. The procedure, in its simplest form, is vulnerable to a variety of attacks. However, its security and robustness have been increased, and its behavior has been made adaptive with respect to the video terminals used to open the videos and the network transactions carried out to deliver them to buyers. Such an adaptive behavior enables the proposed procedure to efficiently embed watermarks, and this characteristic makes the procedure well suited to be exploited in web contexts, where watermarks usually generated from fingerprinting codes have to be inserted into the distributed videos "on the fly", i.e. during the purchase web transactions.

Keywords: Copyright protection, digital watermarking, intellectual property protection.
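
A minimal sketch of additive spread spectrum embedding and correlation-based recovery, the basic mechanism this paper builds on. The embedding strength, the decision threshold and the use of raw coefficients are assumptions; the adaptive behavior tied to video terminals and web transactions is not reproduced.

import numpy as np

rng = np.random.default_rng(5)

def embed(host, key, alpha=0.1):
    # Additive spread spectrum: superimpose a key-generated pseudo-random
    # +/-1 sequence, scaled by the host magnitude, onto the host coefficients.
    w = np.random.default_rng(key).choice([-1.0, 1.0], size=host.size)
    return host + alpha * np.abs(host) * w

def detect(received, key, alpha=0.1):
    # Correlation recovery: regenerate the sequence from the key and correlate.
    w = np.random.default_rng(key).choice([-1.0, 1.0], size=received.size)
    corr = np.mean(received * w)
    threshold = 0.5 * alpha * np.mean(np.abs(received))  # midway decision threshold
    return corr, corr > threshold

host = rng.normal(0.0, 10.0, 16384)      # stand-in for a block of frame/DCT coefficients
marked = embed(host, key=42)

c1, found1 = detect(marked, key=42)      # watermarked content
c2, found2 = detect(host, key=42)        # unmarked content
print(f"marked:   correlation {c1:.3f}, watermark detected = {found1}")
print(f"unmarked: correlation {c2:.3f}, watermark detected = {found2}")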

7421 Physicochemical Characterization of Waste from Vegetal Extracts Industry for Use as Briquettes

Authors: Maíra O. Palm, Cintia Marangoni, Ozair Souza, Noeli Sellin

Abstract:

Wastes from a vegetal extracts industry (cocoa, oak, Guarana and mate) were characterized by particle size, proximate and ultimate analysis, lignocellulosic fractions, higher heating value, thermal analysis (thermogravimetric analysis, TGA, and differential thermal analysis, DTA) and energy density, in order to evaluate their potential as biomass in the form of briquettes for power generation. All wastes presented particle sizes adequate for briquette production. The wastes showed high moisture content, requiring prior drying for use as briquettes. Cocoa and oak wastes had the highest volatile matter contents, with maximum mass loss at 310 ºC and 450 ºC, respectively. The solvents used in the aroma extraction process influenced the moisture content of the wastes, which was higher for mate because water was used as the solvent. All wastes showed insignificant mass loss after 565 °C, hence resulting in low ash content. High carbon and hydrogen contents and low sulfur and nitrogen contents were observed, ensuring low generation of sulfur and nitrogen oxides. Mate and cocoa exhibited the highest carbon and lignin contents and heating values. The dried wastes had high heating values, from 17.1 MJ/kg to 20.8 MJ/kg. The results indicate the energy potential of these wastes for use as fuel in power generation.

Keywords: Agro-industrial waste, biomass, briquettes, combustion.

7420 Selecting an Advanced Creep Model or a Sophisticated Time-Integration? A New Approach by Means of Sensitivity Analysis

Authors: Holger Keitel

Abstract:

The prediction of long-term deformations of concrete and reinforced concrete structures has been a field of extensive research, and several different creep models have been developed so far. Most of the models were developed for constant concrete stresses; thus, in the case of varying stresses, a specific superposition principle or time-integration method is necessary. Nowadays, when modeling concrete creep, the engineering focus is rather on the application of sophisticated time-integration methods than on choosing the more appropriate creep model. For this reason, this paper presents a method to quantify the uncertainties of creep prediction originating from the selection of the creep model or of the time-integration method. By adapting variance-based global sensitivity analysis, a methodology is developed to quantify the influence of creep model selection and of the choice of time-integration method. Applying the developed method, general recommendations on how to model creep behavior for varying stresses are given.

Keywords: Concrete creep models, time-integration methods, sensitivity analysis, prediction uncertainty.
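
A minimal sketch of variance-based (first-order Sobol) sensitivity indices estimated with a pick-freeze scheme. The three-parameter "long-term deflection" function and its input ranges are illustrative assumptions; the actual creep models and time-integration variants compared in the paper are not reproduced.

import numpy as np

rng = np.random.default_rng(6)
N, k = 20_000, 3

def model(x):
    # Toy long-term deflection [mm]: elastic deflection times (1 + creep coefficient).
    # Stand-in for "creep model + time-integration", NOT a real code model.
    phi0, rh, fc = x[:, 0], x[:, 1], x[:, 2]
    phi = phi0 * (1.0 - rh / 100.0) ** 0.5 * (30.0 / fc) ** 0.3
    return 10.0 * (1.0 + phi)

def sample(n):
    # Uncertain inputs: notional creep coefficient, relative humidity [%], strength [MPa].
    return np.column_stack([rng.uniform(1.5, 3.0, n),
                            rng.uniform(40.0, 80.0, n),
                            rng.uniform(25.0, 45.0, n)])

A, B = sample(N), sample(N)
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

names = ["creep coefficient", "relative humidity", "concrete strength"]
for i in range(k):
    AB = A.copy()
    AB[:, i] = B[:, i]                               # vary only input i between the two runs
    Si = np.mean(yB * (model(AB) - yA)) / var_y      # first-order Sobol index (Saltelli 2010 estimator)
    print(f"first-order index of {names[i]:18s}: {Si:.3f}")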

7419 Measurement and Analysis of Temperature Effects on Box Girders of Continuous Rigid Frame Bridges

Authors: Bugao Wang, Weifeng Wang, Xianwei Zeng

Abstract:

Research on the general rules of temperature field variation and its effects on bridges under construction is necessary. This paper investigates these rules and their effects on a bridge using on-site measurement and computational analysis, with Guanyinsha Bridge as a case study. The temperature field was simulated in the analyses. The effects of boundary conditions such as solar radiation and wind speed, and of model parameters such as the heat factor and specific heat, on the temperature field are investigated, and recommended values for these parameters are proposed. The simulated temperature field matches the measured observations with high accuracy. At the same time, the stresses and deflections of the bridge computed with the simulated temperature field match the measured values as well. In conclusion, the temperature effect analysis of a reinforced concrete box girder can be conducted directly based on reliable weather data for the concerned area.

Keywords: continuous rigid frame bridge, temperature effect analysis, temperature field, temperature field simulation

7418 A Numerical Approach for Static and Dynamic Analysis of Deformable Journal Bearings

Authors: D. Benasciutti, M. Gallina, M. Gh. Munteanu, F. Flumian

Abstract:

This paper presents a numerical approach for the static and dynamic analysis of hydrodynamic radial journal bearings. In the first part, the effect of shaft and housing deformability on the pressure distribution within the oil film is investigated. An iterative algorithm that couples the Reynolds equation with a plane finite element (FE) structural model is solved. Viscosity-to-pressure dependency (Vogel-Barus equation) is also included. The deformed lubrication gap and the overall stress state are obtained. Numerical results are presented with reference to a typical journal bearing configuration at two different inlet oil temperatures. The obtained results show the great influence of the structural deformation of the bearing components on the oil pressure distribution, compared with results for ideally rigid components. In the second part, a numerical approach based on the perturbation method is used to compute the stiffness and damping matrices, which characterize the dynamic behavior of the journal bearing.

Keywords: Journal bearing, finite elements, deformation, dynamic analysis

7417 Optimal Placement and Sizing of Distributed Generation in Microgrid for Power Loss Reduction and Voltage Profile Improvement

Authors: Ferinar Moaidi, Mahdi Moaidi

Abstract:

Environmental issues and the ever-increasing demand for electrical energy make it necessary to have distributed generation (DG) resources in the power system. In this research, the allocation and sizing of DGs are used in order to reduce losses and improve the voltage profile in a microgrid. A Genetic Algorithm (GA), drawn from the array of artificial intelligence methods, is proposed for solving the problem. The algorithm is implemented on the IEEE 33-bus network. The study is presented in two scenarios: first, the placement and sizing of DGs are determined so as to reduce losses and improve the voltage profile. However, decisions made under a single-level load assumption are not valid for all load levels; therefore, load modelling is also performed and results are presented for a multi-level load state.

Keywords: Distributed generation, genetic algorithm, microgrid, load modelling, loss reduction, voltage improvement.

7416 Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis

Authors: N.R.N. Idris

Abstract:

This study uses simulated meta-analysis to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim and fill method in adjusting for these biases. The estimated effect sizes and standard errors were evaluated in terms of statistical bias and coverage probability. The results demonstrate that if publication bias is not adjusted for, it could lead to up to 40% bias in the treatment effect estimates. Utilization of the trim and fill method could reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effect in the presence of low or severe publication bias. Additionally, the trim and fill method improves the coverage probability by more than half when subjected to the same level of publication bias as the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates by this adjustment is minimal.

Keywords: Publication bias, Trim and Fill method, percentage relative bias, coverage probability
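
A minimal sketch of a simulated meta-analysis showing how suppressing non-significant studies biases the pooled fixed-effect estimate. The true effect, study sizes and censoring rule are illustrative assumptions, and the trim and fill adjustment itself is not implemented here.

import numpy as np

rng = np.random.default_rng(7)
true_effect, n_studies, reps = 0.30, 30, 2000
bias_all, bias_pub = [], []

for _ in range(reps):
    n  = rng.integers(20, 200, n_studies)        # per-arm sample sizes
    se = np.sqrt(2.0 / n)                        # approx. SE of a standardized mean difference
    est = rng.normal(true_effect, se)            # observed study effects

    w = 1.0 / se ** 2                            # inverse-variance (fixed-effect) weights
    pooled_all = np.sum(w * est) / np.sum(w)     # pooled estimate, all studies

    # Publication bias: non-significant studies (z < 1.96) are published only 20% of the time.
    z = est / se
    published = (z > 1.96) | (rng.random(n_studies) < 0.2)
    if not published.any():
        published[0] = True                      # keep at least one study
    pooled_pub = np.sum(w[published] * est[published]) / np.sum(w[published])

    bias_all.append(pooled_all - true_effect)
    bias_pub.append(pooled_pub - true_effect)

print(f"mean bias, all studies    : {np.mean(bias_all):+.4f}")
print(f"mean bias, published only : {np.mean(bias_pub):+.4f}")
print(f"percentage relative bias  : {100 * np.mean(bias_pub) / true_effect:.1f}%")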

7415 Searching the Efficient Frontier for the Coherent Covering Location Problem

Authors: Felipe Azocar Simonet, Luis Acosta Espejo

Abstract:

In this article, we attempt to find an approximation of the efficient frontier for the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weights method using a Lagrangean heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).

Keywords: Coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis.

7414 A Novel Approach for Tracking of a Mobile Node Based on Particle Filter and Trilateration

Authors: Muhammad Haroon Siddiqui, Muhammad Rehan Khalid

Abstract:

This paper evaluates the performance of a novel algorithm for tracking a mobile node in terms of execution time and root mean square error (RMSE). A particle filter algorithm is used to track the mobile node, and a new technique within the particle filter is also proposed to reduce the execution time. The stationary points were calculated through trilateration and finally by averaging the points collected over a specific time, whereas tracking is done through trilateration as well as the particle filter algorithm. The Wi-Fi signal is used to get an initial guess of the position of the mobile node in the x-y coordinate system. The commercially available software "Wireless Mon" was used to read the Wi-Fi signal strength from the Wi-Fi card. Visual C++ version 6 was used to interact with this software and read only the required data from the log file generated by "Wireless Mon". Results are evaluated through mathematical modeling and MATLAB simulation.

Keywords: Particle Filter, Tracking, Wireless Local Area Network, WiFi, Trilateration
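
A minimal sketch of the two ingredients named above: a least-squares trilateration fix from anchor distances and a bootstrap particle filter tracking the moving node. The Wi-Fi signal-strength-to-distance conversion and the authors' execution-time optimization are not reproduced; the geometry, motion and noise levels are assumptions.

import numpy as np

rng = np.random.default_rng(8)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # known reference nodes

def trilaterate(d):
    # Linearized least-squares position fix from anchor distances.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Bootstrap particle filter over the node position.
n_p = 1000
particles = rng.uniform(0.0, 10.0, size=(n_p, 2))
sigma_d, sigma_q = 0.4, 0.3              # measurement and process noise (assumed)

true_pos = np.array([2.0, 3.0])
for step in range(30):
    true_pos = true_pos + np.array([0.2, 0.1])                  # node moves
    d_meas = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, sigma_d, 4)

    particles += rng.normal(0, sigma_q, particles.shape)        # predict (random walk)
    d_part = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    w = np.exp(-0.5 * np.sum((d_part - d_meas) ** 2, axis=1) / sigma_d ** 2)
    w /= w.sum()
    particles = particles[rng.choice(n_p, n_p, p=w)]            # resample

    est_pf = particles.mean(axis=0)
    est_tri = trilaterate(d_meas)

print("final true position      :", np.round(true_pos, 2))
print("particle filter estimate :", np.round(est_pf, 2))
print("trilateration estimate   :", np.round(est_tri, 2))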

7413 Seismic Assessment of an Existing Dual System RC Buildings in Madinah City

Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail

Abstract:

A 15-storey RC building, studied in this paper, is representative of a modern building type constructed in Madinah City in Saudi Arabia about 10 years ago. These buildings consist mostly of a reinforced concrete skeleton, i.e., columns, beams and flat slabs, as well as shear walls in the stair and elevator areas, arranged so as to provide a resisting system for lateral loads (wind and earthquake loads). In this study, the dynamic properties of the 15-storey RC building were identified using ambient motions recorded at several spatially-distributed locations within the building. Three-dimensional pushover analysis (nonlinear static analysis) was carried out using the SAP2000 software, incorporating inelastic material properties for concrete, infill and steel. The effect of modeling the building with and without infill walls on the performance point, as well as on the capacity and demand spectra due to the earthquake design spectrum function for the Madinah area, has been investigated. ATC-40 capacity and demand spectra are utilized to get the modification factor (R) for the studied building. The purpose of this analysis is to evaluate the expected performance of structural systems by estimating strength and deformation demands in design, and comparing these demands to available capacities at the performance levels of interest. The results are summarized and discussed.

Keywords: Seismic assessment, pushover analysis, ambient vibration, modal update.

7412 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), which is a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this case, as accepted by most researchers, there are outputs which are produced by a DMU to be used as inputs at a future time; those outputs are known as intermediates. The common models in DDEA do not take into account the shape of the distribution of those input, output or intermediate data, assuming that the distribution of their virtual value does not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.

Keywords: Data envelopment analysis, Dynamic DEA, Piecewise linear inputs, Piecewise linear outputs.

7411 Secure Data Aggregation Using Clusters in Sensor Networks

Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik

Abstract:

Wireless sensor networks can be applied in both hostile and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method to reduce energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor networks research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and the algebraic properties of polynomials. It has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results of our schemes and compare their performance to a typical data aggregation scheme, TAG, where no data privacy protection is provided. Results show the efficacy and efficiency of our schemes.

Keywords: Aggregation, Clustering, Query Processing.
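
A minimal sketch of the idea behind cluster-based private aggregation, illustrated here with simple additive secret sharing rather than the polynomial construction actually used by CPDA: each node splits its reading into random shares distributed among its cluster peers, so the cluster head learns only the cluster sum, never an individual value. All values and the cluster size are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(9)
modulus = 10_000

def share(value, n_shares):
    # Split a sensor reading into n random shares that sum to the value (mod modulus).
    shares = rng.integers(0, modulus, n_shares - 1)
    last = (value - shares.sum()) % modulus
    return np.append(shares, last)

# One cluster of 4 nodes, each with a private reading.
readings = np.array([17, 42, 8, 23])
n = len(readings)

# Every node produces one share per peer (row i = node i's shares).
share_matrix = np.array([share(v, n) for v in readings])

# Each node only sees the shares addressed to it (one column) and forwards their partial sum.
partial_sums = share_matrix.sum(axis=0) % modulus

# The cluster head adds the partial sums and recovers the aggregate (mod modulus),
# without ever seeing an individual reading.
aggregate = partial_sums.sum() % modulus
print("true sum :", readings.sum())
print("aggregate:", aggregate)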

7410 Study on Characterization of Tuncbilek Fly Ash

Authors: A.S. Kipcak, N. Baran Acarali, S. Kolemen, N. Tugrul, E. Moroydor Derun, S. Piskin

Abstract:

Fly ash is one of the residues generated in combustion and comprises the fine particles that rise with the flue gases; ash which does not rise is termed bottom ash [1]. In our country, it is expected that 50 million tons of waste ash per year will be generated by 2020. As is known, the waste released from thermal power plants causes very significant problems. Fly ash can be utilized as an adsorbent material. The purpose of this study is to investigate the possibility of using Tuncbilek fly ash as a low-cost adsorbent for heavy metal adsorption. First of all, the Tuncbilek fly ash was characterized. For this purpose, sieve analysis, XRD, XRF, SEM and FT-IR analyses were performed.

Keywords: Fly ash, heavy metal, sieve, adsorbent

7409 Energy Loss at Drops using Neuro Solutions

Authors: Farzin Salmasi

Abstract:

Energy dissipation at drops has been investigated using physical models. After determination of the effective parameters of the phenomenon, three drops with different heights were constructed from Plexiglas and installed in two existing flumes in the hydraulic laboratory. Several runs of the physical models were undertaken to measure the parameters required for determination of the energy dissipation. Results showed that the energy dissipation at drops depends on the drop height and discharge. Predicted relative energy dissipations varied from 10.0% to 94.3%. This work has also indicated that the energy loss at a drop is mainly due to the mixing of the jet with the pool behind the jet, which causes air bubble entrainment in the flow. A statistical model developed to predict the energy dissipation at vertical drops indicates a nonlinear correlation between the effective parameters. Furthermore, an artificial neural network (ANN) approach was used to develop an explicit procedure for calculating the energy loss at drops using NeuroSolutions. The trained network was able to predict the response with R2 of 0.977 and RMSE of 0.0085. The performance of the ANN was found to be effective when compared to regression equations in predicting the energy loss.

Keywords: Air bubble, drop, energy loss, hydraulic jump, NeuroSolutions
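
A minimal sketch of the ANN regression step, using scikit-learn in place of NeuroSolutions and synthetic (drop height, discharge, relative energy loss) data generated from a made-up monotonic relation; the laboratory measurements and the network topology from the study are not reproduced.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(10)
N = 500

# Synthetic stand-in for the physical-model measurements:
# drop height h [m] and unit discharge q [m^2/s] -> relative energy loss dE/E0.
h = rng.uniform(0.2, 0.8, N)
q = rng.uniform(0.01, 0.10, N)
dE = 1.0 - 1.0 / (1.0 + 2.5 * h / q ** 0.4)       # made-up monotonic relation
dE += rng.normal(0.0, 0.01, N)                    # measurement noise

X = np.column_stack([h, q])
X_tr, X_te, y_tr, y_te = train_test_split(X, dE, test_size=0.25, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)
print("R2  :", round(r2_score(y_te, pred), 3))
print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 4))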

7408 Application of Lattice Boltzmann Methods in Heat and Moisture Transfer in Frozen Soil

Authors: Wenyu Song, Bingxi Li, Zhongbin Fu, Bo Zhang

Abstract:

Although water accounts for only a small percentage of the total mass of soil, it plays an important role in the strength of the structure. Moisture transfer can occur by many different mechanisms, which may involve heat and mass transfer, thermodynamic phase change, and the interplay of various forces such as viscous, buoyancy, and capillary forces. Continuum models are not well suited for describing phenomena in which the connectivity of the pore space or the fracture network, or that of a fluid phase, plays a major role. However, lattice Boltzmann methods (LBMs) are especially well suited to simulate flows around complex geometries. Lattice Boltzmann methods were initially invented for solving fluid flows; recently, multicomponent fluids and phase change have also been included in the equations. By comparing the numerical results with experimental results, the lattice Boltzmann method with phase change will be optimized.

Keywords: Frozen soil, Lattice Boltzmann method, Phase change, Test rig.
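
A minimal sketch of a lattice Boltzmann solver, here a D1Q3 scheme for 1-D heat diffusion with fixed-temperature ends, to illustrate the collide-and-stream structure; the multicomponent, phase-change and moisture-coupling features discussed in the paper are not included, and the grid size and relaxation time are arbitrary assumptions.

import numpy as np

# D1Q3 lattice Boltzmann scheme for 1-D heat diffusion (lattice units).
nx, nsteps = 50, 20000
tau = 1.0                        # relaxation time -> thermal diffusivity D = (tau - 0.5) / 3
w = np.array([2/3, 1/6, 1/6])    # weights for lattice velocities {0, +1, -1}

T = np.zeros(nx)
f = w[:, None] * T[None, :]      # distribution functions f[i, x], initialized at equilibrium

for step in range(nsteps):
    T = f.sum(axis=0)                      # macroscopic temperature
    feq = w[:, None] * T[None, :]          # equilibrium distributions
    f -= (f - feq) / tau                   # BGK collision step
    f[1] = np.roll(f[1], 1)                # streaming: +1 population moves right
    f[2] = np.roll(f[2], -1)               # streaming: -1 population moves left
    f[:, 0] = w * 1.0                      # Dirichlet boundary: hot end held at T = 1
    f[:, -1] = w * 0.0                     # Dirichlet boundary: cold end held at T = 0

print("steady temperature profile (every 5th node):")
print(np.round(f.sum(axis=0)[::5], 3))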
