Search results for: Approximation Distribution Reductions in Multigranulation Rough Set Model
8917 Influence of IMV on Space Station
Authors: Fu Shiming, Pei Yifei
Abstract:
To study the impact of inter-module ventilation (IMV) on the space station, a Computational Fluid Dynamics (CFD) model under the influence of IMV, together with the mathematical model, boundary conditions and calculation method, is first established to analyze the influence of IMV on cabin air flow characteristics and velocity distribution; an integrated overall thermal mathematical model of the space station is then used to assess the impact of IMV on thermal management. The results show that IMV has a significant influence on the cabin air flow: an IMV flowrate within a certain range can effectively improve the air velocity distribution in the cabin, while an excessive flowrate may lead to its deterioration. IMV also affects the heat distribution among the different modules of the space station and thus its thermal management; the use of IMV can effectively maintain the temperature levels of the different modules and helps the space station dissipate waste heat.
Keywords: CFD, Environment control and life support, Space station, Thermal management, Thermal mathematical model.
8916 Minimization of Power Loss in Distribution Networks by Different Techniques
Authors: L. Ramesh, S. P. Chowdhury, S. Chowdhury, A. A. Natarajan, C. T. Gaunt
Abstract:
Accurate loss minimization is a critical component of efficient electrical distribution power flow. This work presents loss minimization in a power distribution system through feeder restructuring, incorporation of distributed generation (DG) and placement of capacitors. The study was conducted on an IEEE distribution network and an India Electricity Board benchmark distribution system. The experimental results for the Indian system are recommended to the board for practical implementation to achieve a regulated, stable output.
Keywords: Distribution system, distributed generation, loss minimization, network restructuring.
8915 Block Sorting: A New Characterization and a New Heuristic
Authors: Swapnoneel Roy, Ashok Kumar Thakur, Minhazur Rahman
Abstract:
The Block Sorting problem is to sort a given permutation by moving blocks. A block is defined as a substring of the given permutation which is also a substring of the identity permutation. Block Sorting has been proved to be NP-hard. To date, two different 2-approximation algorithms have been presented for block sorting, and these remain the best known algorithms for the problem. In this work we present a different characterization of Block Sorting in terms of a transposition cycle graph. We then suggest a heuristic, which we show to exhibit a 2-approximation performance guarantee for most permutations.
Keywords: Block sorting, optical character recognition, genome rearrangements, sorting primitives, approximation algorithms.
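To make the block definition concrete, here is a small illustrative sketch (an assumption for exposition, not the authors' heuristic) that decomposes a permutation into its maximal blocks, i.e. maximal runs of consecutive increasing values.

```python
def blocks(perm):
    """Split a permutation into maximal blocks: substrings that are
    also substrings of the identity permutation (runs i, i+1, i+2, ...)."""
    result, current = [], [perm[0]]
    for x in perm[1:]:
        if x == current[-1] + 1:   # extends the current run of consecutive values
            current.append(x)
        else:                      # run breaks; start a new block
            result.append(current)
            current = [x]
    result.append(current)
    return result

# Example: the permutation [3, 4, 1, 2, 5] contains the blocks [3, 4], [1, 2], [5].
print(blocks([3, 4, 1, 2, 5]))
```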
8914 Iteration Acceleration for Nonlinear Coupled Parabolic-Hyperbolic System
Authors: Xia Cui, Guang-wei Yuan, Jing-yan Yue
Abstract:
A Picard-Newton iteration method is studied to accelerate the numerical solution procedure for a class of two-dimensional nonlinear coupled parabolic-hyperbolic systems. The Picard-Newton iteration is designed by adding higher-order small-quantity terms to an existing Picard iteration. Discrete functional analysis and inductive hypothesis reasoning techniques are used to overcome difficulties arising from the nonlinearity and coupling, and a theoretical analysis is given for the convergence and approximation properties of the iteration scheme. The Picard-Newton iteration has a quadratic convergence rate, and its solution has second-order spatial approximation and first-order temporal approximation to the exact solution of the original problem. Numerical tests verify the theoretical analysis and show that the Picard-Newton iteration is more efficient than the Picard iteration.
Keywords: Nonlinearity, iterative acceleration, coupled parabolic-hyperbolic system, quadratic convergence, numerical analysis.
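As a one-dimensional illustration of the convergence-order difference (a toy scalar equation chosen for exposition, not the paper's coupled parabolic-hyperbolic system), the sketch below compares a Picard fixed-point iteration with a Newton iteration on x = cos x; the Newton error shrinks quadratically while the Picard error shrinks only linearly.

```python
import math

# Solve x = cos(x) (equivalently f(x) = x - cos(x) = 0), starting from x0 = 1.0.
root = 0.7390851332151607  # reference value of the fixed point

x_picard = x_newton = 1.0
for k in range(1, 6):
    x_picard = math.cos(x_picard)                          # Picard: x_{k+1} = g(x_k)
    f, df = x_newton - math.cos(x_newton), 1 + math.sin(x_newton)
    x_newton -= f / df                                     # Newton: quadratic convergence
    print(f"step {k}: Picard error {abs(x_picard - root):.2e}, "
          f"Newton error {abs(x_newton - root):.2e}")
```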
8913 Dynamic Economic Dispatch Constrained by Wind Power Weibull Distribution: A Here-and-Now Strategy
Authors: Mostafa A. Elshahed, Magdy M. Elmarsfawy, Hussain M. Zain Eldain
Abstract:
In this paper, a Dynamic Economic Dispatch (DED) model is developed for a system consisting of both thermal generators and wind turbines. The inclusion of a significant amount of wind energy into power systems has resulted in additional constraints on DED to accommodate the intermittent nature of the output. The probability of stochastic wind power, based on the Weibull probability density function, is included in the model as a constraint, i.e. a here-and-now approach. The Environmental Protection Agency's hourly emission target, which gives the maximum emission during the day, is used as a constraint to reduce atmospheric pollution. A 69-bus test system with a non-smooth cost function is used to illustrate the effectiveness of the proposed model compared with a static economic dispatch model that includes the wind power.
Keywords: Dynamic Economic Dispatch, Stochastic Optimization, Weibull Distribution, Wind Power.
8912 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter (bi-)Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance in its application to real data, with the use of convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. Given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trend functions, two-parameter Weibull density function.
8911 The Influence of Beta Shape Parameters in Project Planning
Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou
Abstract:
Networks can be utilized to represent project planning problems, using nodes for activities and arcs to indicate the precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project, together with the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes the above model by incorporating uncertainty, allowing activity durations to be random variables, but it nevertheless produces a relatively crude solution to planning problems. In this paper, based on the findings of the relevant literature, which strongly suggests that a Beta distribution can be employed to model earthmoving activities, we utilize Monte Carlo simulation to estimate the project completion time distribution and to measure the influence of skewness, an element inherent in activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimates.
Keywords: Beta distribution, PERT, Monte Carlo Simulation, skewness, project completion time distribution.
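A minimal sketch of this approach (the three-activity network, Beta parameters and sample size are illustrative assumptions, not the paper's case study): Beta-distributed activity durations are sampled, propagated through a small precedence network by Monte Carlo simulation, and used to estimate the completion time distribution and activity criticality indices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical activities: (alpha, beta, low, high) of a scaled Beta duration.
acts = {"A": (2.0, 5.0, 2.0, 8.0),   # skewed right
        "B": (5.0, 2.0, 3.0, 9.0),   # skewed left
        "C": (4.0, 4.0, 1.0, 5.0)}   # roughly symmetric

d = {k: lo + (hi - lo) * rng.beta(a, b, N) for k, (a, b, lo, hi) in acts.items()}

# Precedence: A and B run in parallel, C follows both (two paths: A-C and B-C).
path_AC = d["A"] + d["C"]
path_BC = d["B"] + d["C"]
completion = np.maximum(path_AC, path_BC)        # project completion time samples

print("mean completion:", completion.mean())
print("95th percentile:", np.quantile(completion, 0.95))
# Criticality index: fraction of runs in which each activity lies on the critical path.
print("criticality A:", np.mean(path_AC >= path_BC))
print("criticality B:", np.mean(path_BC > path_AC))
print("criticality C:", 1.0)                     # C is on every path
```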
8910 Optical and Double Folding Analysis for 6Li+16O Elastic Scattering
Authors: Abd Elrahman Elgamala, N. Darwish, I. Bondouk, Sh. Hamada
Abstract:
Available experimental angular distributions for 6Li elastically scattered from the 16O nucleus in the energy range 13.0–50.0 MeV are investigated and reanalyzed using the optical model with a conventional phenomenological potential and also using the double folding optical model with different interaction models: DDM3Y1, CDM3Y1, CDM3Y2, and CDM3Y3. All the interaction models involved are of the M3Y Paris type except DDM3Y1, which is of the M3Y Reid type; the main difference between them lies in the different values of the parameters of the incorporated density distribution function F(ρ). We have extracted the renormalization factor NR for the 6Li+16O nuclear system in the energy range 13.0–50.0 MeV using the aforementioned interaction models.
Keywords: Elastic scattering, optical model, folding potential, density distribution.
8909 Reliability Analysis in Electrical Distribution System Considering Preventive Maintenance Applications on Circuit Breakers
Authors: Mahmud Fotuhi-Firuzabad, Saeed Afshar
Abstract:
This paper presents the results of a preventive-maintenance-based study and the modeling of failure rates of breakers in electrical distribution systems, a critical issue in the reliability assessment of a system. In the analysis conducted in this paper, the impacts of failure rate variations caused by preventive maintenance are examined as part of a Reliability Centered Maintenance (RCM) application program. A number of load point reliability indices are derived using the mathematical model of the failure rate, which is established using data observed in a distribution system.
Keywords: Reliability-Centered Maintenance (RCM), failure rate, preventive maintenance (PM), Distribution System Reliability.
8908 Blind Low Frequency Watermarking Method
Authors: Dimitar Taskovski, Sofija Bogdanova, Momcilo Bogdanov
Abstract:
We present a low-frequency watermarking method adaptive to image content. The image content is analyzed and properties of the human visual system (HVS) are exploited to generate a visual mask of the same size as the approximation image. Using this mask, we embed the watermark in the approximation image without degrading the image quality. Watermark detection is performed without using the original image. Experimental results show that the proposed watermarking method is robust against the most common image processing operations; it can be easily implemented and usually does not degrade the image quality.
Keywords: Blind, digital watermarking, low frequency, visual mask.
8907 Buckling Analysis of Rectangular Plates under the Combined Action of Shear and Uniaxial Stresses
Authors: V. Piscopo
Abstract:
In the classical buckling analysis of rectangular plates subjected to the concurrent action of shear and uniaxial forces, the Euler shear buckling stress is generally evaluated separately, so that no influence of the in-plane tensile or compressive forces on the shear buckling coefficient is taken into account. In this paper the buckling problem of simply supported rectangular plates, under the combined action of shear and uniaxial forces, is discussed from the beginning, in order to obtain new design formulas for the shear buckling coefficient that take into account the presence of uniaxial forces. Furthermore, as the classical expression of the shear buckling coefficient for simply supported rectangular plates is only a “rough” approximation, the exact one being defined by a system of intersecting curves, the convergence and the accuracy of the classical solution are analyzed, too. Finally, as the evaluation of the Euler shear buckling stress is a very important topic for a variety of structures (e.g. ship structures), two numerical applications are carried out, in order to highlight the role of the uniaxial stresses in plating scantling procedures and the accuracy of the proposed formulas.
Keywords: Buckling analysis, shear, uniaxial stresses.
8906 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption due to the limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted with different power models, uniform power and non-uniform power (with or without power control), and different antenna models, omni-directional and directional. In this survey article, as the problem has been proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in these various models, using latency as the performance measure.
Keywords: Data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional.
8905 Estimation of R = P[Y < X] for Two-parameter Burr Type XII Distribution
Abstract:
In this article, we consider the estimation of P[Y < X] when the strength X and the stress Y are two independent variables following the Burr Type XII distribution. The MLE of R based on one simple iterative procedure is obtained. Assuming that the common parameter is known, the maximum likelihood estimator, the uniformly minimum variance unbiased estimator and the Bayes estimator of P[Y < X] are discussed. The exact confidence interval of R is also obtained. Monte Carlo simulations are performed to compare the different proposed methods.
Keywords: Stress-Strength model, Maximum likelihood estimator, Bayes estimator, Burr type XII distribution.
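A minimal sketch of the known-common-parameter case (sample size and parameter values are illustrative assumptions): for Burr XII variables with CDF F(t) = 1 − (1 + t^c)^(−k) sharing the shape c, R = P[Y < X] reduces to k_Y / (k_X + k_Y), and the MLE of each k has a closed form, so the MLE of R follows by invariance.

```python
import numpy as np

rng = np.random.default_rng(1)
c = 2.0                      # common (known) shape parameter
k_x, k_y = 1.5, 3.0          # shape parameters of strength X and stress Y
n = 500

# Burr XII sampling by inversion: F(t) = 1 - (1 + t^c)^(-k).
def sample_burr12(c, k, size):
    u = rng.uniform(size=size)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

x = sample_burr12(c, k_x, n)   # strength sample
y = sample_burr12(c, k_y, n)   # stress sample

# With c known, the MLE of each shape parameter has the closed form
# k_hat = n / sum(log(1 + t^c)).
k_x_hat = n / np.sum(np.log1p(x ** c))
k_y_hat = n / np.sum(np.log1p(y ** c))

# For two Burr XII variables sharing c, R = P[Y < X] = k_y / (k_x + k_y),
# so the MLE of R follows by invariance.
R_true = k_y / (k_x + k_y)
R_hat = k_y_hat / (k_x_hat + k_y_hat)
print(f"true R = {R_true:.3f}, estimated R = {R_hat:.3f}")
```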
8904 Normalizing Logarithms of Realized Volatility in an ARFIMA Model
Authors: G. L. C. Yap
Abstract:
Realized volatility computed from high-frequency returns is popular as it is an unbiased and efficient estimator of return volatility. A computationally simple model is to fit the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies the parameter estimation using the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and there may be a need to normalize the financial series. Based on the empirical indices S&P500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared to its existing counterpart. The empirical results show that, by including normalization as a pre-treatment procedure, the model outperforms its existing counterpart in terms of both statistical and economic evaluations.
Keywords: Long-memory, Gaussian process, Whittle estimator, normalization, volatility, value-at-risk.
8903 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets
Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi
Abstract:
In this paper, we first introduce the stable distribution, the stable process and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data which are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the alpha-stable distribution in telecommunications and computer science, such as network delays and signal processing, and in financial markets. Finally, we focus on using the stable distribution to estimate measures of risk in stock markets and show simulated data with statistical software.
Keywords: stable distribution, SaS, infinite variance, heavy tail networks, VaR.
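As a small sketch of the risk-measure application (the parameter values are illustrative assumptions, not fitted market data), the code below simulates heavy-tailed returns from an α-stable law with SciPy and compares the empirical Value-at-Risk with the quantile of the assumed stable model and of a Gaussian with the same scale.

```python
import numpy as np
from scipy.stats import levy_stable, norm

rng = np.random.default_rng(2)
alpha, beta, loc, scale = 1.7, 0.0, 0.0, 0.01   # hypothetical daily-return parameters

returns = levy_stable.rvs(alpha, beta, loc=loc, scale=scale, size=10_000,
                          random_state=rng)

level = 0.01                                     # 99% Value-at-Risk
var_empirical = -np.quantile(returns, level)
var_stable = -levy_stable.ppf(level, alpha, beta, loc=loc, scale=scale)
var_gaussian = -norm.ppf(level, loc=loc, scale=scale)

print(f"empirical VaR(99%): {var_empirical:.4f}")
print(f"stable-model VaR(99%): {var_stable:.4f}")
print(f"Gaussian VaR(99%) with same scale: {var_gaussian:.4f}")  # understates tail risk
```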
8902 Dynamic Features Selection for Heart Disease Classification
Authors: Walid MOUDANI
Abstract:
The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in data. In fact, valuable knowledge can be discovered by applying data mining techniques in healthcare systems. In this study, a proficient methodology is presented for the extraction of significant patterns from coronary heart disease data warehouses for heart attack prediction, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough sets technique associated with dynamic programming. We then validate the classification using a Random Forest (RF) decision tree ensemble to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, expert knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated based on a set of benchmark techniques applied to this classification problem.
Keywords: Multi-classifier decision tree, feature reduction, dynamic programming, rough sets.
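A minimal sketch of the classification step (the synthetic feature matrix and scikit-learn model are illustrative assumptions, not the study's clinical data or exact pipeline): after feature reduction, a Random Forest is trained on the retained features and evaluated on held-out cases.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)

# Hypothetical reduced feature matrix for 525 patients (e.g. age, blood pressure,
# cholesterol, ...) with a binary "risky heart disease" label.
X = rng.normal(size=(525, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=525) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
print("feature importances:", np.round(clf.feature_importances_, 3))
```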
8901 A Bathtub Curve from Nonparametric Model
Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos
Abstract:
This paper presents a nonparametric method to obtain the hazard rate “bathtub curve” for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR) and the increasing failure rate (IFR), represented by three parametric Weibull models. The parameters are obtained from a simultaneous fitting of the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, the historic time-to-failure data of distribution transformers are used as an example. The resulting bathtub curve shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
Keywords: Bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution.
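A small sketch of the idea (the additive combination of three Weibull hazard terms is one common way to realize the three phases and is an assumption here, as are the synthetic data standing in for the kernel hazard estimate): a parametric bathtub hazard is fitted to a noisy empirical hazard curve by least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def bathtub_hazard(t, b1, e1, e2, b3, e3):
    """Additive combination of a DFR term (b1 < 1), a CFR term (beta = 1)
    and an IFR term (b3 > 1)."""
    return (weibull_hazard(t, b1, e1)
            + 1.0 / e2                      # constant failure rate phase
            + weibull_hazard(t, b3, e3))

rng = np.random.default_rng(4)
t = np.linspace(0.5, 40.0, 80)                           # years in service (hypothetical)
true = bathtub_hazard(t, 0.5, 2.0, 25.0, 4.0, 35.0)      # assumed "true" curve
observed = true * rng.lognormal(sigma=0.1, size=t.size)  # stand-in for a kernel estimate

popt, _ = curve_fit(bathtub_hazard, t, observed,
                    p0=[0.6, 1.0, 20.0, 3.0, 30.0],
                    bounds=([0.05, 0.1, 1.0, 1.0, 1.0],
                            [1.0, 50.0, 100.0, 10.0, 100.0]))
print("fitted parameters (b1, e1, e2, b3, e3):", np.round(popt, 2))
```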
8900 Fuzzy Shortest Paths Approximation for Solving the Fuzzy Steiner Tree Problem in Graphs
Authors: Miloš Šeda
Abstract:
In this paper, we deal with the Steiner tree problem (STP) on a graph in which a fuzzy number, instead of a real number, is assigned to each edge. We propose a modification of the shortest paths approximation based on fuzzy shortest path (FSP) evaluations. Since a fuzzy min operation using the extension principle leads to nondominated solutions, we propose another approach to solving the FSP using Cheng's centroid point fuzzy ranking method.
Keywords: Steiner tree, single shortest path problem, fuzzy ranking, binary heap, priority queue.
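A minimal sketch of a fuzzy shortest path computation (the graph, the triangular fuzzy edge weights, and the use of the centroid abscissa as a simplified stand-in for Cheng's centroid-point ranking are all illustrative assumptions): Dijkstra's algorithm with a binary heap, where fuzzy path lengths are added componentwise and compared by their crisp centroid values.

```python
import heapq

# A triangular fuzzy number is represented as (low, mode, high).
def fuzzy_add(p, q):
    return (p[0] + q[0], p[1] + q[1], p[2] + q[2])

def centroid(p):
    """Centroid abscissa of a triangular fuzzy number, used as a crisp ranking value."""
    return sum(p) / 3.0

def fuzzy_dijkstra(graph, source):
    """Dijkstra's algorithm over fuzzy edge weights, comparing labels by centroid."""
    dist = {source: (0.0, 0.0, 0.0)}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > centroid(dist[u]):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            cand = fuzzy_add(dist[u], w)
            if v not in dist or centroid(cand) < centroid(dist[v]):
                dist[v] = cand
                heapq.heappush(heap, (centroid(cand), v))
    return dist

# Hypothetical graph: edge weights are triangular fuzzy travel costs.
graph = {"a": [("b", (1, 2, 3)), ("c", (2, 4, 6))],
         "b": [("c", (1, 1, 2)), ("d", (4, 5, 7))],
         "c": [("d", (1, 2, 2))]}
print(fuzzy_dijkstra(graph, "a"))   # fuzzy distances from "a" to each reachable node
```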
8899 Modeling and Identification of Hammerstein System by using Triangular Basis Functions
Authors: K. Elleuch, A. Chaari
Abstract:
This paper deals with the modeling and parameter identification of nonlinear systems described by a Hammerstein model having a piecewise nonlinear characteristic, such as a dead-zone nonlinearity. The simultaneous use of both an easy decomposition technique and triangular basis functions leads to a particular form of the Hammerstein model. The approximation of the static nonlinear block using triangular basis functions leads to a linear regressor model, so that least squares techniques can be used for the parameter estimation. The Singular Value Decomposition (SVD) technique has been applied to separate the coupled parameters. The proposed approach has been efficiently tested on academic simulation examples.
Keywords: Identification, Hammerstein model, piecewise nonlinear characteristic, dead-zone nonlinearity, triangular basis functions, singular value decomposition.
8898 The Current Situation of Ang Thong Province’s Court Doll Distribution
Authors: P. Waiyawuththanapoom
Abstract:
This research aims to study the pattern and channels of distribution of Ang Thong's court doll OTOP product and to improve the quality of distribution of the court doll product. The population of this research comprises 50 manufacturers of Ang Thong's court dolls. The data and information were collected using a questionnaire, with percentage, mean and standard deviation as the analysis tools. The distribution channels of Ang Thong's court doll can be separated into three channels: direct distribution by the manufacturer, distribution via a middleman, and distribution via the co-operated manufacturing group. For direct distribution by the manufacturer, it was found that the manufacturers give the highest importance to how they keep their inventory. For distribution via a middleman, the manufacturers give the highest importance to distribution efficiency, while for distribution via the co-operated manufacturing group, they give the highest importance to public relations.
Keywords: Distribution, Court Doll, Ang Thong Province.
8897 Performance of Bridge Girder with Perforations under Tsunami Wave Loading
Authors: Sadia Rahman, Shatirah Akib, M. T. R. Khan, R. Triatmadja
Abstract:
Tsunami disasters pose a great threat to coastal infrastructure. Bridges without adequate provisions for earthquake and tsunami loading are generally vulnerable to tsunami attack. During the last two disastrous tsunami events (the Indian Ocean and Japan tsunamis), a number of bridges were observed to sustain damage from tsunami waves. In this study, laboratory experiments were conducted to study the effect of perforations in a bridge girder on force reduction. The results showed that a significant amount of force was reduced by using perforations in the girder: approximately 10% to 18% force reduction was achieved with about 16% perforation of the bridge girder. This level of force reduction indicates that girder perforations are effective in reducing tsunami forces, as they let water pass through. Thus, less bridge damage is expected with the presence of perforations in girders during a tsunami.
Keywords: Bridge, force, girder, perforation, tsunami, wave.
8896 Design of the Mathematical Model of the Respiratory System Using Electro-acoustic Analogy
Authors: M. Rozanek, K. Roubik
Abstract:
The article deals with the development, design and implementation of a mathematical model of the human respiratory system. The model is designed to simulate the distribution of important intrapulmonary parameters along the bronchial tree, such as pressure amplitude and tidal volume, and the effect of regional mechanical lung properties on the efficiency of various ventilatory techniques. Therefore, exact agreement of the model structure with the anatomical structure of the lung is required. The model is based on lung morphology, and an electro-acoustic analogy is used to design the model.
Keywords: Model of the respiratory system, total lung impedance, intrapulmonary parameters.
8895 Thermal Analysis of the Current Path from Circuit Breakers Using Finite Element Method
Authors: Adrian T. Plesca
Abstract:
This paper describes a three-dimensional thermal model of the current path included in low voltage power circuit breakers. The model can be used to analyse the thermal behaviour of the current path during both steady-state and transient conditions. The lengthwise temperature distribution of the current path and the time-current characteristic of the terminal connections of the power circuit breaker have been obtained. The influence of the electric current and of the voltage drop on the main electric contact of the circuit breaker has been investigated. To validate the three-dimensional thermal model, experimental tests have been carried out, and there is a good correlation between the experimental and simulation results.
Keywords: Current path, power circuit breakers, temperature distribution, thermal analysis.
8894 Control Technology for a Daily Load-following Operation in a Nuclear Power Plant
Authors: Keuk Jong Yu, Sang Hee Kang, Sung Chang You
Abstract:
In Korea, the technology for a load-following operation of a nuclear power plant has been under development. An automatic controller which is able to control the reactor temperature and the axial power distribution was developed. It consists of a system identification algorithm and a model predictive controller: the former transforms the nuclear reactor status into numerical model parameters, and the latter uses them to generate manipulated values, such as those for two kinds of control rods. With this automatic controller, the performance of a daily load-following operation was evaluated. As a result, the automatic controller, using the generated model parameters of the nuclear reactor, was able to drive the nuclear reactor average temperature and axial power distribution to the desired targets during a daily load-following operation.
Keywords: Axial power distribution, model predictive control, reactor temperature, system identification.
8893 Probabilistic Model Development for Project Performance Forecasting
Authors: Milad Eghtedari Naeini, Gholamreza Heravi
Abstract:
In this paper, based on past project cost and time performance, a model for forecasting project cost performance is developed. This study presents a probabilistic project control concept to assure an acceptable forecast of project cost performance. In this concept, project activities are classified into sub-groups, entitled control accounts. A Stochastic S-Curve (SS-Curve) is then obtained for each sub-group, and the project SS-Curve is obtained by summing the sub-groups' SS-Curves. In this model, project cost uncertainties are considered through Beta distribution functions of the project activity costs required to complete the project at selected time sections of project accomplishment, which are extracted from a variety of sources. Based on this model, after a percentage of project progress, the project performance is measured via Earned Value Management to adjust the primary cost probability distribution functions. The future project cost performance is then predicted using the Monte Carlo simulation method.
Keywords: Monte Carlo method, probabilistic model, project forecasting, stochastic S-curve.
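A minimal sketch of the stochastic S-curve idea (the two control accounts, their Beta cost parameters and the period structure are illustrative assumptions, not the paper's data): cumulative cost at each time section is sampled from Beta-distributed period costs, per-account SS-curves are summed into a project SS-curve, and percentiles give the probabilistic cost envelope.

```python
import numpy as np

rng = np.random.default_rng(5)
N, periods = 20_000, 6

# Hypothetical control accounts: per-period cost ~ scaled Beta(a, b) on [low, high].
accounts = {"civil works": (2.0, 3.0, 80.0, 140.0),
            "equipment":   (3.0, 2.0, 50.0, 90.0)}

project_cum = np.zeros((N, periods))
for a, b, lo, hi in accounts.values():
    period_costs = lo + (hi - lo) * rng.beta(a, b, size=(N, periods))
    project_cum += np.cumsum(period_costs, axis=1)   # add this account's SS-curve

# Percentiles of cumulative cost at each time section form the stochastic S-curve.
p10, p50, p90 = np.percentile(project_cum, [10, 50, 90], axis=0)
for t in range(periods):
    print(f"period {t + 1}: P10={p10[t]:.0f}  P50={p50[t]:.0f}  P90={p90[t]:.0f}")
```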
8892 Using the Polynomial Approximation Algorithm in the Algorithm 2 for Manipulator's Control in an Unknown Environment
Authors: Pavel K. Lopatin, Artyom S. Yegorov
Abstract:
The Algorithm 2 for the movement of an n-link manipulator amidst arbitrary unknown static obstacles is presented, for the case when a sensor system supplies information about local neighborhoods of different points in the configuration space. The Algorithm 2 guarantees that a target position is reached in a finite number of steps, and it reduces to a finite number of calls of a subroutine for planning a trajectory in the presence of known forbidden states. The polynomial approximation algorithm which is used as this subroutine is presented, and the results of the Algorithm 2 implementation are given.
Keywords: Manipulator, trajectory planning, unknown obstacles.
8891 Adaptive Kalman Filter for Noise Estimation and Identification with Bayesian Approach
Authors: Farhad Asadi, S. Hossein Sadati
Abstract:
The Bayesian approach can be used for parameter identification and extraction in state space models, and its ability to analyze sequences of data in dynamical systems has been demonstrated in the literature. In this paper, an adaptive Kalman filter with a Bayesian approach for the identification of measurement noise variances is developed and applied to the estimation of the dynamical state and measurement data in a discrete linear dynamical system. At each time step, the algorithm estimates the measurement noise variance and the state of the system with a Kalman filter. An approximation is then designed at each step separately, and consequently the sufficient statistics of the state and noise variances are computed with a fixed-point iteration of the adaptive Kalman filter. Several simulations are performed to show the influence of the measurement noise variance on the algorithm. First, the effect of the noise variance and its distribution on detection and identification performance is simulated for a Kalman filter without the Bayesian formulation. Then, the simulation is applied to the adaptive Kalman filter with the ability to track the noise variance in the measurement data. In these simulations, the influence of the noise distribution of the measurement data at each step is estimated, and the true variance of the data obtained by the algorithm is compared across different scenarios. Afterwards, a typical nonlinear state space model with noisy measurements is simulated with this approach. Finally, the performance and the important limitations of the algorithm in these simulations are explained.
Keywords: Adaptive filtering, Bayesian approach, Kalman filtering, variance tracking.
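A simplified one-dimensional sketch of noise-variance tracking (the random-walk dynamics, noise levels and forgetting factor are illustrative assumptions, and the innovation-based recursive update below is a common stand-in for, not the paper's exact Bayesian fixed-point iteration): the filter estimates the state while adapting its measurement noise variance from the innovations.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400
Q, R_true = 1e-3, 0.25          # true process and measurement noise variances

# Simulate a scalar random-walk state observed in noise.
x = np.cumsum(rng.normal(scale=np.sqrt(Q), size=n))
z = x + rng.normal(scale=np.sqrt(R_true), size=n)

x_hat, P, R_hat = 0.0, 1.0, 1.0     # initial state, covariance, noise-variance guess
rho = 0.02                          # forgetting factor for the variance update
est, R_track = [], []

for k in range(n):
    P_pred = P + Q                  # prediction step (identity dynamics)
    nu = z[k] - x_hat               # innovation
    S = P_pred + R_hat
    K = P_pred / S
    x_hat = x_hat + K * nu
    P = (1.0 - K) * P_pred
    # Innovation-based adaptation: E[nu^2] = P_pred + R,
    # so nu^2 - P_pred is a noisy sample of R; smooth it recursively.
    R_hat = max(1e-6, (1.0 - rho) * R_hat + rho * (nu ** 2 - P_pred))
    est.append(x_hat)
    R_track.append(R_hat)

print("final R estimate:", R_track[-1], "(true:", R_true, ")")
print("state RMSE:", np.sqrt(np.mean((np.array(est) - x) ** 2)))
```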
8890 Nonlinear Model Predictive Control of Water Quality in Drinking Water Distribution Systems with DBPs Objectives
Authors: Mingyu Xie, Mietek Brdys
Abstract:
The paper develops a Non-Linear Model Predictive Control (NMPC) of water quality in Drinking Water Distribution Systems (DWDS) based on an advanced non-linear quality dynamics model including disinfection by-products (DBPs). Special attention is paid to the analysis of the impact of the flow trajectories prescribed by the upper control level of the recently developed two-time-scale architecture for integrated quality and quantity control in DWDS. The new quality controller is to operate within this architecture in the fast time scale as the lower-level quality controller. The controller performance is validated by a comprehensive simulation study based on an example case-study DWDS.
Keywords: Model predictive control, hierarchical control structure, genetic algorithm, water quality with DBPs objectives.
8889 Classification of Extreme Ground-Level Ozone Based on Generalized Extreme Value Model for Air Monitoring Station
Authors: Siti Aisyah Zakaria, Nor Azrita Mohd Amin, Noor Fadhilah Ahmad Radi, Nasrul Hamidin
Abstract:
Higher ground-level ozone (GLO) concentrations adversely affect human health and vegetation, as well as activities in the ecosystem. In Malaysia, most analyses of GLO concentration are carried out using the average value of the GLO concentration, which refers to the centre of the distribution, to make a prediction or estimation. However, analysis which focuses on the higher or extreme values of GLO concentration is rarely explored. Hence, the objective of this study is to classify the tail behaviour of GLO using the generalized extreme value (GEV) distribution and to estimate the return level using the corresponding type (Gumbel, Weibull, or Frechet) of the GEV distribution. The results show that the Weibull distribution, which is also known as a short-tailed distribution and considered as having less extreme behaviour, is the best-fitted distribution for four selected air monitoring stations in Peninsular Malaysia, namely Larkin, Pelabuhan Kelang, Shah Alam, and Tanjung Malim, while the Gumbel distribution, which is considered a medium-tailed distribution, is the best-fitted distribution for the Nilai station. The return level of GLO concentration at the Shah Alam station is comparatively higher than at the other stations. Overall, return levels increase with increasing return periods, but the increment depends on the type of tail of the GEV distribution. We conduct this study by using the maximum likelihood estimation (MLE) method to estimate the parameters at the selected stations in Peninsular Malaysia. Next, the validation of the fitted block maxima series against the GEV distribution is performed using a probability plot, a quantile plot and the likelihood ratio test. The profile likelihood confidence interval is tested to verify the type of GEV distribution. These results are important as a guide for early notification of future extreme ozone events.
Keywords: Extreme value theory, generalized extreme value distribution, ground-level ozone, return level.
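A minimal sketch of the fitting and return-level step (the synthetic annual-maximum series is an illustrative assumption, not the Malaysian monitoring data): block maxima are fitted to a GEV with SciPy and the T-period return level is read off as the (1 − 1/T) quantile. Note that SciPy's shape parameter c is the negative of the usual GEV shape ξ, so c > 0 corresponds to the Weibull type and c < 0 to the Frechet type.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Hypothetical series of annual maximum ozone concentrations (ppb).
true_c, true_loc, true_scale = 0.1, 100.0, 12.0         # Weibull-type tail (c > 0)
maxima = genextreme.rvs(true_c, loc=true_loc, scale=true_scale, size=40,
                        random_state=rng)

c_hat, loc_hat, scale_hat = genextreme.fit(maxima)
print(f"fitted shape c = {c_hat:.3f} "
      f"({'Weibull' if c_hat > 0 else 'Gumbel/Frechet'} type)")

for T in (10, 50, 100):
    # T-period return level: value exceeded on average once every T blocks.
    z_T = genextreme.ppf(1.0 - 1.0 / T, c_hat, loc=loc_hat, scale=scale_hat)
    print(f"{T}-year return level: {z_T:.1f} ppb")
```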
8888 Lateral Pressure in Squat Silos under Eccentric Discharge
Authors: Y. Z. Zhu, S. P. Meng, W. W. Sun
Abstract:
The influence of eccentric discharge of stored solids in squat silos has received considerable attention from researchers. However, the method for calculating lateral pressure under eccentric flow still needs to be studied in depth. In particular, the lateral pressure distribution on the vertical wall cannot be accurately described, mainly because of its asymmetry. In order to build a mechanical model of the lateral pressure, the flow channel and the flow pattern of the stored solids in a squat silo are studied. In this paper, based on Janssen's theory, a method for calculating the lateral static pressure in squat silos after eccentric discharge is proposed, and calculation formulae are deduced for each of three possible cases. The method also focuses on the unsymmetrical distribution characteristic of the normal pressure on the silo wall. A finite element model is used to analyse and compare the results for the lateral pressure, and the numerical results illustrate the practicability of the theoretical method.
Keywords: Squat silo, eccentric discharge, lateral pressure, asymmetric distribution.
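As background for the Janssen-based approach (the silo geometry and material properties below are illustrative assumptions, and the formula is the classical symmetric Janssen solution, not the authors' eccentric-discharge extension), the sketch evaluates the lateral static pressure along the wall depth of a circular silo.

```python
import numpy as np

# Classical Janssen lateral pressure in a circular silo (symmetric filling):
#   p_h(z) = (gamma * D) / (4 * mu) * (1 - exp(-4 * K * mu * z / D))
# where gamma is the bulk unit weight (kN/m^3), D the silo diameter (m),
# mu the wall friction coefficient and K the lateral pressure ratio.
def janssen_lateral_pressure(z, gamma=9.0, D=10.0, mu=0.4, K=0.5):
    return gamma * D / (4.0 * mu) * (1.0 - np.exp(-4.0 * K * mu * z / D))

depths = np.linspace(0.0, 12.0, 7)          # depth below the solid surface, m
for z, p in zip(depths, janssen_lateral_pressure(depths)):
    print(f"z = {z:4.1f} m  ->  lateral pressure = {p:5.1f} kPa")
```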