Search results for: Pareto Probability Distribution function.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4191

4131 Steering Velocity Bounded Mobile Robots in Environments with Partially Known Obstacles

Authors: Reza Hossseynie, Amir Jafari

Abstract:

This paper presents a method for steering velocity-bounded mobile robots in environments with partially known stationary obstacles. The exact locations of the obstacles are unknown; only a probability distribution associated with each obstacle's location is known. The kinematic model of a two-wheeled differential drive robot is used as the model of the mobile robot. The presented control strategy uses the Artificial Potential Field (APF) method to devise a desired direction of movement for the robot at each instant of time, while the Constrained Directions Control (CDC) uses the generated direction to produce the control signals required for steering the robot. The location of each obstacle is taken to be the mean of its 2D location probability distribution and, similarly, the magnitude of the electric charge in the APF is set to the trace of the covariance matrix of the location probability distribution. The method not only captures the challenge of planning the path (i.e., the probabilistic nature of the locations of unknown obstacles), but also addresses output saturation, which is an important issue from the control perspective. Moreover, the velocity of the robot can be controlled during steering; for example, it can be reduced in the close vicinity of obstacles and the target to ensure safety. Finally, the control strategy is simulated for different scenarios to show how the method can be put into practice.
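
The mapping from obstacle uncertainty to an APF "charge" described above can be illustrated with a minimal sketch; the attractive gain, repulsive cutoff distance and potential forms below are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def apf_direction(robot_pos, goal_pos, obstacle_means, obstacle_covs,
                      k_att=1.0, d0=3.0):
        """Desired motion direction from an artificial potential field where each
        obstacle's repulsive charge is the trace of its location covariance (a sketch)."""
        # Attractive force pulls the robot toward the goal.
        force = k_att * (goal_pos - robot_pos)
        for mu, cov in zip(obstacle_means, obstacle_covs):
            charge = np.trace(cov)          # uncertainty-scaled repulsive charge
            diff = robot_pos - mu
            d = np.linalg.norm(diff)
            if 1e-9 < d < d0:
                # Standard repulsive APF gradient, scaled by the charge.
                force += charge * (1.0 / d - 1.0 / d0) * diff / d**3
        return force / (np.linalg.norm(force) + 1e-12)   # unit desired direction

    # Example: one uncertain obstacle between the robot and the goal.
    direction = apf_direction(np.array([0.0, 0.0]), np.array([5.0, 0.0]),
                              [np.array([2.5, 0.2])], [np.diag([0.3, 0.3])])
    print(direction)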

Keywords: Steering, obstacle avoidance, mobile robots, constrained directions control, artificial potential field.

4130 Deformation of Water Waves by Geometric Transitions with Power Law Function Distribution

Authors: E. G. Bautista, J. M. Reyes, O. Bautista, J. C. Arcos

Abstract:

In this work, we analyze the deformation of surface waves under shallow-flow conditions, propagating in a channel of slowly varying cross-section. Based on a singular perturbation technique, the main purpose is to predict the motion of the waves by using a dimensionless formulation of the governing equations, considering that the longitudinal variation of the transversal section obeys a power-law distribution. We show that the spatial distribution of the waves in the varying cross-section is a function of a kinematic parameter, κ, and two geometrical parameters, εh and εw. This spatial behavior of the surface elevation is modeled by an ordinary differential equation. The use of single formulas to model the varying cross-sections or transitions considered in this work can be a useful approximation to natural or artificial geometrical configurations.

Keywords: Surface waves, Asymptotic solution, Power law function, Non-dispersive waves.

4129 Technique for Voltage Control in Distribution System

Authors: S. Thongkeaw, M. Boonthienthong

Abstract:

This paper presents techniques for voltage control in distribution systems, integrated into the distribution management system. Voltage is an important parameter in the control of electrical power systems, and distribution network operators have the responsibility to regulate the voltage supplied to consumers within statutory limits. Traditionally, the On-Load Tap Changer (OLTC) transformer equipped with automatic voltage control (AVC) relays has been the most popular and effective voltage control device. A static synchronous compensator (STATCOM) may be equipped with several controllers to perform multiple control functions, while Static Var Compensation (SVC) provides regulation slopes and available margins for var dispatch. In this paper, voltage control in distribution networks is established as a centralized analytical function.

Keywords: Voltage Control, Reactive Power, Distribution System.

4128 Estimation of Broadcast Probability in Wireless Adhoc Networks

Authors: Bharadwaj Kadiyala, Sunitha V

Abstract:

Most routing protocols designed for wireless adhoc networks (DSR, AODV, etc.) incorporate a broadcasting operation in their route discovery scheme. Probabilistic broadcasting techniques have been developed to optimize the broadcast operation, which is otherwise very expensive in terms of the redundancy and traffic it generates. In this paper, we explore percolation theory to gain a different perspective on probabilistic broadcasting schemes, which have been actively researched in recent years. This theory helps us estimate the broadcast probability in a wireless adhoc network as a function of the size of the network. We also show that, operating at these optimal values of broadcast probability, there is at least a 25-30% reduction in packet regeneration during successful broadcasting.
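
As an illustration of the kind of probabilistic (gossip) broadcast the paper optimizes, here is a minimal simulation sketch; the grid topology, rebroadcast rule and probability value are illustrative assumptions, not the paper's model.

    import random

    def probabilistic_broadcast(adjacency, source, p, seed=0):
        """Gossip flooding: each node that receives the packet for the first time
        rebroadcasts it with probability p. Returns (nodes reached, transmissions)."""
        random.seed(seed)
        received = {source}
        queue = [source]
        transmissions = 0
        while queue:
            node = queue.pop()
            if node == source or random.random() < p:   # source always transmits
                transmissions += 1
                for nbr in adjacency[node]:
                    if nbr not in received:
                        received.add(nbr)
                        queue.append(nbr)
        return len(received), transmissions

    # Example: a 10x10 grid network, rebroadcast probability 0.7 (assumed value).
    n = 10
    adj = {(i, j): [(i + di, j + dj) for di, dj in [(1, 0), (-1, 0), (0, 1), (0, -1)]
                    if 0 <= i + di < n and 0 <= j + dj < n]
           for i in range(n) for j in range(n)}
    print(probabilistic_broadcast(adj, (0, 0), p=0.7))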

Keywords: Crossover length, Percolation, Probabilistic broadcast, Wireless adhoc networks

4127 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers

Authors: Quanru Pan

Abstract:

How a business should set its service speed to maximize profit is a problem worthy of study, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following quantities are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer departs (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system upon arrival; the mean number of customers who leave the system out of impatience; the loss probability for customers who do not join the queue due to the limited capacity of the system; and several other indicators. The paper also shows that the following conclusion is not correct: the more customers the business serves, the more profit it will make. Finally, the paper points out the service speed the business should maintain to maximize profit.

Keywords: variable input rates, impatient customers, variable service rates, profit maximization.

4126 Thermodynamic Optimization of Turboshaft Engine using Multi-Objective Genetic Algorithm

Authors: S. Farahat, E. Khorasani Nejad, S. M. Hoseini Sarvari

Abstract:

In this paper, multi-objective genetic algorithms are employed for a Pareto-approach optimization of ideal turboshaft engines. In multi-objective optimization, a number of conflicting objective functions are to be optimized simultaneously. The important objective functions considered for optimization are the specific thrust (F/ṁ_0), the specific fuel consumption (S), the specific output shaft power (Ẇ_shaft/ṁ_0) and the overall efficiency (η_O). These objectives usually conflict with each other. The design variables consist of thermodynamic parameters (compressor pressure ratio, turbine temperature ratio and Mach number). At the first stage, single-objective optimization is investigated, and the NSGA-II method is then used for multi-objective optimization. Optimization procedures are performed for two and four objective functions, and the results are compared for the ideal turboshaft engine. In order to investigate the optimal thermodynamic behavior for two objectives, different sets, each including two of the output parameters, are considered individually, and the Pareto front is depicted for each set. The sets of decision variables selected on this Pareto front yield the best possible combination of the corresponding objective functions. No point on the Pareto front is superior to any other point on the front, but each is superior to any point off it. In the case of four-objective optimization, the results are given in tables.
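
For readers unfamiliar with the Pareto-front construction described above, here is a minimal sketch of non-dominated filtering over a set of candidate designs; the objective values and the maximization convention are illustrative assumptions, not the paper's engine model.

    def pareto_front(points):
        """Return the non-dominated points, assuming every objective is maximized."""
        front = []
        for p in points:
            dominated = any(all(q[i] >= p[i] for i in range(len(p))) and
                            any(q[i] > p[i] for i in range(len(p)))
                            for q in points)
            if not dominated:
                front.append(p)
        return front

    # Example: hypothetical (specific thrust, overall efficiency) pairs.
    designs = [(95, 0.30), (110, 0.26), (100, 0.29), (90, 0.31), (105, 0.25)]
    print(pareto_front(designs))   # dominated designs are filtered out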

Keywords: Multi-objective, Genetic algorithm, Turboshaft Engine.

4125 Daily Probability Model of Storm Events in Peninsular Malaysia

Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain

Abstract:

Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms (short, intermediate, long and very long) are introduced based on the length of storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and by applying linear regression to the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia follow a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of the various storm events.
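
A minimal sketch of fitting a daily occurrence probability with the first Fourier harmonic, as described above; the synthetic data, the 365-day period and the least-squares fit are illustrative assumptions.

    import numpy as np

    # Synthetic daily storm-occurrence indicators (1 = a storm of the given
    # category occurred that day), purely illustrative.
    rng = np.random.default_rng(0)
    days = np.arange(1, 366)
    true_p = 0.3 + 0.2 * np.cos(2 * np.pi * (days - 330) / 365)
    y = rng.binomial(1, true_p)               # Bernoulli outcomes

    # Linear regression on the first Fourier harmonic:
    #   p(t) ~ a0 + a1*cos(2*pi*t/365) + b1*sin(2*pi*t/365)
    X = np.column_stack([np.ones_like(days, dtype=float),
                         np.cos(2 * np.pi * days / 365),
                         np.sin(2 * np.pi * days / 365)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    p_hat = np.clip(X @ coef, 0, 1)           # fitted daily probability of a storm
    print(coef, p_hat[:5])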

Keywords: Daily probability model, monsoon seasons, regions, storm events.

4124 The Study of the Discrete Risk Model with Random Income

Authors: Peichen Zhao

Abstract:

In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, two examples of ruin quantities are given to illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.

Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.

4123 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output from such a model. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error, and it is therefore chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in probability-based damage detection of structures, with less computational effort compared to the direct finite element model.
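
The probability of damage existence can be illustrated with a small Monte Carlo sketch comparing samples of a damage indicator in the undamaged and damaged states; the Gaussian indicator distributions, sample sizes and comparison rule below are assumptions made for illustration, not the paper's surrogate model.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical damage-indicator samples produced by a surrogate model under
    # measurement uncertainty (e.g., a stiffness-reduction estimate for one element).
    undamaged = rng.normal(loc=0.00, scale=0.02, size=10_000)
    damaged = rng.normal(loc=0.05, scale=0.03, size=10_000)

    # Sketch of a PDE estimate: chance that the damaged-state indicator exceeds
    # the undamaged-state indicator across paired Monte Carlo samples.
    pde = np.mean(damaged > undamaged)
    print(f"estimated probability of damage existence = {pde:.3f}")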

Keywords: Enhanced ideal gas molecular movement, Kriging, probability-based damage detection, probability of damage existence, surrogate modeling, uncertainty quantification.

4122 Conflation Methodology Applied to Flood Recovery

Authors: E. L. Suarez, D. E. Meeroff, Y. Yong

Abstract:

Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed that combines the probability of recovering from a severe flooding event with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the contribution of each independent input's variation and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful when the input distributions have dissimilar variances. The &FR is defined as the single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution lies between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
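
Conflation of two densities, as used above, can be sketched numerically as a normalized pointwise product; the exponential recovery-time densities, rate values and time grid below are illustrative assumptions.

    import numpy as np

    t = np.linspace(0.0, 30.0, 3001)                 # recovery time in days (grid)
    dt = t[1] - t[0]

    # Hypothetical exponential recovery-time densities for two flooding sources.
    f_severe = 0.10 * np.exp(-0.10 * t)              # severe events, mean 10 days
    f_nuisance = 0.50 * np.exp(-0.50 * t)            # nuisance events, mean 2 days

    # Conflation: pointwise product of the densities, renormalized to integrate to 1.
    product = f_severe * f_nuisance
    conflated = product / (product.sum() * dt)

    mean_recovery = (t * conflated).sum() * dt
    print(f"conflated mean recovery time ~ {mean_recovery:.2f} days")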

Keywords: Community resilience, conflation, flood risk, nuisance flooding.

4121 Critical Analysis of Heat Exchanger Cycle for its Maintainability Using Failure Modes and Effect Analysis and Pareto Analysis

Authors: Sayali Vyas, Atharva Desai, Shreyas Badave, Apurv Kulkarni, B. Rajiv

Abstract:

The Failure Modes and Effect Analysis (FMEA) is an efficient evaluation technique to identify potential failures in products, processes, and services. FMEA is designed to identify and prioritize failure modes. It proves to be a useful method for identifying and correcting possible failures at the earliest possible stage, so that the consequences of poor performance can be avoided. In this paper, the FMEA tool is used to detect failures of various components of the heat exchanger cycle and to identify the critical failures that may hamper the system's performance. Further, a detailed Pareto analysis is done to find the most critical components of the cycle, the causes of their failures, and possible recommended actions. This paper can be used as a checklist that will help in the maintainability of the system.
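
A Pareto analysis over FMEA risk priority numbers, as described above, can be sketched like this; the component names and RPN values are hypothetical.

    # Hypothetical FMEA risk priority numbers (RPN = severity x occurrence x detection).
    rpn = {"pump seal": 280, "tube fouling": 240, "gasket": 96,
           "fan motor": 72, "valve": 48, "sensor": 24}

    total = sum(rpn.values())
    cumulative = 0.0
    print("component      RPN   cum%")
    for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += value
        flag = "  <- focus here" if cumulative / total <= 0.8 else ""
        print(f"{name:<13} {value:>4}  {100 * cumulative / total:5.1f}{flag}")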

Keywords: FMEA, heat exchanger cycle, Ishikawa diagram, Pareto analysis, risk priority number.

4120 A Data Driven Approach for the Degradation of a Lithium-Ion Battery Based on Accelerated Life Test

Authors: Alyaa M. Younes, Nermine Harraz, Mohammad H. Elwany

Abstract:

Lithium-ion batteries are currently used in many applications, including satellites, electric vehicles and mobile electronics. Their ability to store a relatively large amount of energy in a limited space makes them most appropriate for critical applications. Evaluating the life of these batteries and their reliability becomes crucial to the systems they support. The reliability of Li-ion batteries has mainly been considered in terms of lifetime. However, another factor that can be considered critical in many applications, such as electric vehicles, is the cycle duration. The present work presents the results of an experimental investigation of the degradation behavior of a laptop Li-ion battery (type TKV2V) and of the effect of the applied load on the battery cycle time. The reliability was evaluated using an accelerated life test. Least squares linear regression with median rank estimation was used to estimate the Weibull distribution parameters needed for estimating the reliability functions. The probability density function, failure rate and reliability function under each of the applied loads were evaluated and compared. An inverse power law model is introduced that can predict the cycle time at any given stress level.
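
A minimal sketch of Weibull parameter estimation by least squares regression on median ranks, the procedure named above; the cycle-to-failure data are invented for illustration, and Bernard's approximation is assumed for the median ranks.

    import numpy as np

    # Hypothetical cycle-to-failure data (sorted) from one accelerated-load level.
    t = np.sort(np.array([412., 518., 575., 641., 702., 778., 865., 990.]))
    n = len(t)
    i = np.arange(1, n + 1)

    # Median ranks via Bernard's approximation.
    F = (i - 0.3) / (n + 0.4)

    # Linearized Weibull CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta).
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)

    print(f"shape beta ~ {beta:.2f}, scale eta ~ {eta:.1f} cycles")
    reliability = np.exp(-(600.0 / eta) ** beta)     # R(t) at 600 cycles
    print(f"R(600 cycles) ~ {reliability:.3f}")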

Keywords: Accelerated life test, inverse power law, lithium ion battery, reliability evaluation, Weibull distribution.

4119 Performance Evaluation of Karanja Oil Based Biodiesel Engine Using Modified Genetic Algorithm

Authors: G. Bhushan, S. Dhingra, K. K. Dubey

Abstract:

This paper presents the evaluation of the performance (BSFC and BTE), combustion (Pmax) and emission (CO, NOx, HC and smoke opacity) parameters of karanja biodiesel in a single-cylinder, four-stroke, direct-injection diesel engine, considering significant engine input parameters (blending ratio, compression ratio and load torque). Multi-objective optimization of the performance, combustion and emission parameters is also carried out using a hybrid RSM-NSGA-II technique. The Pareto-optimal solutions are predicted by running the hybrid RSM-NSGA-II technique, and each Pareto-optimal solution has its own importance. Confirmation tests are conducted at a few randomly selected Pareto solutions to check the authenticity of the results.

Keywords: Karanja biodiesel, single cylinder direct injection diesel engine, response surface methodology, central composite rotatable design, genetic algorithm.

4118 The Modified Eigenface Method using Two Thresholds

Authors: Yan Ma, ShunBao Li

Abstract:

A new approach based on Turk and Pentland's eigenface method is adopted in this paper. It was found that the probability density function of the distance between the projection vector of the input face image and the average projection vector of a subject in the face database follows a Rayleigh distribution. In order to decrease the false acceptance rate and increase the recognition rate, the input face image is recognized using two thresholds: an acceptance threshold and a rejection threshold. We also find that the values of the two thresholds approach each other as the number of trials increases. During training, in order to reduce the number of trials, the projection vectors for each subject are averaged. Recognition experiments using the proposed algorithm show that the recognition rate reaches 92.875%, while the average number of judgments is only 2.56.
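
The two-threshold decision rule described above can be sketched as follows; the distance values, threshold settings and handling of the in-between case are illustrative assumptions rather than the paper's exact procedure.

    def classify(distance, accept_thr=0.8, reject_thr=1.6):
        """Two-threshold decision on the distance between the input's eigenface
        projection and a subject's average projection (a sketch)."""
        if distance <= accept_thr:
            return "accept"        # close enough: recognize as this subject
        if distance >= reject_thr:
            return "reject"        # too far: not this subject
        return "undecided"         # in between: request another trial/image

    for d in (0.5, 1.2, 2.0):
        print(d, classify(d))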

Keywords: Eigenface, Face Recognition, Threshold, Rayleigh Distribution, Feature Extraction

4117 Stochastic Risk Analysis Framework for Building Construction Projects

Authors: Abdulkadir Abu Lawal

Abstract:

The study was carried out to establish the probability density functions of selected building construction projects of similar complexity delivered using the Bill of Quantities (BQ) and Lump Sum (LS) forms of contract, and to draw a reliability scenario for each form of contract. Thirty such delivered projects were analyzed for each contract form using Weibull analysis, and their Weibull parameters (α and β) were determined from their completion times. For the projects delivered under the BQ form of contract, α is calculated as 1.6737E20 and β as +0.0115, and for the LS form, α is found to be 5.6556E03 and β is determined as +0.4535. Using these values, the respective probability density functions are calculated and plotted as a handy tool for risk analysis of future projects of similar characteristics. By inputting variables from other projects, decisions can be made for a whole project or its components using EVM analysis within project evaluation and review techniques. As a quantitative approach, this framework depends on the assumption of normality in project completion times, and it can help greatly in determining the completion-time probability for comparable projects delivered under either of the contract forms considered. Project aspects that are not amenable to measurement, on the other hand, can be analyzed using fuzzy sets and fuzzy logic. This scenario can be drawn for different types of building construction projects and for different suitable forms of contract in project delivery.

Keywords: Building construction, Projects, Forms of contract, Probability density function, Reliability scenario.

4116 A Stochastic Approach to Extreme Wind Speeds Conditions on a Small Axial Wind Turbine

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, to model a real-life wind turbine, a probabilistic approach is proposed to model the dynamics of the blade elements of a small axial wind turbine under extreme stochastic wind-speed conditions. It was found that the power and torque probability density functions, even though they decrease at these extreme wind speeds, are not infinite. Moreover, we also found that it is possible to stabilize the power coefficient (stabilizing the output power) above rated wind speeds by tuning some control parameters. This method helps to explain the effect of turbulence on the quality and quantity of the harnessed power and aerodynamic torque.

Keywords: Probability, Stochastic, Probability density function, Turbulence.

4115 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD using credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample in order to choose the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. The values of the particular indicators are sampled randomly and the distribution of the PDs is estimated, under the assumption that the indicators follow a multidimensional subordinated Lévy model (in particular, the Variance Gamma model and the Normal Inverse Gaussian model). Although the obtained results show that all the banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability. This is indicated by the estimation of various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
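
As an illustration of the logit-type credit-scoring model mentioned above, here is a minimal sketch; the financial-ratio features, coefficients and intercept are hypothetical values, not the paper's estimated model.

    import numpy as np

    def logit_pd(features, coefficients, intercept):
        """Probability of default from a logit credit-scoring model (a sketch)."""
        score = intercept + np.dot(coefficients, features)
        return 1.0 / (1.0 + np.exp(-score))

    # Hypothetical model: PD driven by capital adequacy, ROA and NPL ratio.
    coef = np.array([-6.0, -12.0, 8.0])     # weaker capital/profit, more NPLs -> higher PD
    bank = np.array([0.15, 0.012, 0.04])    # capital ratio, return on assets, NPL ratio
    print(f"estimated PD = {logit_pd(bank, coef, intercept=-1.5):.4f}")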

Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.

4114 Stability Bound of Ruin Probability in a Reduced Two-Dimensional Risk Model

Authors: Zina Benouaret, Djamil Aissani

Abstract:

In this work, we introduce the qualitative and quantitative concepts of the strong stability method in a risk process modeling two lines of business of the same insurance company, or an insurance and a re-insurance company that divide both claims and premiums between them in a certain proportion. The proposed approach is based on identifying the ruin probability associated with the considered model with the stationary distribution of a Markov random process called the reversed process. Our objective, after clarifying the conditions and the perturbation domain of the parameters, is to obtain a stability inequality for the ruin probability, which is applied to estimate the approximation error incurred when a model with perturbed parameters is replaced by the considered model. In the stability bound obtained, all constants are written explicitly.

Keywords: Markov chain, risk models, ruin probabilities, strong stability analysis.

4113 Structural Modelling of the LiCl Aqueous Solution: Using the Hybrid Reverse Monte Carlo (HRMC) Simulation

Authors: M. Habchi, S.M. Mesli, M. Kotbi

Abstract:

The Reverse Monte Carlo (RMC) simulation is applied to the study of an aqueous electrolyte, LiCl·6H2O. On the basis of the available experimental neutron scattering data, RMC computes pair radial distribution functions in order to explore the structural features of the system. The obtained results include some unrealistic features. To overcome this problem, we use the Hybrid Reverse Monte Carlo (HRMC) method, which incorporates an energy constraint in addition to the constraints commonly derived from experimental data. Our results show good agreement between the experimental and computed partial distribution functions (PDFs), as well as a significant improvement in the pair partial distribution curves. This kind of study can be considered a useful test of a given interaction model for conventional simulation techniques.

Keywords: RMC simulation, HRMC simulation, energy constraint, screened potential, glassy state, liquid state, partial distribution function, pair partial distribution function.

4112 Image Mapping with Cumulative Distribution Function for Quick Convergence of Counter Propagation Neural Networks in Image Compression

Authors: S. Anna Durai, E. Anna Saro

Abstract:

In general, the images used for compression are of different types, such as dark images, high-intensity images, etc. When these images are compressed using a Counter Propagation Neural Network, it takes a long time to converge. The reason is that the given image may contain a number of distinct gray levels with only a narrow difference from their neighborhood pixels. If the gray levels of the pixels in an image and of their neighbors are mapped in such a way that the difference in gray level between a pixel and its neighbors is minimized, then both the compression ratio and the convergence of the network can be improved. To achieve this, a Cumulative Distribution Function is estimated for the image and used to map the image pixels. When the mapped image pixels are used, the Counter Propagation Neural Network yields a high compression ratio and converges quickly.
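
The CDF-based pixel mapping described above can be sketched as follows for an 8-bit grayscale image; the random test image is an illustrative assumption, and the mapping shown is the standard CDF (histogram-equalization-style) transform rather than the authors' exact scheme.

    import numpy as np

    def cdf_map(image):
        """Map 8-bit pixel values through the image's empirical CDF (a sketch)."""
        hist = np.bincount(image.ravel(), minlength=256)
        cdf = np.cumsum(hist) / image.size            # empirical CDF over gray levels
        lut = np.round(255 * cdf).astype(np.uint8)    # lookup table: level -> mapped level
        return lut[image]

    # Example: a synthetic dark image whose gray levels cluster in a narrow band.
    rng = np.random.default_rng(0)
    dark = rng.integers(20, 70, size=(64, 64), dtype=np.uint8)
    mapped = cdf_map(dark)
    print(dark.min(), dark.max(), "->", mapped.min(), mapped.max())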

Keywords: Correlation, Counter Propagation Neural Networks, Cumulative Distribution Function, Image compression.

4111 Percolation Transition with Hidden Variables in Complex Networks

Authors: Zhanli Zhang, Wei Chen, Xin Jiang, Lili Ma, Shaoting Tang, Zhiming Zheng

Abstract:

A new class of percolation models in complex networks, in which nodes are characterized by hidden variables reflecting their properties and the occupied probability of each link is determined by the hidden variables of its end nodes, is studied in this paper. Using mean-field theory, analytical expressions for the percolation phase transition are deduced; the transition is determined by the distribution of the hidden variables over the nodes and by the occupied probability between pairs of them. Moreover, the analytical expressions obtained are checked by means of numerical simulations on a particular model. Besides, the general model can be applied to describe and control practical diffusion processes, such as disease diffusion models, scientific cooperation networks, and so on.

Keywords: complex networks, percolation transition, hidden variable, occupied probability.

4110 Pure Scalar Equilibria for Normal-Form Games

Authors: H. W. Corley

Abstract:

A scalar equilibrium (SE) is an alternative type of equilibrium in pure strategies for an n-person normal-form game G. It is defined using optimization techniques to obtain a pure strategy for each player of G by maximizing an appropriate utility function over the acceptable joint actions. The players’ actions are determined by the choice of the utility function. Such a utility function could be agreed upon by the players or chosen by an arbitrator. An SE is an equilibrium since no players of G can increase the value of this utility function by changing their strategies. SEs are formally defined, and examples are given. In a greedy SE, the goal is to assign actions to the players giving them the largest individual payoffs jointly possible. In a weighted SE, each player is assigned weights modeling the degree to which he helps every player, including himself, achieve as large a payoff as jointly possible. In a compromise SE, each player wants a fair payoff for a reasonable interpretation of fairness. In a parity SE, the players want their payoffs to be as nearly equal as jointly possible. Finally, a satisficing SE achieves a personal target payoff value for each player. The vector payoffs associated with each of these SEs are shown to be Pareto optimal among all such acceptable vectors, as well as computationally tractable.
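
A minimal sketch of computing a scalar equilibrium by maximizing a utility function over the joint pure actions; the two-player payoff table, the sum-of-payoffs "greedy" utility and the parity-style utility below are illustrative assumptions.

    # Hypothetical 2-player normal-form game: payoffs[(a1, a2)] = (u1, u2).
    payoffs = {
        ("T", "L"): (3, 3), ("T", "R"): (0, 5),
        ("B", "L"): (5, 0), ("B", "R"): (1, 1),
    }

    def scalar_equilibrium(payoffs, utility):
        """Pick the joint pure action maximizing a scalar utility of the payoff vector."""
        return max(payoffs, key=lambda joint: utility(payoffs[joint]))

    greedy = scalar_equilibrium(payoffs, utility=sum)                          # largest total payoff
    parity = scalar_equilibrium(payoffs, utility=lambda u: -abs(u[0] - u[1]))  # most nearly equal
    print("greedy SE:", greedy, "parity-style SE:", parity)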

Keywords: Compromise equilibrium, greedy equilibrium, normal-form game, parity equilibrium, pure strategies, satisficing equilibrium, scalar equilibria, utility function, weighted equilibrium.

4109 Application Reliability Method for Concrete Dams

Authors: Mustapha Kamel Mihoubi, Mohamed Essadik Kerkar

Abstract:

Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of works, including when calculating the stability of large structures exposed to a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods, including methods used in engineering practice. In our case, level-2 methods are used via the study of the limit state: the probability of failure is estimated by analytical methods of the first-order reliability method (FORM) and second-order reliability method (SORM) type. By way of comparison, a level-3 method is also used, which generates a full analysis of the problem and involves an integration of the probability density function of the random variables extended over the safety domain, carried out with the Monte Carlo simulation method. Taking into account the change in stresses under the load combinations (normal, exceptional and extreme) acting on the dam, the calculated results provide acceptable failure probability values that largely corroborate the theory. In fact, the probability of failure tends to increase with increasing load intensity, causing a significant decrease in strength; the induced shear forces then produce a sliding that threatens the reliability of the structure through intolerable values of the failure probability, especially in the case of increased uplift under a hypothetical failure of the drainage system.
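
A level-3 Monte Carlo estimate of a failure probability, as mentioned above, can be sketched as follows; the sliding limit-state function and the distributions of the resisting and driving forces are simplified illustrative assumptions, not the dam model of the paper.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Hypothetical random variables for a sliding limit state g = R - S:
    # resisting shear force R and driving (hydrostatic + uplift) force S, in MN.
    R = rng.normal(loc=9.0, scale=1.2, size=n)
    S = rng.normal(loc=6.0, scale=1.0, size=n)

    g = R - S                       # limit-state function: failure when g < 0
    pf = np.mean(g < 0.0)           # Monte Carlo estimate of the failure probability
    print(f"estimated P_f = {pf:.4e}")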

Keywords: Dam, failure, limit-state, Monte Carlo simulation, reliability, probability, simulation, sliding, Taylor.

4108 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with five parameters) as a theoretical reference model. The number and quality of its parameters indicate that this distribution may be an appropriate choice for interpolating the hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena that produce heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or rather convergence to an inappropriate configuration of parameters. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation situations. The reasons lie in the sampling and asymptotic properties of the maximum likelihood estimators, which improve the estimates obtained by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
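
For reference, here is a sketch of sampling from a Wakeby distribution via its quantile function; the five-parameter quantile form below is the commonly cited one, and the parameter values are illustrative assumptions, not fitted to any data set.

    import numpy as np

    def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
        """Commonly cited Wakeby quantile function (a sketch):
        x(F) = xi + (alpha/beta)*(1-(1-F)**beta) - (gamma/delta)*(1-(1-F)**(-delta))."""
        return (xi + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
                   - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

    # Inverse-transform sampling of a heavy-tailed variable with illustrative parameters.
    rng = np.random.default_rng(3)
    u = rng.uniform(size=10_000)
    x = wakeby_quantile(u, xi=0.0, alpha=5.0, beta=0.5, gamma=1.0, delta=0.2)
    print(f"sample mean = {x.mean():.2f}, 99th percentile = {np.percentile(x, 99):.2f}")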

Keywords: Generalized extreme values (GEV), likelihood estimation, precipitation data, Wakeby distribution.

4107 An Overview of Handoff Techniques in Cellular Networks

Authors: Nasıf Ekiz, Tara Salih, Sibel Küçüköner, Kemal Fidanboylu

Abstract:

Continuation of an active call is one of the most important quality measures in cellular systems. The handoff process enables a cellular system to provide such a facility by transferring an active call from one cell to another. Different approaches have been proposed and applied in order to achieve better handoff service. The principal parameters used to evaluate handoff techniques are the forced termination probability and the call blocking probability. Mechanisms such as guard channels and the queuing of handoff calls decrease the forced termination probability while increasing the call blocking probability. In this paper, we present an overview of the issues related to handoff initiation and decision, and we discuss the different types of handoff techniques available in the literature.

Keywords: Handoff, Forced Termination Probability, Blocking probability, Handoff Initiation, Handoff Decision, Handoff Prioritization Schemes.

4106 A Heuristic Statistical Model for Lifetime Distribution Analysis of Complicated Systems in the Reliability Centered Maintenance

Authors: Mojtaba Mahdavi, Mohamad Mahdavi, Maryam Yazdani

Abstract:

A heuristic conceptual model for developing Reliability Centered Maintenance (RCM), especially in a preventive strategy, is explored in this paper. For most real cases, in which the complexity of the system demands a high degree of reliability, this model proposes choosing the more appropriate of two reliability functions: one based on the lifetime distribution and another based on the relevant Extreme Value (EV) distribution. A statistical and mathematical approach is used to estimate and verify these two distribution functions, and the more reliable one is then chosen. A numerical industrial case study is reviewed to represent the concepts of this paper more clearly.

Keywords: Lifetime distribution, Reliability, Estimation, Extreme value, Improving model, Series, Parallel.

4105 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem, most of which replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured, and since there are missing values in the collected data, a distance function between incomplete user profiles is needed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
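
A minimal sketch of a coordinate-wise distance that falls back to the empirical distribution of a feature when one of the values is missing; the fallback rule (expected squared difference under that distribution), the per-coordinate normalization and the data are assumptions made for illustration, not the exact function proposed in the paper.

    import numpy as np

    def missing_aware_distance(a, b, feature_samples):
        """Squared distance between two (possibly incomplete) profiles.
        Known-vs-known: variance-normalized squared difference (Mahalanobis-like,
        assuming independent coordinates). Known-vs-missing: expected squared
        difference under the empirical distribution of that coordinate (a sketch)."""
        total = 0.0
        for j, (x, y) in enumerate(zip(a, b)):
            col = feature_samples[:, j]
            var = col.var() + 1e-12
            if not np.isnan(x) and not np.isnan(y):
                total += (x - y) ** 2 / var
            elif np.isnan(x) and np.isnan(y):
                total += 2.0                      # both unknown: expected normalized gap
            else:
                known = y if np.isnan(x) else x
                total += np.mean((col - known) ** 2) / var
        return total

    # Example: two user profiles, with a missing second feature in profile b.
    data = np.array([[1.0, 5.0], [2.0, 7.0], [0.5, 6.0], [1.5, 5.5]])
    a = np.array([1.0, 6.0])
    b = np.array([1.2, np.nan])
    print(missing_aware_distance(a, b, data))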

Keywords: Missing values, distance metric, Bhattacharyya distance.

4104 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, and the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.

Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.

4103 A Novel Neighborhood Defined Feature Selection on Phase Congruency Images for Recognition of Faces with Extreme Variations

Authors: Satyanadh Gundimada, Vijayan K Asari

Abstract:

A novel feature selection strategy to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions is proposed in this paper. This technique is applicable especially in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration, a property that allows the goal of lighting-invariant face recognition to be achieved. Phase congruency maps of the training samples are generated, and a novel modular feature selection strategy is implemented. Smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features, which are arranged in order of increasing distance between the sub-regions involved in the merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, namely the ratio of the between-class variance to the within-class variance of the sample set in the PCA domain. The results indicate a large improvement in classification performance compared to baseline algorithms.

Keywords: Discriminant analysis, intra-class probability distribution, principal component analysis, phase congruency.

4102 Probabilistic Modeling of Network-induced Delays in Networked Control Systems

Authors: Manoj Kumar, A.K. Verma, A. Srividya

Abstract:

Time-varying network-induced delays in networked control systems (NCS) are known to degrade a control system's quality of performance (QoP) and cause stability problems. In the literature, a control method employing the modeling of communication delays as a probability distribution proves to be a better approach. This paper therefore focuses on modeling network-induced delays as probability distributions. CAN and MIL-STD-1553B are extensively used to carry periodic control and monitoring data in networked control systems, but the literature only provides methods to estimate the worst-case delays for these networks. In this paper, probabilistic network delay models for CAN and MIL-STD-1553B networks are given, together with a systematic method to estimate the model parameters from network parameters. A method to predict the network delay in the next cycle based on the present network delay is presented, and the effect of active network redundancy, and of redundancy at the node level, on network delay and system response time is also analyzed.

Keywords: NCS (networked control system), delay analysis, response-time distribution, worst-case delay, CAN, MIL-STD-1553B, redundancy
