Search results for: conditional probability
606 Probability and Instruction Effects in Syllogistic Conditional Reasoning
Authors: Olimpia Matarazzo, Ivana Baldassarre
Abstract:
The main aim of this study was to examine whether people understand indicative conditionals on the basis of syntactic factors or on the basis of subjective conditional probability. The second aim was to investigate whether the conditional probability of q given p depends on the antecedent and consequent sizes or derives from inductive processes leading to the establishment of a link of plausible co-occurrence between events semantically or experientially associated. These competing hypotheses have been tested through a 3 x 2 x 2 x 2 mixed design involving the manipulation of four variables: type of instructions (“Consider the following statement to be true”, “Read the following statement” and a condition with no conditional statement); antecedent size (high/low); consequent size (high/low); statement probability (high/low). The first variable was between-subjects, the others were within-subjects. The inferences investigated were Modus Ponens and Modus Tollens. Ninety undergraduates of the Second University of Naples, without any prior knowledge of logic or conditional reasoning, participated in this study. Results suggest that people understand conditionals in a syntactic way rather than in a probabilistic way, even though the perception of the conditional probability of q given p is at least partially involved in the comprehension of conditionals. They also showed that, in the presence of a conditional syllogism, inferences are not affected by the antecedent or consequent sizes. From a theoretical point of view, these findings suggest that it would be inappropriate to abandon the idea that conditionals are naturally understood in a syntactic way for the idea that they are understood in a probabilistic way.
Keywords: Conditionals, conditional probability, conditional syllogism, inferential task.
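For readers unfamiliar with the two inference forms investigated, the sketch below states Modus Ponens and Modus Tollens together with the probabilistic reading of the conditional that the competing hypothesis assumes; this is a textbook formulation, not the authors' experimental material.

```latex
% Syntactic inference forms investigated in the study
\[
\text{Modus Ponens: } \frac{p \rightarrow q, \quad p}{q}
\qquad
\text{Modus Tollens: } \frac{p \rightarrow q, \quad \neg q}{\neg p}
\]
% Probabilistic reading of the conditional: the degree of belief in
% "if p then q" equals the conditional probability of q given p
\[
P(\text{if } p \text{ then } q) = P(q \mid p) = \frac{P(p \wedge q)}{P(p)},
\qquad P(p) > 0
\]
```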
605 Probabilities and the Persistence of Memory in a Bingo-like Carnival Game
Authors: M. Glomski, M. Lopes
Abstract:
Seemingly simple probabilities in the m-player game bingo have never been calculated. These probabilities include the expected game length and the expected number of winners on a given turn. The difficulty in probabilistic analysis lies in the subtle interdependence among the m-many bingo cards in play. In this paper, the game I Got It!, a bingo variant, is considered. This variation provides enough weakening of the inter-player dependence to allow probabilistic analysis not possible for traditional bingo. The probability of winning in exactly k turns is calculated for a one-player game. Given a game of m players, the expected game length and tie probability are calculated. With these calculations, the game's interesting payout scheme is considered.
Keywords: Conditional probability, games of chance, n-person games, probability theory.
604 A Formal Approach for Proof Constructions in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Beside some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
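Since the abstract highlights a computer proof of Bayes' Formula, its standard statement is reproduced below in LaTeX as a point of reference; the Isabelle/HOL formalization itself is not shown here.

```latex
% Bayes' Formula for events A and B with P(B) > 0, and its form over a
% partition A_1, ..., A_n of the sample space
\[
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)},
\qquad
P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{\sum_{j=1}^{n} P(B \mid A_j)\, P(A_j)}
\]
```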
603 Vision Based Hand Gesture Recognition Using Generative and Discriminative Stochastic Models
Authors: Mahmoud Elmezain, Samar El-shinawy
Abstract:
Many approaches to pattern recognition are founded on probability theory and can be broadly characterized as either generative or discriminative, according to whether or not they model the distribution of the image features. Generative and discriminative models have very different characteristics, as well as complementary strengths and weaknesses. In this paper, we study these models to recognize the patterns of alphabet characters (A-Z) and numbers (0-9). To handle isolated patterns, a generative model, the Hidden Markov Model (HMM), and discriminative models, namely the Conditional Random Field (CRF), Hidden Conditional Random Field (HCRF) and Latent-Dynamic Conditional Random Field (LDCRF), are applied to the extracted pattern features with different window sizes. The gesture recognition rate improves initially as the window size increases, but degrades as the window size increases further. Experimental results show that the LDCRF gives better results than the CRF, HCRF and HMM at a window size of 4. Additionally, our results show that the overall recognition rates are 91.52%, 95.28%, 96.94% and 98.05% for CRF, HCRF, HMM and LDCRF, respectively.
Keywords: Statistical Pattern Recognition, Generative Model, Discriminative Model, Human Computer Interaction.
602 Reliability Analysis of Underground Pipelines Using Subset Simulation
Authors: Kong Fah Tee, Lutfor Rahman Khan, Hongshuang Li
Abstract:
An advanced Monte Carlo simulation method, called Subset Simulation (SS), for the time-dependent reliability prediction of underground pipelines is presented in this paper. SS can provide better resolution at low failure probability levels through efficient investigation of the rare failure events commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
Keywords: Underground pipelines, Probability of failure, Reliability and Subset Simulation.
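The product-of-conditional-probabilities idea behind SS can be illustrated with a short, self-contained sketch. The Python code below implements the standard SS procedure (intermediate thresholds at the p0-quantile, conditional sampling by a component-wise Metropolis chain) for a hypothetical limit-state function with standard normal inputs; it is an illustration only, not the pipeline reliability model of the paper.

```python
import numpy as np

# Minimal Subset Simulation (SS) sketch for a rare-event probability
# P(Y >= y_crit) with Y = g(X) and X ~ N(0, I). The limit-state function
# g below is a hypothetical stand-in.

def g(x):
    return x.sum(axis=-1)                           # hypothetical response

def modified_metropolis(seed, n_steps, threshold, dim, spread=1.0):
    """Component-wise Metropolis chain that stays in {g(x) >= threshold}."""
    chain, x = [seed], seed.copy()
    for _ in range(n_steps):
        cand = x.copy()
        for j in range(dim):
            xi = x[j] + np.random.uniform(-spread, spread)
            # accept component with prob. ratio of standard normal densities
            if np.random.rand() < np.exp(0.5 * (x[j] ** 2 - xi ** 2)):
                cand[j] = xi
        if g(cand) >= threshold:                    # stay in conditional region
            x = cand
        chain.append(x.copy())
    return chain

def subset_simulation(dim=10, y_crit=12.0, n=1000, p0=0.1, max_levels=10):
    samples = np.random.randn(n, dim)
    prob = 1.0
    for _ in range(max_levels):
        y = g(samples)
        order = np.argsort(y)[::-1]                 # largest responses first
        n_seeds = int(p0 * n)
        b = y[order[n_seeds - 1]]                   # intermediate threshold
        if b >= y_crit:                             # failure level reached
            return prob * np.mean(y >= y_crit)
        prob *= p0                                  # P(F_{i+1} | F_i) ~ p0
        seeds = samples[order[:n_seeds]]
        new = []
        for s in seeds:                             # regrow n conditional samples
            new.extend(modified_metropolis(s, n // n_seeds - 1, b, dim))
        samples = np.array(new)[:n]
    return prob * np.mean(g(samples) >= y_crit)

if __name__ == "__main__":
    print("estimated exceedance probability:", subset_simulation())
```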
601 Pragati Node Popularity (PNP) Approach to Identify Congestion Hot Spots in MPLS
Authors: E. Ramaraj, A. Padmapriya
Abstract:
In large Internet backbones, Service Providers typically have to explicitly manage the traffic flows in order to optimize the use of network resources. This process is often referred to as Traffic Engineering (TE). Common objectives of traffic engineering include balancing traffic distribution across the network and avoiding congestion hot spots. Raj P H and SVK Raja designed the Bayesian network approach to identify congestion hot spots in MPLS. In this approach, a Conditional Probability Distribution (CPD) is specified for every node in the network. Based on the CPD, the congestion hot spots are identified. Then the traffic can be distributed so that no link in the network is either over-utilized or under-utilized. Although the Bayesian network approach has been implemented in operational networks, it has a number of well-known scaling issues. This paper proposes a new approach, which we call the Pragati (meaning Progress) Node Popularity (PNP) approach, to identify the congestion hot spots with the network topology alone. In the new Pragati Node Popularity approach, IP routing runs natively over the physical topology rather than depending on the CPD of each node as in the Bayesian network. We first illustrate our approach with a simple network and then present a formal analysis of the Pragati Node Popularity approach. Our PNP approach shows that, for any given network analysed with the Bayesian approach, it identifies exactly the same result with minimum effort. We further extend the result to a more generic one: it holds for any network topology, even when the network is loopy. A theoretical insight of our result is that the optimal routing is always shortest-path routing with respect to some considerations of hot spots in the networks.
Keywords: Conditional Probability Distribution, Congestion hot spots, Operational Networks, Traffic Engineering.
600 Brain Image Segmentation Using Conditional Random Field Based On Modified Artificial Bee Colony Optimization Algorithm
Authors: B. Thiagarajan, R. Bremananth
Abstract:
Tumor is an uncontrolled growth of tissues in any part of the body. Tumors are of different types, and they have different characteristics and treatments. A brain tumor is inherently serious and life-threatening because of its location in the limited space of the intracranial cavity (the space formed inside the skull). Locating the tumor within an MR (magnetic resonance) image of the brain is an integral part of the treatment of brain tumors. This segmentation task requires classification of each voxel as either tumor or non-tumor, based on the description of the voxel under consideration. Many studies in the medical field use Markov Random Fields (MRF) in the segmentation of MR images. Even though the segmentation process is better, computing the probability and estimating the parameters is difficult. In order to overcome the aforementioned issues, the Conditional Random Field (CRF) is used in this paper for segmentation, along with the modified artificial bee colony optimization and modified fuzzy possibility c-means (MFPCM) algorithm. This work mainly focuses on reducing the computational complexity found in existing methods and on achieving higher accuracy. The efficiency of this work is evaluated using parameters such as region non-uniformity, correlation and computation time. The experimental results are compared with existing methods such as MRF with improved Genetic Algorithm (GA) and the MRF-Artificial Bee Colony (MRF-ABC) algorithm.
Keywords: Conditional random field, Magnetic resonance, Markov random field, Modified artificial bee colony.
599 Color Image Segmentation and Multi-Level Thresholding by Maximization of Conditional Entropy
Authors: R.Sukesh Kumar, Abhisek Verma, Jasprit Singh
Abstract:
In this work, a novel approach for color image segmentation using higher-order entropy as a textural feature for the determination of thresholds over a two-dimensional image histogram is discussed. A similar approach is applied to achieve multi-level thresholding in both grayscale and color images. The paper discusses two methods of color image segmentation using RGB space as the standard processing space. The threshold for segmentation is decided by the maximization of conditional entropy in the two-dimensional histogram of the color image separated into three grayscale images of R, G and B. The features are first developed independently for the three (R, G, B) spaces and then combined to obtain different color-component segmentations. Considering local maxima instead of the global maximum of conditional entropy yields multiple thresholds for the same image, which forms the basis for multi-level thresholding.
Keywords: conditional entropy, multi-level thresholding, segmentation, two dimensional image histogram
598 Developing of Fragility Curve for Two-Span Simply Supported Concrete Bridge in Near-Fault Area
Authors: S. Shirazian, M.R. Ghayamghamian, G.R. Nouri
Abstract:
Bridges are one of the main components of transportation networks. They should remain functional before and after an earthquake for emergency services. Therefore, we need to assess the seismic performance of bridges under different seismic loadings. The fragility curve is one of the popular tools in seismic evaluation. Fragility curves are conditional probability statements, which give the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
Keywords: Fragility curve, Seismic behavior, Time history analysis, Transportation Network.
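As a concrete form of the conditional-probability statement described above, analytical fragility curves are commonly expressed as a lognormal cumulative distribution function of the ground-motion intensity measure; the generic expression below is a textbook form and not necessarily the exact one adopted by the authors.

```latex
% Generic lognormal fragility function: probability that damage D reaches
% or exceeds damage state ds, given intensity measure IM = x
\[
P\left(D \ge ds \mid IM = x\right)
  = \Phi\!\left(\frac{\ln(x/\theta_{ds})}{\beta_{ds}}\right)
\]
% \Phi        : standard normal CDF
% \theta_{ds} : median intensity causing damage state ds
% \beta_{ds}  : lognormal standard deviation (dispersion)
```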
597 Comparative Approach of Measuring Price Risk on Romanian and International Wheat Market
Authors: Larisa N. Pop, Irina M. Ban
Abstract:
This paper aims to present the main instruments used in the economic literature for measuring price risk, pointing out the advantages brought by the conditional variance in this respect. The theoretical approach will be exemplified by elaborating an EGARCH model for the price returns of wheat, on both the Romanian and the international market. To our knowledge, no previous empirical research has been conducted on price risk measurement for the Romanian markets or using the ARIMA-EGARCH methodology. After estimating the corresponding models, the paper will compare the estimated conditional variance on the two markets.
Keywords: conditional variance, GARCH models, price risk, volatility
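For context, the conditional variance of an EGARCH(1,1) model, the class estimated here for wheat price returns, takes the standard form below; the orders and exact parameterization used by the authors may differ.

```latex
% EGARCH(1,1) conditional variance (Nelson, 1991)
\[
\ln \sigma_t^2 = \omega
  + \beta \ln \sigma_{t-1}^2
  + \alpha \left( \lvert z_{t-1} \rvert - \mathrm{E}\lvert z_{t-1} \rvert \right)
  + \gamma\, z_{t-1},
\qquad
z_{t-1} = \frac{\varepsilon_{t-1}}{\sigma_{t-1}}
\]
% The asymmetry term \gamma z_{t-1} lets negative and positive price
% shocks affect volatility differently (the leverage effect).
```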
596 Optimization of SAD Algorithm on VLIW DSP
Authors: Hui-Jae You, Sun-Tae Chung, Souhwan Jung
Abstract:
The SAD (Sum of Absolute Differences) algorithm is heavily used in motion estimation, which is a computationally highly demanding process in motion picture encoding. To enhance the performance of motion picture encoding on a VLIW processor, an efficient implementation of the SAD algorithm on the VLIW processor is essential. The SAD algorithm is programmed as a nested loop with a conditional branch. In VLIW processors, loops are usually optimized by software pipelining, but research on the optimal scheduling of software pipelining for nested loops, especially nested loops with conditional branches, is rare. In this paper, we propose an optimal scheduling and implementation of the SAD algorithm with a conditional branch on a VLIW DSP processor. The proposed optimal scheduling first transforms the nested loop with a conditional branch into a single loop with a conditional branch, with consideration of full utilization of the ILP capability of the VLIW processor and realization of an earlier escape from the loop. Next, the proposed optimal scheduling applies a modulo scheduling technique developed for single loops. Based on this optimal scheduling strategy, an optimal implementation of the SAD algorithm on the TMS320C67x, a VLIW DSP, is presented. Through experiments on the TMS320C6713 DSK, it is shown that an H.263 encoder with the proposed SAD implementation performs better than H.263 encoders with other SAD implementations, and that the code size of the optimal SAD implementation is small enough to be appropriate for embedded environments.
Keywords: Optimal implementation, SAD algorithm, VLIW, TMS320C6713.
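The kernel being optimized is simple to state. The Python sketch below is a reference (non-optimized) SAD computation between a current block and a candidate block, i.e. the nested loop with an early-exit conditional branch that the paper schedules on the VLIW DSP; the block size, search window and early-exit rule here are illustrative choices, not the TMS320C67x implementation.

```python
import numpy as np

def sad_block(cur, ref, best_so_far=np.inf, block=16):
    """Reference SAD between two blocks with an early-exit branch."""
    sad = 0
    for i in range(block):
        for j in range(block):
            sad += abs(int(cur[i, j]) - int(ref[i, j]))
        if sad >= best_so_far:        # early escape: candidate cannot win
            return sad
    return sad

def motion_search(cur_block, ref_frame, x, y, radius=7, block=16):
    """Exhaustive search over a small window (illustrative only)."""
    best, best_mv = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cand = ref_frame[y + dy:y + dy + block, x + dx:x + dx + block]
            cost = sad_block(cur_block, cand, best, block)
            if cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv, best
```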
595 Modelling Conditional Volatility of Saving Rate by a Time-Varying Parameter Model
Authors: Katleho D. Makatjane, Kalebe M. Kalebe
Abstract:
The present paper used time-varying parameters, based on the score function of a probability density at time t, to model the volatility of the saving rate. We used a scaled likelihood function to update the parameters of the model over time. Our results revealed a high degree of time variation, since the location parameter is greater than zero. Furthermore, we discovered a leptokurtic condition in the saving rate's distribution. The Kapetanios-Shin-Snell Nonlinear Augmented Dickey-Fuller (KSS-NADF) test showed that the saving rate has a nonlinear unit root; therefore, it can be modeled by a generalised autoregressive score (GAS) model. Additionally, value at risk (VaR) and conditional tail expectation (CTE) indicate that 99% of the time people in Lesotho are saving more than spending. This puts the economy at high risk of not expanding. Therefore, the monetary policy committee (MPC) of Lesotho should revise its monetary policies to address this high saving-rate risk.
Keywords: Generalized autoregressive score, time-varying, saving rate, Lesotho.
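The time-varying mechanism referred to above is the generalised autoregressive score recursion, in which the time-t parameter is driven by the scaled score of the conditional density; the generic form is sketched below, while the specific density and scaling chosen for the saving-rate series may differ.

```latex
% Generic GAS(1,1) updating equation (Creal, Koopman and Lucas)
\[
f_{t+1} = \omega + A\, s_t + B\, f_t,
\qquad
s_t = S_t \, \nabla_t,
\qquad
\nabla_t = \frac{\partial \ln p(y_t \mid f_t; \theta)}{\partial f_t}
\]
% f_t      : time-varying parameter (e.g. location or scale)
% \nabla_t : score of the conditional density at time t
% S_t      : scaling matrix, often based on the inverse Fisher information
```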
594 Surveillance Video Summarization Based on Histogram Differencing and Sum Conditional Variance
Authors: Nada Jasim Habeeb, Rana Saad Mohammed, Muntaha Khudair Abbass
Abstract:
For more efficient and faster video summarization, this paper presents a surveillance video summarization method. The presented method works to improve the video summarization technique. This method depends on temporal differencing to extract the most important data from a large video stream. It uses histogram differencing and the Sum Conditional Variance, which is robust against illumination variations, in order to extract moving objects. The experimental results showed that the presented method gives better output compared with temporal-differencing-based summarization techniques.
Keywords: Temporal differencing, video summarization, histogram differencing, sum conditional variance.
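A minimal version of the histogram-differencing step used for key-frame selection can be sketched as follows; frame loading, the Sum Conditional Variance component and the threshold value are placeholders rather than the paper's implementation.

```python
import numpy as np

def gray_histogram(frame, bins=64):
    """Normalized intensity histogram of a grayscale frame (2-D array)."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 255), density=True)
    return hist

def summarize(frames, threshold=0.25):
    """Keep frames whose histogram differs enough from the last kept frame.

    `frames` is an iterable of grayscale frames; the L1 histogram-distance
    threshold is an illustrative value."""
    keyframes, prev_hist = [], None
    for idx, frame in enumerate(frames):
        hist = gray_histogram(frame)
        if prev_hist is None or np.abs(hist - prev_hist).sum() > threshold:
            keyframes.append(idx)
            prev_hist = hist
    return keyframes
```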
593 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network
Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem
Abstract:
The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series, so we adopt a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Secondly, the residuals from the k-factor GARMA model are used as a proxy for the conditional variance; these residuals are predicted using two different approaches. In the first approach, a local linear wavelet neural network (LLWNN) model is developed to predict the conditional variance using the back-propagation learning algorithm. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity process (G-GARCH) is adopted, and the parameters of the k-factor GARMA-G-GARCH model are estimated using the wavelet methodology based on the discrete wavelet packet transform (DWPT) approach. The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.
Keywords: k-factor GARMA, LLWNN, G-GARCH, electricity price, forecasting.
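The long-memory component mentioned above is the k-factor Gegenbauer ARMA process; its standard definition is reproduced below for reference, with B the backshift operator (the G-GARCH specification of the innovation variance is omitted).

```latex
% k-factor GARMA process
\[
\phi(B) \prod_{i=1}^{k} \left( 1 - 2\nu_i B + B^2 \right)^{d_i} (y_t - \mu)
  = \theta(B)\, \varepsilon_t
\]
% \nu_i = \cos(\lambda_i) locates the i-th Gegenbauer frequency \lambda_i,
% d_i is the corresponding long-memory parameter, and \phi(B), \theta(B)
% are the usual short-memory AR and MA polynomials.
```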
592 Computer Verification in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Beside some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
591 The Possibility-Probability Relationship for Bloodstream Concentrations of Physiologically Active Substances
Authors: Arkady Bolotin
Abstract:
If a possibility distribution and a probability distribution describe values x of one and the same system or process x(t), can they be related to each other? Though in general the possibility and probability distributions might not be connected at all, we can assume that in some particular cases there is an association linking them. In the presented paper, we consider distributions of bloodstream concentrations of physiologically active substances and propose that the probability of observing a concentration x of a substance X can be produced from the possibility of the event X = x. The proposed assumptions and resulting theoretical distributions are tested against data obtained from various panel studies of the bloodstream concentrations of different physiologically active substances in patients as well as healthy adults.
Keywords: Possibility distributions, possibility-probability relationship.
590 An Overview of Handoff Techniques in Cellular Networks
Authors: Nasıf Ekiz, Tara Salih, Sibel Küçüköner, Kemal Fidanboylu
Abstract:
Continuation of an active call is one of the most important quality measures in cellular systems. The handoff process enables a cellular system to provide such a facility by transferring an active call from one cell to another. Different approaches have been proposed and applied in order to achieve better handoff service. The principal parameters used to evaluate handoff techniques are the forced termination probability and the call blocking probability. Mechanisms such as guard channels and the queuing of handoff calls decrease the forced termination probability while increasing the call blocking probability. In this paper, we present an overview of the issues related to handoff initiation and decision and discuss the different types of handoff techniques available in the literature.
Keywords: Handoff, Forced Termination Probability, Blocking probability, Handoff Initiation, Handoff Decision, Handoff Prioritization Schemes.
589 Computation of Probability Coefficients using Binary Decision Diagram and their Application in Test Vector Generation
Authors: Ashutosh Kumar Singh, Anand Mohan
Abstract:
This paper deals with the efficient computation of probability coefficients, which offer computational simplicity as compared to spectral coefficients. It eliminates the need for inner-product evaluations in determining the signature of a combinational circuit realizing a given Boolean function. Methods for the computation of probability coefficients using a transform matrix, a fast transform method and a BDD are given. Theoretical relations for the achievable computational advantage, in terms of the additions required to compute all 2^n probability coefficients of an n-variable function, have been developed. It is shown that for n ≥ 5, only 50% of the additions are needed to compute all probability coefficients as compared to spectral coefficients. The fault detection techniques based on the spectral signature can also be used with the probability signature to offer a computational advantage.
Keywords: Binary Decision Diagrams, Spectral Coefficients, Fault detection
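The flavour of BDD-based computation involved can be illustrated with the classical single-pass evaluation of an output probability over a reduced ordered BDD, in which each node combines its children's values weighted by its input probability; the paper's probability coefficients build on this style of traversal. The tiny hand-built BDD below (for f = x0 AND (x1 OR x2)) is purely illustrative.

```python
# Probability that a Boolean function is 1 when input x_i is 1 with
# probability p[i], computed bottom-up over a BDD with memoization.

class Node:
    __slots__ = ("var", "low", "high")
    def __init__(self, var, low, high):
        self.var, self.low, self.high = var, low, high

ZERO, ONE = "0", "1"                   # terminal nodes

def prob_one(node, p, memo=None):
    """P(f = 1) for the function rooted at `node`, inputs independent."""
    if memo is None:
        memo = {}
    if node is ZERO:
        return 0.0
    if node is ONE:
        return 1.0
    if id(node) not in memo:
        memo[id(node)] = ((1.0 - p[node.var]) * prob_one(node.low, p, memo)
                          + p[node.var] * prob_one(node.high, p, memo))
    return memo[id(node)]

# f = x0 AND (x1 OR x2), variable order x0 < x1 < x2
n2 = Node(2, ZERO, ONE)                # x2
n1 = Node(1, n2, ONE)                  # x1 OR x2
root = Node(0, ZERO, n1)               # x0 AND (x1 OR x2)

print(prob_one(root, p={0: 0.5, 1: 0.5, 2: 0.5}))   # 0.5 * 0.75 = 0.375
```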
588 Spatial Time Series Models for Rice and Cassava Yields Based On Bayesian Linear Mixed Models
Authors: Panudet Saengseedam, Nanthachai Kantanantha
Abstract:
This paper proposes a linear mixed model (LMM) with spatial effects to forecast rice and cassava yields in Thailand at the same time. A multivariate conditional autoregressive (MCAR) model is assumed to represent the spatial effects. A Bayesian method is used for parameter estimation via Gibbs sampling Markov Chain Monte Carlo (MCMC). The model is applied to monthly rice and cassava yield data extracted from the Office of Agricultural Economics, Ministry of Agriculture and Cooperatives of Thailand. The results show that the proposed model performs better in most provinces, in both the fitting part and the validation part, compared to the simple exponential smoothing and conditional autoregressive (CAR) models from our previous study.
Keywords: Bayesian method, Linear mixed model, Multivariate conditional autoregressive model, Spatial time series.
587 Determination of Sensitive Transmission Lines Due to the Effect of Protection System Hidden Failure in a Critical System Cascading Collapse
Authors: N. A. Salim, M. M. Othman, I. Musirin, M. S. Serwan
Abstract:
Protection system hidden failures have been identified as one of the main causes of system cascading collapse resulting in power system instability. In this paper, a systematic approach is presented in order to identify the probability of a system cascading collapse by taking into consideration the effect of protection system hidden failure. This includes the accurate calculation of the hidden failure probability, as it has a significant impact on the findings of the probability of system cascading collapse. The probability of a system cascading collapse is then used to identify the initial tripping of sensitive transmission lines which will contribute to a critical system cascading collapse. Based on the results obtained from this study, it is important to decide on an accurate value of the hidden failure probability, as it will affect the probability of a system cascading collapse.
Keywords: Critical system cascading collapse, hidden failure, probability of cascading collapse, sensitive transmission lines.
586 An Approaching Index to Evaluate a Forward Collision Probability
Authors: Yuan-Lin Chen
Abstract:
This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. The time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. The Mamdani fuzzy inference algorithm is presented, combining TFCPI and HFCPI to calculate the approaching collision probability index of the vehicle. The AFCPI is easy to understand, even for a driver without any professional knowledge of the vehicle field. At the same time, the driver's behavior is taken into account to suit each driver. For the approaching index, the value 0 indicates a 0% probability of forward collision, and the values 0.5 and 1 indicate 50% and 100% probabilities of forward collision, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents when driving on the highway.
Keywords: Approaching index, forward collision probability, time to collision, time headway.
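The two quantities underlying the index are defined in the usual way as follows, with R the range to the lead vehicle, v_f the following vehicle's speed and v_l the lead vehicle's speed; the Mamdani rule base that fuses them into the AFCPI is not reproduced here.

```latex
% Time to collision (defined while the gap is closing) and time headway
\[
TTC = \frac{R}{v_f - v_l} \quad (v_f > v_l),
\qquad
TH = \frac{R}{v_f}
\]
% Small TTC or TH values correspond to a high forward collision
% probability; the fuzzy inference maps them onto the index range [0, 1].
```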
585 Ruin Probability for a Markovian Risk Model with Two-type Claims
Authors: Dongdong Zhang, Deran Zhang
Abstract:
In this paper, a Markovian risk model with two-type claims is considered. In such a risk model, the occurrences of the two claim types are described by two point processes {N_i(t), t ≥ 0}, i = 1, 2, where N_i(t) is the number of jumps during the interval (0, t] of the Markov jump process {X_i(t), t ≥ 0}. The ruin probability ψ(u) of a company facing such a risk model is mainly discussed. An integral equation satisfied by the ruin probability ψ(u) is obtained, and bounds for the convergence rate of the ruin probability ψ(u) are given by using the key renewal theorem.
Keywords: Risk model, ruin probability, Markov jump process, integral equation.
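For orientation, the ruin probability of a surplus process with initial capital u, premium rate c and aggregate claims S(t) is defined as below; the specific integral equation and bounds derived in the paper for the two-type Markovian case are not reproduced here.

```latex
% Surplus process and ruin probability for initial capital u
\[
U(t) = u + c\,t - S(t),
\qquad
\psi(u) = P\!\left( \inf_{t \ge 0} U(t) < 0 \;\middle|\; U(0) = u \right)
\]
% S(t) denotes the aggregate claim amount generated by both claim types
% up to time t, driven by the counting processes N_1(t) and N_2(t).
```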
584 Application of Adaptive Genetic Algorithm in Function Optimization
Authors: Panpan Xu, Shulin Sui
Abstract:
The crossover probability and mutation probability are two important factors in the genetic algorithm. The adaptive genetic algorithm can improve the convergence performance of the genetic algorithm; in it, the crossover probability and mutation probability are adaptively adjusted as the fitness values change. We apply the adaptive genetic algorithm to a function optimization problem. The numerical experiment shows that the adaptive genetic algorithm improves the convergence speed and avoids premature convergence to local optima.
Keywords: Genetic algorithm, Adaptive genetic algorithm, Function optimization.
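To make the adaptation concrete, the sketch below implements a small real-coded GA in which the crossover and mutation probabilities are lowered for above-average individuals and raised otherwise, following the widely used Srinivas-Patnaik style of adaptation; the objective function, the constants and the exact rule are assumptions of this illustration, not necessarily the authors' choices.

```python
import numpy as np

def fitness(x):                          # hypothetical objective to maximize
    return -np.sum(x ** 2, axis=-1)      # optimum at x = 0

def adaptive_prob(f, f_max, f_avg, k_hi, k_lo):
    """Lower probability for above-average individuals, higher otherwise."""
    if f >= f_avg and f_max > f_avg:
        return k_hi * (f_max - f) / (f_max - f_avg)
    return k_lo

def adaptive_ga(dim=5, pop_size=40, gens=200, lo=-5.0, hi=5.0):
    rng = np.random.default_rng(0)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(gens):
        fit = fitness(pop)
        f_max, f_avg = fit.max(), fit.mean()
        # binary tournament selection
        idx = rng.integers(pop_size, size=(pop_size, 2))
        winners = np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            f_pair = max(fitness(parents[i]), fitness(parents[i + 1]))
            pc = adaptive_prob(f_pair, f_max, f_avg, k_hi=1.0, k_lo=1.0)
            if rng.random() < pc:                    # arithmetic crossover
                a = rng.random()
                children[i] = a * parents[i] + (1 - a) * parents[i + 1]
                children[i + 1] = (1 - a) * parents[i] + a * parents[i + 1]
        for i in range(pop_size):
            pm = adaptive_prob(fitness(children[i]), f_max, f_avg,
                               k_hi=0.5, k_lo=0.5)
            mask = rng.random(dim) < pm              # per-gene mutation
            children[i, mask] += rng.normal(0.0, 0.1, size=int(mask.sum()))
        children = np.clip(children, lo, hi)
        children[0] = pop[np.argmax(fit)]            # elitism: keep current best
        pop = children
    best = pop[np.argmax(fitness(pop))]
    return best, float(fitness(best))

if __name__ == "__main__":
    x_best, f_best = adaptive_ga()
    print("best point:", np.round(x_best, 3), "fitness:", f_best)
```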
583 Determination of the Best Fit Probability Distribution for Annual Rainfall in Karkheh River at Iran
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best-fit probability distribution of annual rainfall based on a 50-year sample (1966-2015) in the Karkheh river basin in Iran, using six probability distributions: Normal, 2-Parameter Log Normal, 3-Parameter Log Normal, Pearson Type 3, Log Pearson Type 3 and Gumbel. The best-fit probability distribution was selected using the Stormwater Management and Design Aid (SMADA) software, based on the Residual Sum of Squares (R.S.S) between observed and estimated values. Based on the R.S.S values of the fit tests, the Log Pearson Type 3 and then the Pearson Type 3 distributions were found to be the best-fit probability distributions at the Jelogir Majin and Pole Zal rainfall gauging stations. The annual values of expected rainfall were calculated using the best-fit probability distributions and can be used by hydrologists and design engineers in future research in the studied region and other regions of the world.
Keywords: Log Pearson Type 3, SMADA, rainfall, Karkheh River.
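The same selection-by-RSS idea can be sketched outside SMADA with standard statistical libraries: fit each candidate distribution, evaluate its quantiles at plotting positions, and pick the distribution with the smallest residual sum of squares against the sorted observations. The rainfall values below are placeholders, and the two log-normal variants of the study collapse here to a single scipy lognorm fit.

```python
import numpy as np
from scipy import stats

# Best-fit selection by residual sum of squares (RSS) between observed
# annual rainfall and fitted distribution quantiles (illustrative data).
rainfall = np.array([310.2, 450.7, 389.5, 512.3, 298.8, 401.1,
                     366.4, 478.9, 354.0, 423.6])          # mm, hypothetical

candidates = {
    "Normal": stats.norm,
    "Log Normal": stats.lognorm,
    "Pearson Type 3": stats.pearson3,
    "Log Pearson Type 3": stats.pearson3,    # fitted to log-rainfall below
    "Gumbel": stats.gumbel_r,
}

obs = np.sort(rainfall)
q = np.arange(1, len(obs) + 1) / (len(obs) + 1)   # Weibull plotting positions

rss = {}
for name, dist in candidates.items():
    data = np.log(rainfall) if name == "Log Pearson Type 3" else rainfall
    params = dist.fit(data)
    est = dist.ppf(q, *params)
    if name == "Log Pearson Type 3":
        est = np.exp(est)                    # back-transform to rainfall units
    rss[name] = float(np.sum((obs - est) ** 2))

best = min(rss, key=rss.get)
print({k: round(v, 1) for k, v in rss.items()})
print("best-fit distribution:", best)
```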
582 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis
Authors: Komeil Valipourian
Abstract:
Urban advances and the growing need for developing infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of Bangkok's Metro as a case study. For this, a numerical probability model has been developed based on the Finite Difference Method and a Monte Carlo sampling approach. The results indicate that disregarding the issue of probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one of the applications of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, remaining within the allowable range, and helps improve economic conditions while maintaining mechanical efficiency. With regard to the lack of efficient design in most deep excavations, an attempt was made, by considering geometrical and geotechnical variability, to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out a full probability analysis.
Keywords: Numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method, FDM.
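The failure probabilities quoted above come from Monte Carlo sampling over the finite-difference model; in its basic form the estimator is simply the sampled fraction of realizations violating the limit state, as written below (the project-specific limit-state function is not reproduced).

```latex
% Crude Monte Carlo estimator of the failure probability
\[
P_f = P\left( g(\mathbf{X}) \le 0 \right)
  \approx \hat{P}_f = \frac{1}{N} \sum_{i=1}^{N}
    \mathbf{1}\!\left[ g\big(\mathbf{x}^{(i)}\big) \le 0 \right]
\]
% g(.) is the limit-state function (e.g. allowable minus computed
% horizontal wall displacement) and x^{(i)} are samples of the
% geometrical and geotechnical random variables.
```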
581 Probability of Globality
Authors: Eva Eggeling, Dieter W. Fellner, Torsten Ullrich
Abstract:
The objective of global optimization is to find the globally best solution of a model. Nonlinear models are ubiquitous in many applications and their solution often requires a global search approach; i.e. for a function f from a set A ⊂ R^n to the real numbers, an element x_0 ∈ A is sought such that ∀ x ∈ A : f(x_0) ≤ f(x). Depending on the field of application, the question of whether a found solution x_0 is not only a local minimum but a global one is very important. This article presents a probabilistic approach to determine the probability of a solution being a global minimum. The approach is independent of the global search method used and only requires a bounded, convex parameter domain A as well as a Lipschitz-continuous function f whose Lipschitz constant does not need to be known.
Keywords: global optimization, probability theory, probability of globality
580 Credit Spread Changes and Volatility Spillover Effects
Authors: Thomas I. Kounitis
Abstract:
The purpose of this paper is to investigate the influence of a number of variables on the conditional mean and conditional variance of credit spread changes. The empirical analysis in this paper is conducted within the context of bivariate GARCH-in-Mean models, using the so-called BEKK parameterization. We show that credit spread changes are determined by interest-rate and equity-return variables, which is in line with theory as provided by the structural models of default. We also identify the volatility of credit spread changes as an important determinant of credit spread changes, and provide evidence on the transmission of volatility between the variables under study.
Keywords: Credit spread changes, GARCH-in-Mean models, structural framework, volatility transmission.
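For reference, a bivariate GARCH(1,1)-in-Mean model under the BEKK parameterization can be written as below; the lag orders, the exact form of the in-mean term and any exogenous regressors in the authors' specification may differ from this generic sketch.

```latex
% Bivariate BEKK-GARCH(1,1)-in-Mean sketch
\[
\mathbf{y}_t = \boldsymbol{\mu} + \Lambda\, \operatorname{vech}(H_t) + \boldsymbol{\varepsilon}_t,
\qquad
\boldsymbol{\varepsilon}_t \mid \mathcal{F}_{t-1} \sim N(\mathbf{0}, H_t)
\]
\[
H_t = C'C + A'\, \boldsymbol{\varepsilon}_{t-1} \boldsymbol{\varepsilon}_{t-1}'\, A + B'\, H_{t-1}\, B
\]
% C is upper triangular; the quadratic forms keep H_t positive definite,
% and the off-diagonal elements of A and B carry the volatility
% spillovers between the two series.
```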
579 Approximation for Average Error Probability of BPSK in the Presence of Phase Error
Authors: Yeonsoo Jang, Dongweon Yoon, Ki Ho Kwon, Jaeyoon Lee, Wooju Lee
Abstract:
Phase error in communication systems degrades error performance. In this paper, we present a simple approximation for the average error probability of binary phase shift keying (BPSK) in the presence of a phase error having a uniform distribution on arbitrary intervals. For the simple approximation, we use the symmetry and periodicity of a sinusoidal function. An approximate result for the average error probability is derived, and its accuracy is verified through comparison with simulation results.
Keywords: Average error probability, Phase shift keying, Phase error
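For reference, with a fixed carrier phase error φ the conditional bit error probability of coherent BPSK is the standard Q-function expression below, and the average error probability follows by averaging over the uniformly distributed phase error; the closed-form approximation developed in the paper is not reproduced here.

```latex
% BPSK bit error probability with phase error \phi and SNR per bit \gamma_b
\[
P_b(\phi) = Q\!\left( \sqrt{2\gamma_b}\, \cos\phi \right),
\qquad
\bar{P}_b = \frac{1}{b - a} \int_{a}^{b} Q\!\left( \sqrt{2\gamma_b}\, \cos\phi \right) d\phi,
\qquad
Q(x) = \frac{1}{\sqrt{2\pi}} \int_{x}^{\infty} e^{-t^2/2}\, dt
\]
% \phi is uniform on [a, b]; the approximation in the paper exploits the
% symmetry and periodicity of \cos\phi inside the Q-function.
```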
578 The Giant Component in a Random Subgraph of a Weak Expander
Authors: Yilun Shang
Abstract:
In this paper, we investigate the appearance of the giant component in random subgraphs G(p) of a given large finite graph family G_n = (V_n, E_n), in which each edge is present independently with probability p. We show that if the graph G_n satisfies a weak isoperimetric inequality and has bounded degree, then the probability p under which G(p) has a giant component of linear order with some constant probability is bounded away from zero and one. In addition, we prove that the probability of an abnormally large order of the giant component decays exponentially. When a contact graph is modeled as G_n, our result is of special interest in the study of the spread of infectious diseases or the identification of communities in various social networks.
Keywords: subgraph, expander, random graph, giant component, percolation.
577 The Locker Problem with Empty Lockers
Authors: David Avis, Luc Devroye, Kazuo Iwama
Abstract:
We consider a cooperative game played by n players against a referee. The players' names are randomly distributed among n lockers, with one name per locker. Each player can open up to half the lockers, and each player must find his name. Once the game starts, the players may not communicate. It has previously been shown that, quite surprisingly, an optimal strategy exists for which the success probability is never worse than 1 − ln 2 ≈ 0.306. In this paper we consider an extension where the number of lockers is greater than the number of players, so that some lockers are empty. We show that the players may still win with positive probability even if there is a constant number k of empty lockers. We show that for each fixed probability p, there is a constant c so that the players can win with probability at least p if they are allowed to open cn lockers.
Keywords: Locker problem, pointer-following algorithms.
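The pointer-following strategy referenced in the keywords can be checked empirically with a short simulation. The sketch below estimates the success probability of the classical cycle-following strategy without empty lockers (the k-empty-locker extension analysed in the paper is not implemented); for large n the estimate approaches 1 − ln 2 ≈ 0.306.

```python
import math
import random

def play_once(n, max_opens=None):
    """One round of the pointer-following strategy with n players/lockers.

    Locker i contains the name perm[i]; player p opens locker p first and
    then follows the names found. Everyone succeeds iff every cycle of the
    permutation has length at most n // 2."""
    if max_opens is None:
        max_opens = n // 2
    perm = list(range(n))
    random.shuffle(perm)
    for player in range(n):
        locker = player
        for _ in range(max_opens):
            if perm[locker] == player:
                break                  # found own name
            locker = perm[locker]
        else:
            return False               # this player ran out of openings
    return True

def estimate_success(n=100, trials=20000):
    wins = sum(play_once(n) for _ in range(trials))
    return wins / trials

if __name__ == "__main__":
    print("simulated success probability:", estimate_success())
    print("1 - ln 2 =", 1 - math.log(2))
```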