Search results for: complement of probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1456

1396 Young’s Modulus Variability: Influence on Masonry Vault Behavior

Authors: Abdelmounaim Zanaz, Sylvie Yotte, Fazia Fouchal, Alaa Chateauneuf

Abstract:

This paper presents a methodology for the probabilistic assessment of the bearing capacity and the prediction of the failure mechanism of masonry vaults at the ultimate state, taking into account the natural variability of the Young’s modulus of the stones. First, the computation model is explained. The assumed failure mode is the most commonly reported one, i.e. the four-hinge mechanism. Based on this assumption, the study of a vault composed of 16 segments is presented. The Young’s modulus of the segments is treated as a random variable defined by a mean value and a coefficient of variation CV. A relationship linking the vault bearing capacity to the modulus variation of the voussoirs is proposed. The failure mechanisms, in addition to the one observed in the deterministic case, are identified for each CV value, together with their probabilities of occurrence. The results show that the probability of occurrence of the mechanism observed in the deterministic case decreases as CV increases, while the number of other mechanisms and their probabilities of occurrence increase with the coefficient of variation of the Young’s modulus. This means that if significant variability of the Young’s modulus of the segments is established, taking it into account in computations becomes mandatory, both for determining the vault bearing capacity and for predicting its failure mechanism.

Keywords: masonry, mechanism, probability, variability, vault

Procedia PDF Downloads 415
1395 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects

Authors: Obioma Uche

Abstract:

Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
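
As a rough sketch of the second-order idea (the simple cash-flow model and every number below are assumptions for illustration, not values from the study): the outer loop samples the uncertain parameters of the input distributions, and the inner loop samples the economic and geological variables given those parameters, so each outer draw yields its own NPV distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

def npv(price, capex, opex, reserves, rate=0.10, life=10):
    # hypothetical single-project model: reserves produced evenly over the life
    annual_revenue = price * reserves / life
    discount = sum(1.0 / (1.0 + rate) ** t for t in range(1, life + 1))
    return -capex + (annual_revenue - opex) * discount

outer, inner = 200, 2000
p10 = np.empty(outer)
for i in range(outer):
    # outer (second-order) loop: uncertainty about the distribution parameters
    price_mean = rng.normal(70.0, 5.0)          # uncertain mean price, $/bbl
    reserve_mean = rng.normal(8.0e6, 1.0e6)     # uncertain mean reserves, bbl
    # inner (first-order) loop: sampling the variables given those parameters
    price = rng.lognormal(np.log(price_mean), 0.20, inner)
    capex = rng.normal(2.5e8, 2.0e7, inner)
    opex = rng.normal(1.2e7, 1.0e6, inner)
    reserves = rng.normal(reserve_mean, 5.0e5, inner)
    p10[i] = np.percentile(npv(price, capex, opex, reserves), 10)

# the spread of P10 across outer draws is exactly the extra information
# a simple (first-order) MC run cannot provide
print(f"P10 NPV: min {p10.min():.3e}, median {np.median(p10):.3e}, max {p10.max():.3e}")
```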

Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis

Procedia PDF Downloads 299
1394 Modeling Binomial Dependent Distribution of the Values: Synthesis Tables of Probabilities of Errors of the First and Second Kind of Biometrics-Neural Network Authentication System

Authors: B. S. Akhmetov, S. T. Akhmetova, D. N. Nadeyev, V. Yu. Yegorov, V. V. Smogoonov

Abstract:

Probabilities of errors of the first and second kind are estimated for non-ideal biometrics-neural transducers with 256 outputs, and nomograms of the 'own' and 'alien' error probabilities are constructed from the mathematical expectation and standard deviation of the normalized Hamming measures.

Keywords: modeling, errors, probability, biometrics, neural network, authentication

Procedia PDF Downloads 459
1393 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces the TST. It uses the number of idle preambles observed in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB has no direct information about it. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
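
A minimal sketch of the estimation step (the preamble count M, the candidate backlog range, the binomial idle-count approximation, and the load-matching rule are all assumptions for illustration, not the paper's exact estimator):

```python
import numpy as np
from scipy.stats import binom

M = 54                         # preambles per RA slot (assumed value)
N_range = np.arange(1, 501)    # candidate numbers of backlogged devices

def estimate_backlog(idle_observed):
    # each of N devices picks one of M preambles uniformly, so a given preamble
    # stays idle with probability (1 - 1/M)^N; the idle count is approximated
    # as Binomial(M, p_idle), giving a likelihood over the candidate N values
    p_idle = (1.0 - 1.0 / M) ** N_range
    likelihood = binom.pmf(idle_observed, M, p_idle)
    posterior = likelihood / likelihood.sum()    # uniform prior over N_range
    return int(N_range[np.argmax(posterior)])    # MAP estimate of the backlog

def transmission_probability(n_hat):
    # load matching: let the expected number of transmitting devices equal M
    return min(1.0, M / n_hat)

n_hat = estimate_backlog(idle_observed=12)
print(f"estimated backlog = {n_hat}, p_tx = {transmission_probability(n_hat):.3f}")
```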

Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation

Procedia PDF Downloads 121
1392 Parameter Interactions in the Cumulative Prospect Theory: Fitting the Binary Choice Experiment Data

Authors: Elzbieta Babula, Juhyun Park

Abstract:

Tversky and Kahneman’s cumulative prospect theory assumes symmetric probability cumulation with regard to the reference point within decision weights. Theoretically, this model should be invariant under a change in the direction of probability cumulation. In the present study, this phenomenon is investigated by creating a reference model that allows the parameter interactions in cumulative prospect theory specifications to be verified. Simultaneous parametric fitting of the utility and weighting functions is applied to binary choice data from the experiment. The results show that the flexibility of the probability weighting function is a crucial characteristic for preventing parameter interactions when estimating cumulative prospect theory.
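
For reference, the standard Tversky-Kahneman (1992) functional forms that such fits typically start from (the parameter defaults are the published point estimates, used here only for illustration; a fitted specification would estimate them jointly from the choice data):

```python
import numpy as np

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    # Tversky-Kahneman value function around the reference point 0
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** beta)

def weight(p, gamma=0.61):
    # inverse-S-shaped probability weighting function; its curvature gamma
    # is the "flexibility" the abstract refers to
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def cpt_binary(x_gain, x_loss, p_gain, g_plus=0.61, g_minus=0.69):
    # for a binary gamble (one gain, one loss outcome), cumulative weighting
    # reduces to weighting the gain and loss probabilities separately,
    # each with its own weighting parameter
    return (weight(p_gain, g_plus) * value(x_gain)
            + weight(1.0 - p_gain, g_minus) * value(x_loss))

print(cpt_binary(100.0, -50.0, 0.5))   # CPT value of a 50/50 gamble
```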

Keywords: binary choice experiment, cumulative prospect theory, decision weights, parameter interactions

Procedia PDF Downloads 187
1391 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess the safety of slopes under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but the treatment of uncertainties is introduced, and the slope failure probability is calculated as the probability that SF<1. As the main hazard is rainfall on the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum expected life-cycle cost). For the example, the optimal mitigation measure is the reduction of the slope inclination angle.
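
A toy Monte Carlo version of the formulation (all distributions and numbers are placeholders, not the Mexican case-study data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# illustrative random inputs
cohesion = rng.normal(25.0, 4.0, n)        # kPa
tan_phi = rng.normal(0.55, 0.07, n)        # friction coefficient
rain = rng.exponential(30.0, n)            # storm rainfall intensity, mm/h

resisting = 2.5 * cohesion + 100.0 * tan_phi
driving = 80.0 * (1.0 + 0.004 * rain)      # heavier rain raises the driving term
sf = resisting / driving

pf = np.mean(sf < 1.0)                     # failure probability, P(SF < 1)

# expected life-cycle cost: mitigation cost plus discounted expected failure losses;
# repeating this for each mitigation alternative identifies the minimum-ELCC one
C0, Cf, rate, years = 1.0e6, 2.0e7, 0.05, 50
pv = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
print(f"P(SF < 1) = {pf:.4f}, E[LCC] = {C0 + Cf * pf * pv:,.0f}")
```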

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 128
1390 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major part of vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. This model for the ignition probability of a new cathode spot was proposed for two typical situations: pure isotropic random walk in the absence of an external magnetic field, and retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.
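
A toy planar version of the two situations (step length, drift magnitude, and walker counts are assumptions; the paper's three-dimensional model is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(8)

def ignition_displacements(n_steps=1000, drift=0.0, step=1.0, walkers=5000):
    # 2-D walk of the spot on the cathode plane: isotropic when drift = 0,
    # with a retrograde bias along -x when an external field is applied
    theta = 2.0 * np.pi * rng.random((walkers, n_steps))
    dx = step * np.cos(theta) - drift           # retrograde drift per step
    dy = step * np.sin(theta)
    return dx.sum(axis=1), dy.sum(axis=1)       # final spot displacements

for drift in (0.0, 0.2):
    x, y = ignition_displacements(drift=drift)
    print(f"drift {drift}: mean x = {x.mean():8.2f}, std x = {x.std():6.2f}")
```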

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 405
1389 The Impact of Governance Criteria in the Supplier Selection Process of Large German Companies

Authors: Christoph Köster

Abstract:

Supplier selection is one of the key challenges in supply chain management and can be considered a multi-criteria decision-making (MCDM) problem. In the 1960s, it evolved from considering only economic criteria, such as price, quality, and performance, to including environmental and social criteria nowadays. Although receiving considerable attention from scholars and practitioners over the past decades, existing research has not considered governance criteria so far. This is, however, surprising, as ESG (environmental, social, and governance) criteria have gained considerable attention. In order to complement ESG criteria in the supplier selection process, this study investigates German DAX and MDAX companies and evaluates the impact of governance criteria along their supplier selection process. Moreover, it proposes a set of criteria for the respective process steps. Specifically, eleven criteria for the first process step and five criteria for the second process step are identified. This paper contributes to a better understanding of the supplier selection process by elucidating the relevance of governance criteria in the supplier selection process and providing a set of empirically developed governance criteria. These results can be applied by practitioners to complement the criteria set in the supplier selection process and thus balance economic, environmental, social, and governance targets.

Keywords: ESG, governance, sustainable supplier selection, sustainability

Procedia PDF Downloads 90
1388 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit to failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation of the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small-sample cases. The maximum likelihood estimation method is also applied in this study.
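
A hedged sketch of probability weighted moment (PWM) estimation, assuming the two-parameter finite-range form F(x) = (x/θ)^p on [0, θ] (the location shift of the three-parameter version is omitted for brevity; for this form, β₀ = θp/(1+p) and β₁ = θp/(1+2p), which can be inverted for p and θ):

```python
import numpy as np

rng = np.random.default_rng(6)

# simulate from the assumed finite-range form via the inverse CDF x = theta*U^(1/p)
p_true, theta_true = 2.5, 100.0
x = theta_true * rng.random(500) ** (1.0 / p_true)

# sample probability weighted moments b0 and b1 (order statistics based)
xs = np.sort(x)
n = len(xs)
i = np.arange(1, n + 1)
b0 = xs.mean()
b1 = np.mean(xs * (i - 1) / (n - 1))

# invert the moment relations: b0/b1 = (1+2p)/(1+p)
R = b0 / b1
p_hat = (R - 1.0) / (2.0 - R)
theta_hat = b0 * (1.0 + p_hat) / p_hat
print(f"p_hat = {p_hat:.2f}, theta_hat = {theta_hat:.1f}")
```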

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 243
1387 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (contact tracing probability small, or the probability of detecting index cases small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution, in particular, fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 46
1386 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with the calculated minimum capacitive value were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
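
For illustration, the listed reliability indices under an assumed Weibull time-to-failure law (the shape and scale below are placeholders, not the paper's fitted values):

```python
import numpy as np

beta, eta = 1.8, 12_000.0                  # assumed Weibull shape and scale (hours)
t = np.linspace(1.0, 30_000.0, 300)

f = (beta / eta) * (t / eta) ** (beta - 1) * np.exp(-((t / eta) ** beta))  # failure density
F = 1.0 - np.exp(-((t / eta) ** beta))     # cumulative failure distribution
R = 1.0 - F                                # survivor function (probability of success)
h = f / R                                  # hazard model

print(f"P(success at 10,000 h) = {np.exp(-((10_000.0 / eta) ** beta)):.3f}")
```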

Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation

Procedia PDF Downloads 533
1385 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. Long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data of a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as Generalized Extreme Value (GEV), normal, log-normal and log-Pearson. Goodness-of-fit tests such as Kolmogorov-Smirnov are used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. The trend analysis showed an increase in the long-term averages of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
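
A compact sketch of the margins-plus-copula construction (synthetic stand-in data; the Gumbel-Hougaard copula and all numbers are assumptions, not the fitted New York results):

```python
import numpy as np
from scipy.stats import genextreme, kendalltau

rng = np.random.default_rng(0)
# stand-ins for annual-maximum rainfall (mm) and water level (m) series
rain = rng.gumbel(80.0, 15.0, 60)
level = 0.02 * rain + rng.gumbel(1.5, 0.3, 60)

# fit GEV margins (scipy's genextreme shape c corresponds to -xi in GEV notation)
shape_r, loc_r, scale_r = genextreme.fit(rain)
shape_l, loc_l, scale_l = genextreme.fit(level)
r100 = genextreme.ppf(0.99, shape_r, loc_r, scale_r)   # marginal 100-yr rainfall
l100 = genextreme.ppf(0.99, shape_l, loc_l, scale_l)   # marginal 100-yr level

# Gumbel-Hougaard copula, theta obtained from Kendall's tau: theta = 1/(1 - tau)
tau, _ = kendalltau(rain, level)
theta = 1.0 / (1.0 - tau)

def gumbel_copula(u, v, th):
    return np.exp(-(((-np.log(u)) ** th + (-np.log(v)) ** th) ** (1.0 / th)))

# probability that BOTH variables exceed their marginal 100-year quantiles
u = v = 0.99
p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)
print(f"r100 = {r100:.1f} mm, l100 = {l100:.2f} m, theta = {theta:.2f}, "
      f"joint exceedance = {p_joint:.5f} (~{1.0 / p_joint:.0f} yr)")
```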

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 326
1384 Graphical Modeling of High Dimension Processes with an Environmental Application

Authors: Ali S. Gargoum

Abstract:

Graphical modeling plays an important role in providing efficient probability calculations in high-dimensional problems (computational efficiency). In this paper, we address one such problem, discussing fragmenting puff models and some distributional assumptions concerning models for the instantaneous emission readings and for the fragmenting process. A graphical representation, in terms of a junction tree, of the conditional probability breakdown of puffs and puff fragments is proposed.

Keywords: graphical models, influence diagrams, junction trees, Bayesian nets

Procedia PDF Downloads 370
1383 Location-Domination on Join of Two Graphs and Their Complements

Authors: Analen Malnegro, Gina Malacas

Abstract:

Dominating sets and related topics have been studied extensively in the past few decades. A dominating set of a graph G is a subset D of its vertex set V such that every vertex not in D is adjacent to at least one member of D. The domination number γ(G) is the number of vertices in a smallest dominating set for G. Some problems involving detection devices can be modeled with graphs. Finding the minimum number of devices needed, according to the type of devices and the necessity of locating the object, gives rise to locating-dominating sets. A subset S of vertices of a graph G is called a locating-dominating set, LD-set for short, if it is a dominating set and if every vertex v not in S is uniquely determined by the set of neighbors of v belonging to S. The location-domination number λ(G) is the minimum cardinality of an LD-set for G. The complement of a graph G is a graph Ḡ on the same vertex set such that two distinct vertices of Ḡ are adjacent if and only if they are not adjacent in G. An LD-set of a graph G is global if it is an LD-set of both G and its complement Ḡ. The global location-domination number λg(G) is defined as the minimum cardinality of a global LD-set of G. In this paper, global LD-sets on the join of two graphs are characterized. The global location-domination numbers of these graphs are also determined.
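
The definitions translate directly into a brute-force check; a minimal sketch (exponential in the graph size, so only suitable for small examples):

```python
from itertools import combinations

def is_ld_set(adj, S):
    # S is an LD-set iff every vertex outside S has a nonempty, unique
    # "signature" of neighbors inside S
    S = set(S)
    seen = set()
    for v in adj:
        if v in S:
            continue
        sig = frozenset(adj[v] & S)
        if not sig or sig in seen:      # not dominated, or not uniquely located
            return False
        seen.add(sig)
    return True

def location_domination_number(adj):
    # brute force over subsets in increasing size; matches the definition exactly
    for k in range(1, len(adj) + 1):
        for S in combinations(adj, k):
            if is_ld_set(adj, S):
                return k, set(S)

# example: the cycle C5
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(location_domination_number(adj))   # lambda(C5) = 2
```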

Keywords: dominating set, global locating-dominating set, global location-domination number, locating-dominating set, location-domination number

Procedia PDF Downloads 154
1382 On Virtual Coordination Protocol towards 5G Interference Mitigation: Modelling and Performance Analysis

Authors: Bohli Afef

Abstract:

Fifth-generation (5G) wireless systems are characterized by extreme densities of cell stations to meet higher future demand. Hence, interference management is a crucial challenge in 5G ultra-dense cellular networks. In contrast to the classical inter-cell interference coordination approach, which is no longer suited to the high density of cell tiers, this paper proposes a novel virtual coordination scheme based on a dynamic common cognitive monitor channel protocol to deal with the inter-cell interference issue. A tractable and flexible model for the coverage probability of a typical user is developed using stochastic geometry. The performance of the suggested protocol is analyzed both analytically and numerically in terms of coverage probability.
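
A minimal Monte Carlo stand-in for the stochastic geometry model (the PPP density, path-loss exponent, and SINR threshold are arbitrary; the paper derives the coverage probability analytically):

```python
import numpy as np

rng = np.random.default_rng(7)

def coverage_probability(lam=1e-5, alpha=4.0, sinr_th=1.0, noise=1e-13,
                         radius=5_000.0, trials=4_000):
    # user at the origin; base stations drawn as a Poisson point process (PPP)
    # in a disk; Rayleigh fading; association with the nearest base station
    covered = 0
    area = np.pi * radius ** 2
    for _ in range(trials):
        n = rng.poisson(lam * area)
        if n == 0:
            continue
        r = radius * np.sqrt(rng.random(n))     # distances of PPP points
        h = rng.exponential(1.0, n)             # Rayleigh fading power gains
        p_rx = h * r ** (-alpha)                # unit transmit power
        k = np.argmin(r)                        # serving (nearest) station
        interference = p_rx.sum() - p_rx[k]
        sinr = p_rx[k] / (interference + noise)
        covered += sinr > sinr_th
    return covered / trials

print(f"coverage probability ≈ {coverage_probability():.3f}")
```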

Keywords: ultra dense heterogeneous networks, dynamic common channel protocol, cognitive radio, stochastic geometry, coverage probability

Procedia PDF Downloads 297
1381 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation

Authors: Yaping Zhao

Abstract:

In the present study, exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are obtained by means of the stochastic averaging method. The nonlinearity of the systems contains power-law damping and the cross-product term of the power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are obtained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by an integral equation method. Some new results are acquired, and a novel method to deal with problems in nonlinear random vibration is proposed.
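
For context, the generic closed form behind such solutions (a standard stochastic-averaging result, not the paper's specific drift and diffusion for power-law damping): if averaging yields the Itô amplitude equation $dA = m(A)\,dt + \sigma(A)\,dW(t)$, the stationary FPK equation with vanishing probability flux gives

$$p_s(a) = \frac{C}{\sigma^2(a)} \exp\left( \int^a \frac{2\,m(u)}{\sigma^2(u)}\,du \right),$$

where $C$ is a normalization constant; the joint and marginal steady-state densities reported in the paper follow from expressions of this type.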

Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density

Procedia PDF Downloads 470
1380 Wind Fragility for Honeycomb Roof Cladding Panels Using Screw Pull-Out Capacity

Authors: Viriyavudh Sim, Woo Young Jung

Abstract:

The failure of roof cladding mostly occurs due to the failure of the connection between claddings and purlins, i.e. pull-out of the screw connecting the two parts when the pull-out load, e.g. from a typhoon, is higher than the resistance of the connection screw. As typhoon disasters in Korea are constantly on the rise, probabilistic risk assessment (PRA) has become a vital tool to evaluate the performance of civil structures. In this study, we attempted to determine the fragility of roof cladding with screw connections. An experimental study was performed to evaluate the pull-out resistance of screw joints between honeycomb panels and back frames. Subsequently, by means of the Monte Carlo simulation method, the probability of failure for this type of roof cladding was determined. The results show that the failure probability of roof cladding depends on the panel location on the roof; for example, the edge-most panel has the highest probability of failure.
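
An illustrative fragility computation in the spirit described above (the resistance statistics, pressure coefficient, and tributary area are invented placeholders, not the experimental values):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# screw pull-out resistance, a lognormal fitted to (hypothetical) test data, kN
resistance = rng.lognormal(np.log(2.0), 0.25, n)

rho, cp, a_trib = 1.225, 1.5, 0.36   # air density, suction coefficient, tributary area m^2

def wind_load_kN(v):
    # quasi-static suction carried by one screw, with lognormal model uncertainty
    return 0.5 * rho * cp * v ** 2 * a_trib / 1000.0 * rng.lognormal(0.0, 0.15, n)

print("V (m/s)  P(pull-out failure)")
for v in range(20, 81, 10):
    pf = np.mean(wind_load_kN(v) > resistance)   # fragility point at wind speed v
    print(f"{v:6d}  {pf:.4f}")
```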

Keywords: Monte Carlo Simulation, roof cladding, screw pull-out strength, wind fragility

Procedia PDF Downloads 229
1379 A Packet Loss Probability Estimation Filter Using Most Recent Finite Traffic Measurements

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

A packet loss probability (PLP) estimation filter with a finite memory structure is proposed to estimate the packet rate mean and variance of the input traffic process in real time while removing undesired system and measurement noises. The proposed PLP estimation filter is developed under a weighted least squares criterion using only the finite traffic measurements on the most recent window. The proposed PLP estimation filter is shown to have several inherent properties such as unbiasedness, deadbeat response, and robustness. A guideline for choosing an appropriate window length is described, since the window length can significantly affect the estimation performance. Using computer simulations, the proposed PLP estimation filter is shown to be superior to the Kalman filter for temporarily uncertain systems. One possible explanation for this is that the proposed PLP estimation filter achieves faster convergence of the filtered estimate as the window length M decreases.

Keywords: packet loss probability estimation, finite memory filter, infinite memory filter, Kalman filter

Procedia PDF Downloads 637
1378 Wind Power Density and Energy Conversion in Al-Adwas Ras-Huwirah Area, Hadhramout, Yemen

Authors: Bawadi M. A., Abbad J. A., Baras E. A.

Abstract:

This study was conducted to assess the wind energy resource in the Al-Adwas Ras-Huwirah area, Hadhramout Governorate, Yemen, using statistical calculations. The Weibull model and the SPSS program were applied on monthly and annual bases to analyze the wind energy resource, the wind energy conversion, and the turbine efficiency in the selected area. Wind speed data were obtained from NASA over a period of ten years (2010-2019) at a height of 50 m above ground level. Probability distributions are derived from the wind data and their distribution parameters are determined. The probability density function is fitted to the measured probability distributions on an annual basis. This study also involves locating preliminary sites for wind farms using Geographic Information System (GIS) technology. This further leads to maximizing the output energy from the most suitable wind turbines at the proposed site.
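
A minimal sketch of the Weibull-based calculation (synthetic stand-in data; the mean wind power density follows from E[v³] = c³Γ(1+3/k) for a Weibull law with shape k and scale c):

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma

# synthetic stand-in for one year of hourly 50 m wind speeds (m/s)
rng = np.random.default_rng(5)
v = weibull_min.rvs(2.0, loc=0.0, scale=7.5, size=8760, random_state=rng)

# fit the Weibull shape k and scale c with the location fixed at zero
k, _, c = weibull_min.fit(v, floc=0.0)

rho = 1.225                                       # air density, kg/m^3
wpd = 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)   # mean wind power density, W/m^2
print(f"k = {k:.2f}, c = {c:.2f} m/s, wind power density = {wpd:.0f} W/m^2")
```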

Keywords: wind speed analysis, Yemen wind energy, wind power density, Weibull distribution model

Procedia PDF Downloads 54
1377 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach

Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong

Abstract:

Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures, which can cause severe consequences including enormous economic losses as well as personnel casualties. It is therefore important to ensure the integrity and efficiency of corroding pipelines, considering the value of the wall thickness, which plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled during the pipe purchase stage. For example, the API_SPEC_5L standard regulates the allowable tolerance of the wall thickness from the specified value during pipe purchase. The allowable wall thickness tolerance is used to determine the wall thickness distribution characteristics, such as the mean value, standard deviation and distribution type. Taking the uncertainties of the input variables in the burst limit-state function into account, a reliability approach rather than a deterministic approach is used to evaluate the failure probability. Moreover, the cost of pipe purchase is influenced by the allowable wall thickness tolerance: stricter control of the wall thickness usually corresponds to a higher pipe purchase cost. Therefore, changing the wall thickness tolerance varies both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimize the wall thickness tolerance considering both the safety and the economy of corroding pipelines. The corrosion burst limit-state function in Annex O of CSA Z662-7 is employed to evaluate the failure probability using the Monte Carlo simulation technique. By changing the allowable wall thickness tolerance, the parameters of the wall thickness distribution in the limit-state function are changed. Using the reliability approach, the corresponding variations in the burst failure probability are shown. On the other hand, changing the wall thickness tolerance leads to a change in the pipe purchase cost. Using the variations of the failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define pipe purchase specifications.
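
An illustrative Monte Carlo loop over tolerance values (a simple Barlow-type burst model with invented numbers stands in for the CSA Z662 Annex O limit-state function):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500_000

def burst_failure_probability(tol_frac):
    # a tighter tolerance narrows the wall-thickness scatter; here the standard
    # deviation is taken as tolerance/3 under a normality assumption
    wt_nom, D = 0.0127, 0.914                        # wall thickness, diameter (m)
    wt = rng.normal(wt_nom, tol_frac * wt_nom / 3.0, n)
    smys = rng.normal(485e6, 20e6, n)                # yield strength, Pa
    d_frac = np.clip(rng.normal(0.35, 0.10, n), 0.0, 0.8)  # relative defect depth
    p_burst = (2.0 * smys * wt / D) * (1.0 - d_frac) # burst pressure, reduced by defect
    return np.mean(p_burst < 7.0e6)                  # operating pressure 7 MPa

for tol in (0.05, 0.10, 0.125):
    print(f"tolerance ±{tol:.1%}: burst failure probability ≈ "
          f"{burst_failure_probability(tol):.2e}")
```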

Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach

Procedia PDF Downloads 361
1376 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

The principal purpose of this paper is to find the influence of the maximum fatigue load on the probabilistic aspect of the fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments are carried out in laboratory air under different maximum fatigue loads to obtain the fatigue crack propagation data for the statistical analysis. In order to analyze the probabilistic aspect of the fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack is implemented through the Anderson-Darling test. The best-fitting probability distribution of the fatigue crack propagation life is also verified under the different maximum fatigue load conditions.
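
A small example of the Anderson-Darling step (synthetic stand-in data; scipy's `anderson` covers a fixed set of reference distributions, so a lognormal hypothesis is tested via the log-transform):

```python
import numpy as np
from scipy.stats import anderson

rng = np.random.default_rng(9)
# stand-in for measured propagation lives (cycles) at a specified crack length
lives = rng.lognormal(np.log(2.0e5), 0.15, 30)

# testing log(life) against the normal law is equivalent to testing a lognormal fit
result = anderson(np.log(lives), dist='norm')
print(f"A^2 = {result.statistic:.3f}")
for level, crit in zip(result.significance_level, result.critical_values):
    verdict = "reject" if result.statistic > crit else "do not reject"
    print(f"  at {level:4.1f}%: critical value {crit:.3f} -> {verdict}")
```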

Keywords: fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability

Procedia PDF Downloads 358
1375 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computations and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified with comparisons to other readily available approximations and to oceanic data.

Keywords: ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness

Procedia PDF Downloads 407
1374 Prioritized Processor-Sharing with a Maximum Permissible Sojourn Time

Authors: Yoshiaki Shikata

Abstract:

A prioritized processor-sharing (PS) system with a maximum permissible sojourn time (MPST) is proposed. In this PS system, a higher-priority request is allocated a larger service ratio than a lower-priority request. Moreover, each request receiving service is guaranteed the maximum permissible sojourn time determined by its priority class, regardless of its service time. Arriving requests that cannot receive service due to this guarantee are rejected. We further propose a guarantee method for implementing such a system and discuss performance evaluation procedures for the resulting system. Practical performance measures, such as the relationships between the loss probability or mean sojourn time of each request class and the maximum permissible sojourn time, are evaluated via simulation. At the arrival of each request, its acceptance or rejection is judged using the extended sojourn times of all requests receiving service in the server. As the MPST increases, the mean sojourn time increases almost linearly, whereas the logarithm of the loss probability decreases almost linearly. Moreover, with an MPST, the difference in the mean sojourn time for different MPSTs increases with the traffic rate, while the difference in the loss probability for different MPSTs decreases as the traffic rate increases.

Keywords: prioritized processor sharing, priority ratio, permissible sojourn time, loss probability, mean sojourn time, simulation

Procedia PDF Downloads 164
1373 Accelerated Structural Reliability Analysis under Earthquake-Induced Tsunamis by Advanced Stochastic Simulation

Authors: Sai Hung Cheung, Zhe Shao

Abstract:

The recent earthquake-induced tsunamis in Padang, 2004 and Tohoku, 2011 brought huge losses of lives and properties. Maintaining vertical evacuation systems is the most crucial strategy to effectively reduce casualties during a tsunami event. Thus, it is of great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis. Despite continuous advancement in the computational simulation of tsunamis and wave-structure interaction modeling, it remains computationally challenging to evaluate the reliability (or its complement, the failure probability) of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. Failure of the structure in a tsunami-wave-structure system is defined as any response quantity of the system exceeding specified thresholds while the structure is subjected to dynamic wave impact due to earthquake-induced tsunamis. In this paper, an approach based on a novel integration of the Subset Simulation algorithm and a recently proposed moving least squares response surface approach for stochastic sampling is proposed. The effectiveness of the proposed approach is discussed by comparing its results with those obtained from the Subset Simulation algorithm without the response surface approach.
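
A minimal Subset Simulation sketch on a toy limit state (the tsunami-wave-structure model and the moving least squares response surface are not reproduced; this shows only the core algorithm):

```python
import numpy as np

rng = np.random.default_rng(13)

def g(x):
    # stand-in response function; "failure" when g exceeds the threshold
    return x.sum(axis=-1)

def subset_simulation(dim=10, threshold=12.0, n=2000, p0=0.1):
    # minimal Subset Simulation in standard-normal space with a
    # random-walk Metropolis step
    x = rng.standard_normal((n, dim))
    y = g(x)
    p_f = 1.0
    for _ in range(50):                      # safety cap on the number of levels
        b = np.quantile(y, 1.0 - p0)         # intermediate threshold
        if b >= threshold:
            return p_f * np.mean(y > threshold)
        p_f *= p0
        seeds = x[y > b]
        samples = []
        per_seed = int(np.ceil(n / len(seeds)))
        for s in seeds:
            cur = s.copy()
            for _ in range(per_seed):
                cand = cur + 0.8 * rng.standard_normal(dim)
                # Metropolis accept for the standard-normal target, keeping
                # only moves that stay in the conditional region {g > b}
                if rng.random() < np.exp(0.5 * (cur @ cur - cand @ cand)) and g(cand) > b:
                    cur = cand
                samples.append(cur.copy())
        x = np.array(samples[:n])
        y = g(x)
    return p_f * np.mean(y > threshold)

# exact value: P(sum of 10 N(0,1) > 12) = 1 - Phi(12/sqrt(10)) ≈ 7.4e-5
print(f"Subset Simulation estimate: {subset_simulation():.2e}")
```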

Keywords: response surface model, subset simulation, structural reliability, Tsunami risk

Procedia PDF Downloads 343
1372 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) approach provides the rainfall amount of a given duration necessary to cause flooding. It is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on checking whether an oncoming flood will exceed critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods as, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over any rainfall threshold, is used in computing the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed using the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
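
The update itself is a one-liner; a sketch with made-up probabilities (none of these values come from the Posina study, and conditional independence of the two pieces of evidence is assumed):

```python
# prior P(flood) from historical frequency (placeholder value)
p_flood = 0.08

# likelihoods of the observed conditions (wet AMC, rainfall above threshold),
# assuming the two pieces of evidence are conditionally independent
p_obs_flood = 0.70 * 0.85       # P(wet AMC | flood) * P(rain > thr | flood)
p_obs_no_flood = 0.25 * 0.10    # same factors given no flood

# Bayes' rule: posterior P(flood | observations)
posterior = (p_obs_flood * p_flood) / (
    p_obs_flood * p_flood + p_obs_no_flood * (1.0 - p_flood))
print(f"P(flood | wet AMC, rain > threshold) = {posterior:.2f}")
```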

Keywords: flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina

Procedia PDF Downloads 105
1371 Determining the Effects of Wind-Aided Midge Movement on the Probability of Coexistence of Multiple Bluetongue Virus Serotypes in Patchy Environments

Authors: Francis Mugabi, Kevin Duffy, Joseph J. Y. T Mugisha, Obiora Collins

Abstract:

Bluetongue virus (BTV) has 27 serotypes, some of which coexist in patchy (different) environments, which makes its control difficult. Wind-aided midge movement is a known mechanism in the spread of BTV. However, its effects on the probability of coexistence of multiple BTV serotypes are not clear. Deterministic and stochastic models for r BTV serotypes in n discrete patches connected by midge and/or cattle movement are formulated and analyzed. For the deterministic model without midge and cattle movement, it is shown using the comparison principle that if the patch reproduction numbers R^j_i0 < 1 for all i=1,2,...,n and j=1,2,...,r, all serotypes go extinct, while if R^j_i0 > 1, competitive exclusion takes place. Using numerical simulations, it is shown that when the n patches are connected by midge movement, coexistence takes place. To account for demographic and movement variability, the deterministic model is transformed into a continuous-time Markov chain stochastic model. Utilizing a multitype branching process, it is shown that midge movement can have a large effect on the probability of coexistence of multiple BTV serotypes. The probability of coexistence can be brought to zero when control interventions that directly kill adult midges are applied. These results indicate the significance of wind-aided midge movement and vector control interventions for the coexistence and control of multiple BTV serotypes in patchy environments.
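
As a crude single-type illustration of the branching-process step (Poisson offspring and independence between serotypes are simplifying assumptions; the paper uses a multitype process):

```python
import numpy as np

def extinction_probability(R, tol=1e-12):
    # extinction probability of a branching process: smallest fixed point of
    # the offspring pgf; for Poisson offspring with mean R, f(s) = exp(R(s-1))
    q = 0.0
    while True:
        q_new = np.exp(R * (q - 1.0))
        if abs(q_new - q) < tol:
            return q_new
        q = q_new

# if two serotypes spread (nearly) independently, both persist, i.e. coexist,
# with probability (1 - q1)(1 - q2); killing adult midges lowers R toward 1,
# driving both factors (and hence the coexistence probability) toward zero
q1 = extinction_probability(1.6)
q2 = extinction_probability(1.3)
print(f"q1 = {q1:.3f}, q2 = {q2:.3f}, P(coexistence) ≈ {(1 - q1) * (1 - q2):.3f}")
```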

Keywords: bluetongue virus, coexistence, multiple serotypes, midge movement, branching process

Procedia PDF Downloads 118
1370 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution’s PD by means of credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample in order to choose the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. The values of the particular indicators are therefore sampled randomly and the distribution of the PD is estimated, under the assumption that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all the banks are relatively healthy, there is still a high chance that “a financial crisis” will occur, at least in terms of probability. This is indicated by the estimation of various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
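
A minimal logit-based PD sketch on synthetic data (the indicator names, coefficients, and sample size are invented; the paper additionally considers discriminant analysis and probit models):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# synthetic stand-in for bank indicators (e.g., capital ratio, ROA, NPL ratio)
n = 300
X = rng.normal(size=(n, 3))
beta_true = np.array([-1.2, -0.8, 1.5])
p_true = 1.0 / (1.0 + np.exp(-(X @ beta_true - 2.0)))
y = (rng.random(n) < p_true).astype(int)          # 1 = default

model = LogisticRegression().fit(X, y)
pd_hat = model.predict_proba(X)[:, 1]             # estimated PD per bank
print(f"mean estimated PD = {pd_hat.mean():.3f}")
```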

Keywords: credit-scoring models, multidimensional subordinated Lévy model, probability of default

Procedia PDF Downloads 423
1369 Moderating Effect of Owner's Influence on the Relationship between the Probability of Client Failure and Going Concern Opinion Issuance

Authors: Mohammad Noor Hisham Osman, Ahmed Razman Abdul Latiff, Zaidi Mat Daud, Zulkarnain Muhamad Sori

Abstract:

The fact that Malaysian auditors do not issue going concern opinions (GC opinions) to seriously financially distressed companies is still a pressing issue. Policy makers, particularly the Financial Statement Review Committee (FSRC) of the Malaysian Institute of Accountants, have raised this issue as early as 2009. Similar problems have occurred in the US, the UK, and many developing countries. It is important for auditors to issue GC opinions properly because such opinions are a signal about the viability of a company that is much needed by stakeholders. There are at least two unanswered questions, or research gaps, in the literature on the determinants of GC opinions. Firstly, is a client’s probability of failure associated with GC opinion issuance? Secondly, to what extent do influential owners (management, family, and institutions) moderate the association between client probability of failure and GC opinion issuance? The objective of this study is, therefore, twofold: (1) to examine the extent of the relationship between the probability of client failure and the issuance of GC opinions, and (2) to examine the extent to which the levels of management, family, and institutional ownership moderate this association. This study is quantitative in nature, and the sources of data are secondary (mainly companies’ annual reports). A total of four hypotheses have been developed and tested on data accumulated from the annual reports of seriously financially distressed Malaysian public listed companies. Data from 2006 to 2012, comprising a sample of 644 observations, have been analyzed using panel logistic regression. It is found that certainty (rather than probability) of client failure affects the issuance of GC opinions. In addition, it is found that only the level of family ownership positively moderates the relationship between client probability of failure and GC opinion issuance. This study contributes to the auditing literature, as its findings can enhance our understanding of audit quality, particularly the variables that are associated with the issuance of GC opinions. The findings shed light on the role of family owners in the GC opinion issuance process and open the way to suggest measures for tackling auditors’ reluctance to issue GC opinions to financially distressed clients. Such measures can be useful to policy makers in formulating future promulgations.

Keywords: audit quality, auditing, auditor characteristics, going concern opinion, Malaysia

Procedia PDF Downloads 228
1368 Cognitive Relaying in Interference Limited Spectrum Sharing Environment: Outage Probability and Outage Capacity

Authors: Md Fazlul Kader, Soo Young Shin

Abstract:

In this paper, we consider a cognitive relay network (CRN) in which the primary receiver (PR) is protected by a peak transmit power $\bar{P}_{ST}$ and/or a peak interference power Q constraint. In addition, the interference effect from the primary transmitter (PT) is considered to show its impact on the performance of the CRN. We investigate the outage probability (OP) and outage capacity (OC) of the CRN by deriving closed-form expressions over the Rayleigh fading channel. Results show that both the OP and OC improve as the number of cooperative relay nodes increases, as well as when the PT is far away from the secondary receiver (SR).
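
A Monte Carlo cross-check of the single-link behaviour (the relaying step is omitted and all powers are notional placeholders, so this is a simplification of the paper's closed-form setting):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# exponential channel power gains correspond to Rayleigh fading envelopes
P_st, Q = 1.0, 0.5               # peak transmit power and peak interference limit
P_pt, N0 = 1.0, 0.01             # primary transmit power and noise power
g_sp = rng.exponential(1.0, n)   # ST -> PR gain (limits the secondary power)
g_ss = rng.exponential(1.0, n)   # ST -> SR gain (desired link)
g_ps = rng.exponential(0.3, n)   # PT -> SR gain (interference from the primary)

P_tx = np.minimum(P_st, Q / g_sp)             # obey both power constraints
sinr = P_tx * g_ss / (P_pt * g_ps + N0)       # PT interference included
rate = np.log2(1.0 + sinr)

r_th = 1.0                                    # target rate, bit/s/Hz
print(f"outage probability ≈ {np.mean(rate < r_th):.4f}")
```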

Keywords: cognitive relay, outage, interference limited, decode-and-forward (DF)

Procedia PDF Downloads 481
1367 Development of Visual Working Memory Precision: A Cross-Sectional Study of Simultaneously Delayed Responses Paradigm

Authors: Yao Fu, Xingli Zhang, Jiannong Shi

Abstract:

Visual working memory (VWM) capacity is the ability to maintain and manipulate short-term information that is not currently available. It is well known for forming the basis of numerous cognitive abilities and for its limitation in holding information. VWM span, the most popular measurable indicator, is found to reach the adult level (3-4 items) around 12-13 years of age, while less is known about the development of the precision of VWM capacity. Using the simultaneously delayed responses paradigm, the present study investigates the development of VWM precision among 6-18-year-old children and young adults, as well as its possible relationships with fluid intelligence and span. Results showed that precision and span both increased with age, and precision reached its maximum in the 16-17 age range. Moreover, when remembering 3 simultaneously presented items, the probability of remembering the target item correlated with fluid intelligence, and the probability of wrap errors (misbinding target and non-target items) correlated with age. When remembering more items, children performed worse than adults due to their wrap errors. Compared to span, VWM precision was an effective predictor of intelligence even after controlling for age. These results suggest that, unlike VWM span, precision develops in a slower yet longer fashion. Moreover, the decreasing probability of wrap errors might be the main reason for the development of precision. Lastly, precision correlated more closely with intelligence than span did in childhood and adolescence, which might be driven by the probability of remembering the target item.

Keywords: fluid intelligence, precision, visual working memory, wrap errors

Procedia PDF Downloads 245