Search results for: default probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1377

1347 Saliency Detection Using a Background Probability Model

Authors: Junling Li, Fang Meng, Yichun Zhang

Abstract:

Image saliency detection has long been studied, but several challenging problems remain unsolved, such as inaccurate saliency detection in complex scenes or the suppression of salient objects near image borders. In this paper, we propose a new saliency detection algorithm to solve these problems. We represent the image as a graph with superpixels as nodes. By considering the appearance similarity between the boundary and the background, the proposed method chooses non-salient boundary nodes as background priors to construct a background probability model. The probability that each node belongs to the model is computed, which measures its similarity with the background. Saliency is then calculated using the transformed probability as a metric. We compare our algorithm with ten state-of-the-art saliency detection methods on a public database. Experimental results show that our simple and effective approach can tackle those challenging problems that have long hindered image saliency detection.

Keywords: visual saliency, background probability, boundary knowledge, background priors

Procedia PDF Downloads 430
1346 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement: A Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be (among other things) the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus the whole pavement. Two measures can be used for reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; among the simpler are moment methods and simulation techniques. Two methods, the FOSM method and the Simple Random Sampling method, are verified and compared. The influence of information about the probability distribution and the statistical parameters of the input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of the concrete slabs of an older type of rigid pavement formerly used in the Czech Republic.
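As a rough illustration of the two calculation methods compared in this abstract, the sketch below evaluates a linear limit state g = R − S by both the FOSM approximation and Simple Random Sampling; the strength and stress statistics are assumed for illustration, not taken from the case study.

```python
import math
import random

# Illustrative (assumed) statistics for a limit state g = R - S:
mu_R, sd_R = 4.0, 0.5   # tensile strength (MPa), assumed
mu_S, sd_S = 2.5, 0.6   # tensile stress   (MPa), assumed

# FOSM: reliability index and failure probability for independent normals
beta = (mu_R - mu_S) / math.sqrt(sd_R ** 2 + sd_S ** 2)
phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
pf_fosm = phi(-beta)

# Simple Random Sampling: estimate P(g < 0) by direct Monte Carlo
random.seed(0)
n = 200_000
fails = sum(random.gauss(mu_R, sd_R) - random.gauss(mu_S, sd_S) < 0
            for _ in range(n))
pf_mc = fails / n
print(beta, pf_fosm, pf_mc)
```

For a linear limit state with normal inputs the two estimates agree closely; the comparison in the paper concerns how they diverge for non-normal inputs and nonlinear limit state functions.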

Keywords: failure, pavement, probability, reliability index, simulation, tensile crack

Procedia PDF Downloads 547
1345 Optimal Diversification and Bank Value Maximization

Authors: Chien-Chih Lin

Abstract:

This study argues that the optimal diversifications for the maximization of bank value are asymmetrical; they depend on the business cycle. During times of expansion, systematic risks are relatively low, and hence there is only a slight effect from raising them with a diversified portfolio. Consequently, the benefit of reducing individual risks dominates any loss from raising systematic risks, leading to a higher value for a bank by holding a diversified portfolio of assets. On the contrary, in times of recession, systematic risks are relatively high. It is more likely that the loss from raising systematic risks surpasses the benefit of reducing individual risks from portfolio diversification. Consequently, more diversification leads to lower bank values. Finally, some empirical evidence from the banks in Taiwan is provided.

Keywords: diversification, default probability, systemic risk, banking, business cycle

Procedia PDF Downloads 437
1344 Italian Central Guarantee Fund: An Analysis of the Guaranteed SMEs’ Default Risk

Authors: M. C. Arcuri, L. Gai, F. Ielasi

Abstract:

The Italian Central Guarantee Fund (CGF) aims to facilitate access to credit for Small and Medium-sized Enterprises (SMEs). The aim of the paper is to study the evaluation method adopted by the CGF for SMEs requiring its intervention. This is all the more important in the light of the recent CGF reform. We analyse an initial sample of more than 500,000 guarantees from 2012 to 2018. We distinguish between counter-guarantees delivered to mutual guarantee institutions and guarantees directly delivered to banks. We investigate the impact of variables related to the operations and the SMEs on the Altman Z''-score and on the score consistent with the CGF methodology. We verify that the type of intervention affects the scores and that the initial condition changes under the new assessment criteria.

Keywords: banks, default risk, Italian guarantee fund, mutual guarantee institutions

Procedia PDF Downloads 176
1343 Pairwise Relative Primality of Integers and Independent Sets of Graphs

Authors: Jerry Hu

Abstract:

Let G = (V, E) with V = {1, 2, ..., k} be a graph. The k positive integers a₁, a₂, ..., aₖ are G-wise relatively prime if gcd(aᵢ, aⱼ) = 1 for every {i, j} ∈ E. We use an inductive approach to give an asymptotic formula for the number of k-tuples of integers that are G-wise relatively prime. An exact formula is obtained for the probability that k positive integers are G-wise relatively prime. As a corollary, we also provide an exact formula for the probability that k positive integers have exactly r relatively prime pairs.
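A quick Monte Carlo sanity check of the definition above (not the paper's inductive derivation): for a graph with a single edge, the G-wise coprimality probability should approach the classical 6/π².

```python
import math
import random

def gwise_coprime_prob(edges, k, n_max=10**6, trials=200_000, seed=1):
    """Estimate P(gcd(a_i, a_j) = 1 for every edge {i, j} of G)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = [rng.randint(1, n_max) for _ in range(k)]
        if all(math.gcd(a[i - 1], a[j - 1]) == 1 for i, j in edges):
            hits += 1
    return hits / trials

# A graph with one edge recovers the classical result P = 6 / pi^2 ≈ 0.608
p = gwise_coprime_prob([(1, 2)], k=2)
print(p)
```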

Keywords: graph, independent set, G-wise relatively prime, probability

Procedia PDF Downloads 93
1342 A Multidimensional Genetic Algorithm Applicable for Our VRP Variant Dealing with the Problems of Infrastructure Defaults SVRDP-CMTW: “Safety Vehicle Routing Diagnosis Problem with Control and Modified Time Windows”

Authors: Ben Mansour Mouin, Elloumi Abdelkarim

Abstract:

We discuss the problem of routing a fleet of different vehicles from a central depot to different types of infrastructure defaults, with dynamic maintenance requests, modified time windows, and control of the maintained defaults. For this purpose, we propose a modified metaheuristic to solve our mathematical model. SVRDP-CMTW is a VRP variant that produces an optimal vehicle plan facilitating the maintenance of different types of infrastructure defaults. Each default is monitored after maintenance, based on its priority, the degree of danger associated with it, and the neighborhood of the black-spots. We present, in this paper, a multidimensional genetic algorithm (MGA) by detailing its characteristics, proposed mechanisms, and role in our work. The coding of this algorithm represents the parameters that characterize each infrastructure default, with the objective of minimizing a combination of cost, distance, and maintenance times while satisfying the priority levels of the most urgent defaults. The developed algorithm allows the dynamic integration of newly detected defaults at execution time. The result is displayed in our interactive system at routing time. This multidimensional genetic algorithm replaces N separate genetic algorithms for P different types of infrastructure-default problems: instead of one algorithm per problem, a single multidimensional algorithm solves all of them simultaneously.

Keywords: mathematical model, VRP, multidimensional genetic algorithm, metaheuristics

Procedia PDF Downloads 196
1341 Personality Traits, Probability of Marital Infidelity and Risk of Divorce

Authors: Bahareh Zare

Abstract:

The theory of the investment model of dating infidelity maintains that loyalty is an essential power within romantic relationships. Loyalty signifies both motivation and psychological attachment to maintain a relationship. This study examined the relationship between the Big Five Personality Factors (Extraversion, Neuroticism, Openness, Conscientiousness, and Agreeableness), probability of marital infidelity, and risk of divorce. The participants completed NEO-FFI, INFQ (infidelity questionnaire) and were interviewed by OHI (Oral History Interview). The results demonstrated that extraversion and agreeableness traits were significant predictors for the probability of infidelity and risk of divorce. In addition, conscientiousness predicted the probability of infidelity, while neuroticism predicted the risk of divorce.

Keywords: five factors personality, infidelity, risk of divorce, investment theory

Procedia PDF Downloads 94
1340 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

Fatigue crack growth is stochastic because fatigue behavior exhibits uncertainty and randomness. Therefore, it is necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life for structural maintenance as well as reliability estimation. The essential purpose of this study is to present the best-fitting probability distribution for the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments are carried out in laboratory air under three specimen thickness conditions using AZ31 to investigate the stochastic crack growth behavior. The goodness of fit of candidate probability distributions for the grown crack size under the different specimen thickness conditions is assessed with the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
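The Anderson-Darling step can be sketched with the standard A² statistic; the sample below is synthetic (a lognormal stand-in for crack sizes), not the AZ31 data, and the normality check is applied to log(crack size).

```python
import math
import random

def anderson_darling_normal(xs):
    """A^2 statistic for H0: xs ~ Normal (mean and sd estimated from the data)."""
    n = len(xs)
    m = sum(xs) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    z = sorted((x - m) / sd for x in xs)
    cdf = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    # A^2 = -n - (1/n) * sum over i of (2i-1)[ln F(z_(i)) + ln(1 - F(z_(n+1-i)))]
    s = sum((2 * i + 1) * (math.log(cdf(z[i])) + math.log(1.0 - cdf(z[n - 1 - i])))
            for i in range(n))
    return -n - s / n

random.seed(7)
log_sizes = [random.gauss(0.5, 0.1) for _ in range(300)]  # log crack size (synthetic)
a2 = anderson_darling_normal(log_sizes)
print(a2)  # small values suggest the lognormal fit for the crack size is plausible
```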

Keywords: crack size, fatigue crack propagation, magnesium alloys, probability distribution, specimen thickness

Procedia PDF Downloads 499
1339 Quantification of Methane Emissions from Solid Waste in Oman Using IPCC Default Methodology

Authors: Wajeeha A. Qazi, Mohammed-Hasham Azam, Umais A. Mehmood, Ghithaa A. Al-Mufragi, Noor-Alhuda Alrawahi, Mohammed F. M. Abushammala

Abstract:

Municipal Solid Waste (MSW) disposed of in landfill sites decomposes under anaerobic conditions and produces gases which mainly contain carbon dioxide (CO₂) and methane (CH₄). Methane has a global warming potential 25 times that of CO₂ and can potentially affect human life and the environment. Thus, this research aims to determine MSW generation and the annual CH₄ emissions from the generated waste in Oman over the years 1971-2030. The estimation of total waste generation was performed using existing models, while the CH₄ emissions were estimated using the Intergovernmental Panel on Climate Change (IPCC) default method. It is found that total MSW generation in Oman may reach 3,089 Gg in the year 2030, producing approximately 85 Gg of CH₄ emissions in that year.
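The IPCC Tier 1 default method reduces to a single mass-balance formula; the sketch below uses generic default factor values (MCF, DOC, and so on) chosen for illustration, not the Oman-specific inputs, so its output will not reproduce the 85 Gg figure.

```python
# Hedged sketch of the IPCC (1996) Tier 1 default method for landfill CH4.
# All factor values below are generic illustrative defaults, not the
# country-specific inputs used in the study.
def ch4_emissions_gg(msw_t, msw_f=1.0, mcf=0.6, doc=0.15,
                     doc_f=0.77, f=0.5, r=0.0, ox=0.0):
    """CH4 (Gg/yr) from MSW disposed (Gg/yr) via the IPCC default method.

    CH4 = (MSW_T * MSW_F * MCF * DOC * DOC_F * F * 16/12 - R) * (1 - OX)
    where MCF = methane correction factor, DOC = degradable organic carbon,
    F = fraction of CH4 in landfill gas, R = recovered CH4, OX = oxidation.
    """
    return (msw_t * msw_f * mcf * doc * doc_f * f * (16.0 / 12.0) - r) * (1.0 - ox)

# Example: 3,089 Gg of disposed MSW with the illustrative factors above
print(ch4_emissions_gg(3089))
```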

Keywords: methane, emissions, landfills, solid waste

Procedia PDF Downloads 510
1338 Daily Probability Model of Storm Events in Peninsular Malaysia

Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain

Abstract:

Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms (short, intermediate, long and very long) are introduced based on the length of storm duration. Daily probability models of storms are built for these four categories of storms in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and by applying linear regression on the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the Eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early in the following year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia experience a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.
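The Bernoulli-plus-first-Fourier-harmonic construction can be sketched as follows; the daily storm indicators are synthetic, and the closed-form coefficient formulas rely on the orthogonality of sine and cosine over a full cycle of equally spaced days (an illustrative shortcut, not necessarily the paper's exact estimation procedure).

```python
import math
import random

T = 365
# Assumed "true" daily storm probability with an annual cycle
true_p = [0.3 + 0.2 * math.cos(2 * math.pi * t / T) for t in range(T)]

# Synthetic Bernoulli storm-occurrence frequencies over many years
random.seed(3)
years = 200
y = [sum(random.random() < true_p[t] for _ in range(years)) / years
     for t in range(T)]

# First-harmonic regression p(t) = a0 + a1*cos(2*pi*t/T) + b1*sin(2*pi*t/T).
# For equally spaced days over a full cycle, least squares has closed-form
# coefficients by orthogonality of the harmonic basis:
a0 = sum(y) / T
a1 = 2.0 / T * sum(y[t] * math.cos(2 * math.pi * t / T) for t in range(T))
b1 = 2.0 / T * sum(y[t] * math.sin(2 * math.pi * t / T) for t in range(T))
print(a0, a1, b1)  # should recover roughly (0.3, 0.2, 0.0)
```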

Keywords: daily probability model, monsoon seasons, regions, storm events

Procedia PDF Downloads 345
1337 Can the Intervention of SCAMPER Bring about Changes of Neural Activation While Taking Creativity Tasks?

Authors: Yu-Chu Yeh, WeiChin Hsu, Chih-Yen Chang

Abstract:

Substitution, combination, modification, putting to other uses, elimination, and rearrangement (SCAMPER) has been regarded as an effective technique that provides a structured way to help people produce creative ideas and solutions. Although some neuroscience studies regarding creativity training have been conducted, no study has focused on SCAMPER. This study therefore aimed at examining whether the learning of SCAMPER through video tutorials would result in alterations of neural activation. Thirty college students were randomly assigned to the experimental group or the control group. The experimental group was requested to watch SCAMPER videos, whereas the control group was asked to watch natural-scene videos, which were regarded as neutral stimulating materials. Each participant was brain scanned in a functional magnetic resonance imaging (fMRI) machine while undertaking a creativity test before and after watching the videos. Furthermore, a two-way ANOVA was used to analyze the interaction between groups (the experimental group; the control group) and tasks (C task; M task; X task). The results revealed that the left precuneus was significantly activated in the interaction of groups and tasks, as well as in the main effect of group. Furthermore, compared with the control group, the experimental group had greater activation in the default mode network (left precuneus and left inferior parietal cortex) and the motor network (left postcentral gyrus and left supplementary area). The findings suggest that the SCAMPER training may facilitate creativity through the stimulation of the default mode network and the motor network.

Keywords: creativity, default mode network, neural activation, SCAMPER

Procedia PDF Downloads 101
1336 Computation of Radiotherapy Treatment Plans Based on CT to ED Conversion Curves

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Radiotherapy treatment planning computers use CT data of the patient. To compute a treatment plan, the treatment planning system must have information on the electron densities of the tissues scanned by CT. This information is given by the conversion curve from CT number to electron density (ED), or simply the calibration curve. Every treatment planning system (TPS) has built-in default CT to ED conversion curves for the CTs of different manufacturers. However, it is always recommended to verify the CT to ED conversion curve before actual clinical use. The objective of this study was to check how well the default curve provided matches the curve actually measured on a specific CT, and how much this influences the calculation of the treatment planning computer. The examined CT scanners were from the same manufacturer: four different scanners from three generations. The measurements of all calibration curves were done with the dedicated phantom CIRS 062M Electron Density Phantom. The phantom was scanned and, according to the real HU values read at the CT console computer, CT to ED conversion curves were generated for different materials at the same tube voltage of 140 kV. Another phantom, the CIRS Thorax 002 LFC, which represents an average human torso in proportion, density and two-dimensional structure, was used for verification. Treatment planning was done on CT slices of the scanned CIRS Thorax 002 LFC phantom for selected cases. Interest points were set in the lungs and in the spinal cord, and doses were recorded in the TPS. The overall calculated treatment times for the four scanners and the default scanner did not differ by more than 0.8%. The overall interest point dose in bone differed by at most 0.6%, while for single fields the maximum difference was 2.7% (lateral field). The overall interest point dose in the lungs differed by at most 1.1%, while for single fields the maximum difference was 2.6% (lateral field).
It is known that the user should verify the CT to ED conversion curve, but developing countries often face a lack of QA equipment and frequently use the default data provided. We have concluded that the CT to ED curves obtained differ at certain points of the curve, generally in the region of higher densities. The resulting influence on the treatment planning result is not significant, but it definitely does make a difference in the calculated dose.

Keywords: computation of treatment plan, conversion curve, radiotherapy, electron density

Procedia PDF Downloads 489
1335 On Coverage Probability of Confidence Intervals for the Normal Mean with Known Coefficient of Variation

Authors: Suparat Niwitpong, Sa-aat Niwitpong

Abstract:

Statistical inference for the normal mean with known coefficient of variation has been investigated recently. This situation commonly occurs in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. In this paper, we construct new confidence intervals for the normal population mean with known coefficient of variation. We also derive analytic expressions for the coverage probability of each confidence interval. To confirm our theoretical results, Monte Carlo simulation is used to assess the performance of these intervals based on their coverage probabilities.
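A minimal sketch of the Monte Carlo coverage check described above: it assesses a simple plug-in interval xbar ± z·(τ·xbar)/√n that exploits the known coefficient of variation τ. This interval is an illustrative stand-in, not one of the paper's newly proposed intervals.

```python
import math
import random

def coverage(mu=10.0, tau=0.1, n=30, reps=20_000, z=1.96, seed=11):
    """Monte Carlo coverage probability of xbar +/- z * tau * xbar / sqrt(n)."""
    rng = random.Random(seed)
    sigma = tau * mu          # known coefficient of variation fixes sigma given mu
    hits = 0
    for _ in range(reps):
        xbar = sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        half = z * tau * xbar / math.sqrt(n)   # plug xbar in for the unknown mu
        hits += (xbar - half <= mu <= xbar + half)
    return hits / reps

print(coverage())  # close to, but not exactly, the nominal 0.95
```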

Keywords: confidence interval, coverage probability, expected length, known coefficient of variation

Procedia PDF Downloads 396
1334 Effect of Load Ratio on Probability Distribution of Fatigue Crack Propagation Life in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

It is necessary to predict the fatigue crack propagation life for the estimation of structural integrity. Because of the uncertainty and randomness of structural behavior, it is also required to analyze the stochastic characteristics of the fatigue crack propagation life at a specified fatigue crack size. The essential purpose of this study is to present the best-fitting probability distribution for the fatigue crack propagation life at a specified fatigue crack size in magnesium alloys under various fatigue load ratio conditions. To investigate the stochastic crack growth behavior, fatigue crack propagation experiments are performed in laboratory air under several fatigue load ratio conditions using AZ31. A goodness-of-fit test for the probability distribution of the fatigue crack propagation life is performed with the Anderson-Darling test, and the best-fitting probability distribution is presented. The effect of load ratio on the variability of the fatigue crack propagation life is also investigated.

Keywords: fatigue crack propagation life, load ratio, magnesium alloys, probability distribution

Procedia PDF Downloads 650
1333 Stochastic Repair and Replacement with a Single Repair Channel

Authors: Mohammed A. Hajeeh

Abstract:

This paper examines the behavior of a system which, upon failure, is either replaced with a certain probability p or imperfectly repaired with probability q. The system is analyzed using Kolmogorov's forward equations; the analytical expression for the steady-state availability is derived as an indicator of the system's performance. It is found that the analysis becomes more complex as the number of imperfect repairs increases. It is also observed that the availability increases as the number of states and the replacement probability increase. Using such an approach in more complex configurations and in dynamic systems is cumbersome; therefore, it is advisable to resort to simulation or heuristics. In this paper, an example is provided for demonstration.
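As a minimal sketch of the balance-equation approach described above (an illustrative three-state model, not the paper's exact configuration), consider a unit that, upon failure, is replaced as new with probability p or imperfectly repaired with probability q = 1 − p; all rates are assumed.

```python
# States: 0 = up (as new), 1 = down, 2 = up (imperfectly repaired).
# Illustrative rates: lam0/lam2 = failure rates (as-new / repaired),
# mu = rate of completing the replacement-or-repair action.
lam0, lam2, mu = 0.1, 0.2, 1.0
p = 0.6
q = 1.0 - p

# Steady-state balance (Kolmogorov forward equations with derivatives = 0):
#   pi0 * lam0 = pi1 * mu * p      (flow out of 0 = flow into 0)
#   pi2 * lam2 = pi1 * mu * q      (flow out of 2 = flow into 2)
#   pi0 + pi1 + pi2 = 1            (normalization)
pi1 = 1.0 / (1.0 + mu * p / lam0 + mu * q / lam2)
pi0 = pi1 * mu * p / lam0
pi2 = pi1 * mu * q / lam2
availability = pi0 + pi2           # probability the system is up
print(availability)
```

With these rates the availability works out to 8/9; raising the replacement probability p shifts mass toward the as-new state and raises the availability, matching the qualitative finding quoted above.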

Keywords: repairable models, imperfect, availability, exponential distribution

Procedia PDF Downloads 288
1332 Informality, Trade Facilitation, and Trade: Evidence from Guinea-Bissau

Authors: Julio Vicente Cateia

Abstract:

This paper aims to assess the role of informality and trade facilitation in the export probability of Guinea-Bissau. We include informality in the Fréchet function, which gives the expression for the country's supply probability. We find that Guinea-Bissau is about 7.2% less likely to export due to a 1% increase in informality. The export probability increases by about 1.7%, 4%, and 1.1% due to a 1% increase in trade facilitation, R&D stock, and years of education, respectively. These results are significant at the usual levels. We suggest a development agenda aimed at reducing the level of informality in this country.

Keywords: development, trade, informality, trade facilitation, economy of Guinea-Bissau

Procedia PDF Downloads 174
1331 Conservativeness of Probabilistic Constrained Optimal Control Method for Unknown Probability Distribution

Authors: Tomoaki Hashimoto

Abstract:

In recent decades, probabilistic constrained optimal control problems have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in an optimization problem, several tractable methods have been proposed to handle them. In most methods, probabilistic constraints are reduced to deterministic constraints that are tractable in an optimization problem. However, there is a gap between the transformed deterministic constraints in the cases of known and unknown probability distributions. This paper examines the conservativeness of a probabilistic constrained optimization method under an unknown probability distribution. The objective of this paper is to provide a quantitative assessment of the conservatism of tractable constraints in probabilistic constrained optimization with an unknown probability distribution.

Keywords: optimal control, stochastic systems, discrete time systems, probabilistic constraints

Procedia PDF Downloads 581
1330 Reliability-Based Method for Assessing Liquefaction Potential of Soils

Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty

Abstract:

This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor in order to determine whether liquefaction will occur or not. However, these methods are unable to determine the liquefaction probability related to a safety factor. A solution to this problem can be found by reliability analysis. This paper presents a reliability analysis method based on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not by itself guarantee safety or liquefaction; to quantify the likelihood of liquefaction, a reliability-based analysis should be used. This reliability method uses the empirical acceleration attenuation law in the Chalos area to derive the probability density distribution function and the statistics for the earthquake-induced cyclic shear stress ratio (CSR). The CSR and CRR statistics are used together with the first-order second-moment method to calculate the relation between the liquefaction probability, the safety factor, and the reliability index. Based on the proposed method, the liquefaction probability related to a safety factor can be easily calculated, and the influence of some of the soil parameters on the liquefaction probability can be quantitatively evaluated.
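A compact sketch of the safety-factor-to-probability link described above, assuming independent lognormal CSR and CRR with illustrative statistics (not the calibrated Chalos-area values):

```python
import math

# Assumed (illustrative) means and coefficients of variation:
mu_csr, cov_csr = 0.18, 0.30   # earthquake-induced cyclic stress ratio
mu_crr, cov_crr = 0.25, 0.25   # cyclic resistance ratio

# For independent lognormal CRR and CSR, FS = CRR / CSR is lognormal and the
# reliability index is beta = E[ln FS] / sd[ln FS].
s_csr = math.sqrt(math.log(1 + cov_csr ** 2))   # sd of ln(CSR)
s_crr = math.sqrt(math.log(1 + cov_crr ** 2))   # sd of ln(CRR)
m_ln_fs = (math.log(mu_crr) - s_crr ** 2 / 2) - (math.log(mu_csr) - s_csr ** 2 / 2)
s_ln_fs = math.sqrt(s_crr ** 2 + s_csr ** 2)
beta = m_ln_fs / s_ln_fs

# Liquefaction probability P(FS < 1) = Phi(-beta)
p_liq = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
print(beta, p_liq)
```

Note how a mean safety factor well above 1 still leaves a non-trivial liquefaction probability once the scatter of CSR and CRR is accounted for, which is the point the abstract makes against purely deterministic safety factors.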

Keywords: liquefaction, reliability analysis, Chalos area, civil and structural engineering

Procedia PDF Downloads 470
1329 Quality of the Ruin Probabilities Approximation Using the Regenerative Processes Approach regarding to Large Claims

Authors: Safia Hocine, Djamil Aïssani

Abstract:

Risk models recently studied in the literature are becoming increasingly complex, and it is rare to find explicit analytical relations for calculating the ruin probability. Indeed, the stability issue occurs naturally in ruin theory when the parameters of the risk model cannot be estimated without uncertainty. Since in most cases there are no explicit formulas for the ruin probability, there is interest in obtaining explicit stability bounds for these probabilities in different risk models. In this paper, we are interested in the stability bounds of the univariate classical risk model established using the regenerative processes approach. Adopting an algorithmic approach, we implement this approximation and determine numerically the bounds of the ruin probability in the case of large claims (heavy-tailed distributions).
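As context for the quantity being bounded, a finite-horizon Monte Carlo estimate of the ruin probability in the classical compound Poisson model with Pareto (heavy-tailed) claims can be sketched as follows; all parameters are illustrative, and this is a brute-force check, not the paper's regenerative-process bound.

```python
import random

def ruin_prob(u=10.0, c=2.0, lam=1.0, alpha=2.5, horizon=200.0,
              trials=10_000, seed=5):
    """Finite-horizon ruin probability: initial capital u, premium rate c,
    Poisson(lam) claim arrivals, Pareto(alpha, x_m=1) claim sizes.
    Here c = 2.0 > lam * E[X] = alpha / (alpha - 1) = 5/3 (positive loading)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        t, surplus = 0.0, u
        while True:
            w = rng.expovariate(lam)        # time to the next claim
            t += w
            if t > horizon:
                break
            surplus += c * w                # premium income since last claim
            surplus -= (1.0 - rng.random()) ** (-1.0 / alpha)  # Pareto claim
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / trials

print(ruin_prob())
```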

Keywords: heavy-tailed distribution, large claims, regenerative process, risk model, ruin probability, stability

Procedia PDF Downloads 365
1328 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as the AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparison performance metrics hold as for the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics.
Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal a similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is to be able to account for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is being attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.

Keywords: model selection inference, generalized information criteria, post-model selection inference, asymptotic theory

Procedia PDF Downloads 90
1327 Evaluation of Best-Fit Probability Distribution for Prediction of Extreme Hydrologic Phenomena

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

Probability distributions are the best method for forecasting extreme hydrologic phenomena such as rainfall and flood flows. In this research, in order to determine suitable probability distributions for estimating annual extreme rainfall and flood flow (discharge) series with different return periods, precipitation records of 40 years and discharge records of 58 years were collected from the Karkheh River in Iran. After homogeneity and adequacy tests, the data were analyzed with the Stormwater Management and Design Aid (SMADA) software using the residual sum of squares (R.S.S.). The best probability distribution was Log Pearson Type III, with R.S.S. values of 145.91 and 13.67 for peak discharge and of 141.08 and 8.95 for maximum discharge at the Jelogir Majin and Pole Zal stations, respectively. The best distribution for maximum precipitation at the Jelogir Majin and Pole Zal stations was the Log Pearson Type III distribution with R.S.S. values of 1.74 and 1.90, followed by the Pearson Type III distribution with R.S.S. values of 1.53 and 1.69. Overall, the Log Pearson Type III distribution is an acceptable distribution type for representing the statistics of extreme hydrologic phenomena in the Karkheh River in Iran, with the Pearson Type III distribution as a potential alternative.

Keywords: Karkheh River, Log Pearson Type III, probability distribution, residual sum of squares

Procedia PDF Downloads 197
1326 Development of Probability Distribution Models for Degree of Bending (DoB) in Chord Member of Tubular X-Joints under Bending Loads

Authors: Hamid Ahmadi, Amirreza Ghaffari

Abstract:

The fatigue life of tubular joints in offshore structures is not only dependent on the value of the hot-spot stress, but is also significantly influenced by the through-the-thickness stress distribution characterized by the degree of bending (DoB). The DoB exhibits considerable scatter, calling for greater emphasis on the accurate determination of its governing probability distribution, which is a key input for the fatigue reliability analysis of a tubular joint. Although tubular X-joints are commonly found in offshore jacket structures, as far as the authors are aware, no comprehensive research has been carried out on the probability distribution of the DoB in tubular X-joints. What has been used so far as the probability distribution of the DoB in reliability analyses is mainly based on assumptions and limited observations, especially in terms of distribution parameters. In the present paper, results of parametric equations available for the calculation of the DoB have been used to develop probability distribution models for the DoB in the chord member of tubular X-joints subjected to four types of bending loads. Based on a parametric study, a set of samples was prepared and density histograms were generated for these samples using the Freedman-Diaconis method. Twelve different probability density functions (PDFs) were fitted to these histograms. The maximum likelihood method was utilized to determine the parameters of the fitted distributions. In each case, the Kolmogorov-Smirnov test was used to evaluate the goodness of fit. Finally, after substituting the values of the estimated parameters for each distribution, a set of fully defined PDFs has been proposed for the DoB in tubular X-joints subjected to bending loads.
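The Freedman-Diaconis binning step can be sketched in a few lines; the DoB sample below is synthetic, and the crude quartile indexing is an illustrative shortcut, not the paper's implementation.

```python
import random

def freedman_diaconis_bins(xs):
    """Histogram bin count from the Freedman-Diaconis rule h = 2 * IQR * n^(-1/3)."""
    xs = sorted(xs)
    n = len(xs)
    q1 = xs[n // 4]                 # crude lower quartile (illustrative)
    q3 = xs[(3 * n) // 4]           # crude upper quartile (illustrative)
    h = 2.0 * (q3 - q1) * n ** (-1.0 / 3.0)
    return max(1, round((xs[-1] - xs[0]) / h))

random.seed(9)
dob = [random.gauss(0.7, 0.05) for _ in range(1000)]  # synthetic DoB values
print(freedman_diaconis_bins(dob))
```

The resulting histogram would then serve as the target against which the twelve candidate PDFs are fitted by maximum likelihood and screened with the Kolmogorov-Smirnov test.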

Keywords: tubular X-joint, degree of bending (DoB), probability density function (PDF), Kolmogorov-Smirnov goodness-of-fit test

Procedia PDF Downloads 719
1325 Determining Best Fitting Distributions for Minimum Flows of Streams in Gediz Basin

Authors: Naci Büyükkaracığan

Abstract:

Today, the need for water resources is swiftly increasing due to population growth. At the same time, it is known that some regions will face water shortages and drought because of global warming and climate change. In this context, the evaluation and analysis of hydrological data, such as observed trends and the prediction of drought and floods from short-term flows, is of great importance. Selecting the most accurate probability distribution is important for describing low-flow statistics in studies related to drought analysis. As in many basins in Turkey, the Gediz River basin will be affected by drought, which will decrease the amount of usable water. The aim of this study is to derive appropriate probability distributions for the frequency analysis of annual minimum flows at 6 gauging stations in the Gediz Basin. After applying 10 different probability distributions, six different parameter estimation methods, and 3 goodness-of-fit tests, the Pearson Type III and generalized extreme value distributions were found to give optimal results.
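As a sketch of the frequency-analysis step, the two distributions the study found optimal (Pearson Type III and generalized extreme value) can be fitted to annual minima with SciPy. The flow series below is synthetic, not data from the Gediz stations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for 40 years of annual minimum flows (m^3/s) at one station
min_flows = rng.gamma(shape=3.0, scale=2.0, size=40)

fits = {}
for dist in (stats.pearson3, stats.genextreme):
    params = dist.fit(min_flows)  # maximum likelihood parameter estimates
    ks_stat, p_value = stats.kstest(min_flows, dist.name, args=params)
    # 10-year low flow: the 0.10 quantile of the fitted minimum-flow distribution
    q10 = dist.ppf(0.10, *params)
    fits[dist.name] = (ks_stat, q10)
```

Comparing the K-S statistics across candidate distributions and stations is how the best-fitting model for each gauge would be selected.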

Keywords: Gediz Basin, goodness-of-fit tests, minimum flows, probability distribution

Procedia PDF Downloads 271
1324 Nonlinear Analysis with Failure Using the Boundary Element Method

Authors: Ernesto Pineda Leon, Dante Tolentino Lopez, Janis Zapata Lopez

Abstract:

The current paper shows the application of the boundary element method to the analysis of plates under shear stress causing plasticity. In this case, the shear deformation of a plate is considered by means of Reissner’s theory. The probability of failure of a Reissner plate, based on a proposed plastic behavior index, is calculated taking into account the uncertainty in mechanical and geometrical properties. The problem is developed in two dimensions. Classical plasticity theory is applied, and a formulation for the initial stresses that lead to the boundary integral equations due to plasticity is also used. For the plasticity calculation, the von Mises criterion is used. To solve the nonlinear equations, an incremental method is employed. The results show a relatively small failure probability for loads in the range between 0.6 and 1.0. However, for values between 1.0 and 2.5, the probability of failure increases significantly. Consequently, for loads bigger than 2.5 the failure of the plate is practically a sure event. The results are compared to those found in the literature, and the agreement is good.
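The yield check at the heart of the plasticity calculation uses the von Mises criterion. A minimal plane-stress sketch (with illustrative stresses and an assumed yield strength, not values from the paper) is:

```python
import numpy as np

def von_mises_plane_stress(sx, sy, txy):
    """Equivalent von Mises stress for a plane-stress state (sx, sy, txy)."""
    return np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

def yields(sx, sy, txy, sigma_y):
    """True if the stress state violates the von Mises yield criterion."""
    return von_mises_plane_stress(sx, sy, txy) >= sigma_y

# Illustrative values (MPa); sigma_y is an assumed yield stress
print(yields(200.0, 50.0, 80.0, sigma_y=250.0))  # → False (elastic state)
```

In an incremental scheme, stress states that cross this surface trigger the initial-stress correction that feeds back into the boundary integral equations.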

Keywords: boundary element method, failure, plasticity, probability

Procedia PDF Downloads 312
1323 Evaluation of Expected Annual Loss Probabilities of RC Moment Resisting Frames

Authors: Saemee Jun, Dong-Hyeon Shin, Tae-Sang Ahn, Hyung-Joon Kim

Abstract:

Building loss estimation methodologies, which have advanced considerably in recent decades, are usually used to estimate the social and economic impacts resulting from seismic structural damage. In accordance with these methods, this paper presents the evaluation of the annual loss probability of a reinforced concrete moment resisting frame designed according to the Korean Building Code. The annual loss probability is defined by (1) a fragility curve obtained from a capacity spectrum method similar to that adopted in HAZUS, and (2) a seismic hazard curve derived from annual frequencies of exceedance per peak ground acceleration. Seismic fragilities are computed to calculate the annual loss probability of a given structure using functions that depend on structural capacity, seismic demand, structural response, and the probability of exceeding the damage state thresholds. This study carried out a nonlinear static analysis to obtain the capacity of an RC moment resisting frame selected as a prototype building. The analysis results show that the probability of extensive structural damage in the prototype building is expected to be 0.004% in a year.
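The annual loss probability combines the two ingredients (1) and (2). A sketch, with an assumed lognormal fragility and an assumed power-law hazard curve rather than the paper's Korean hazard data, integrates the fragility against the slope of the hazard curve:

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import trapezoid

# (1) Illustrative lognormal fragility: median PGA 0.6 g, dispersion 0.5
median, beta = 0.6, 0.5
pga = np.linspace(0.01, 2.0, 400)
p_ds = lognorm.cdf(pga, s=beta, scale=median)  # P(damage state | PGA)

# (2) Illustrative hazard curve: annual frequency of exceedance ~ k0 * pga^-k
k0, k = 1e-4, 2.5
lam = k0 * pga ** (-k)

# Annual rate of reaching the damage state: integrate the fragility
# against the (negative) slope of the hazard curve over PGA
dlam = -np.gradient(lam, pga)
annual_rate = trapezoid(p_ds * dlam, pga)
```

For small rates the annual rate and the annual probability are nearly identical, which is why the result can be quoted directly as a yearly loss probability.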

Keywords: expected annual loss, loss estimation, RC structure, fragility analysis

Procedia PDF Downloads 398
1322 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper explores a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. To this end, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each of these phases is discussed in detail. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental set-up is demonstrated. From the collected data set, empirical PL and RSSI models are predicted using regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This research work may assist significantly in the deployment and optimisation of any cellular network.
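The regression phase typically fits a log-distance model PL(d) = PL(d0) + 10 n log10(d/d0) to the measurements. A sketch with synthetic drive-test data (the true exponent and reference loss are assumptions for the illustration) recovers the PL exponent by least squares:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic drive-test data: distances (m) and measured path loss (dB)
d0, n_true, pl_d0 = 1.0, 3.2, 40.0
d = rng.uniform(50.0, 1000.0, size=200)
pl = pl_d0 + 10.0 * n_true * np.log10(d / d0) + rng.normal(0.0, 6.0, size=200)

# Regressing PL on 10*log10(d/d0) recovers the PL exponent as the slope
x = 10.0 * np.log10(d / d0)
A = np.column_stack([x, np.ones_like(x)])
(n_hat, pl0_hat), *_ = np.linalg.lstsq(A, pl, rcond=None)
```

With the fitted model, the coverage probability at a given distance follows from the probability that the received signal exceeds the receiver threshold under the residual shadowing spread.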

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 179
1321 The Theory behind Logistic Regression

Authors: Jan Henrik Wosnitza

Abstract:

Logistic regression has developed into a standard approach for estimating conditional probabilities in a wide range of applications, including credit risk prediction. The article at hand contributes to the current literature on logistic regression fourfold: First, it is demonstrated that binary logistic regression automatically meets its model assumptions under very general conditions. This result explains, at least in part, the logistic regression's popularity. Second, the requirement of homoscedasticity in the context of binary logistic regression is theoretically substantiated. The variances among the groups of defaulted and non-defaulted obligors have to be the same across the levels of the aggregated default indicators in order to achieve linear logits. Third, this article sheds some light on the question of why nonlinear logits might be superior to linear logits in the case of a small amount of data. Fourth, an innovative methodology for estimating correlations between obligor-specific log-odds is proposed. In order to crystallize the key ideas, this paper focuses on the example of credit risk prediction. However, the results presented in this paper can easily be transferred to any other field of application.
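A minimal sketch of binary logistic regression for default probability estimation, fitted by gradient descent on synthetic obligor data (the single feature and the true coefficients are illustrative assumptions, not part of the article):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit binary logistic regression by gradient descent on the log-loss."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(3)
# Synthetic obligor feature (e.g. a standardized leverage ratio) and default flags
x = rng.normal(size=300)
y = (rng.uniform(size=300) < sigmoid(-1.0 + 2.0 * x)).astype(float)

X = np.column_stack([np.ones_like(x), x])  # intercept + feature
w = fit_logistic(X, y)
pd_new = sigmoid(w @ np.array([1.0, 0.5]))  # default probability at x = 0.5
```

The linear logit in this sketch is exactly the assumption the article examines: whether the log-odds of default are linear in the (aggregated) default indicators.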

Keywords: correlation, credit risk estimation, default correlation, homoscedasticity, logistic regression, nonlinear logistic regression

Procedia PDF Downloads 427
1320 Solving LWE by Progressive Pumps and Its Optimization

Authors: Leizhang Wang, Baocang Wang

Abstract:

The General Sieve Kernel (G6K) is currently considered the fastest algorithm for the shortest vector problem (SVP) and is the record holder of the open SVP challenge. We study the lattice basis quality improvement effects of the Workout proposed in G6K, which is composed of a series of pumps to solve SVP. Firstly, we use low-dimensional pump output bases to propose a predictor for the quality of high-dimensional pump output bases. Both theoretical analysis and experimental tests are performed to illustrate that it is more computationally expensive to solve LWE problems using G6K's default SVP solving strategy (Workout) than using lattice reduction algorithms (e.g. BKZ 2.0, Progressive BKZ, Pump, and Jump BKZ) with sieving as their SVP oracle. Secondly, the default Workout in G6K is optimized to achieve a stronger reduction at a lower computational cost. Thirdly, we combine the optimized Workout and the pump output basis quality predictor to further reduce the computational cost by optimizing the LWE instance selection strategy. In fact, we can solve the TU LWE challenge (n = 65, q = 4225, = 0.005) 13.6 times faster than with the G6K default Workout. Fourthly, we consider a combined two-stage (preprocessing by BKZ- and a big Pump) LWE solving strategy. Both stages use the dimension-for-free technique to give new theoretical security estimations of several LWE-based cryptographic schemes. The security estimations show that the security of these schemes under the conservative NewHope core-SVP model is somewhat overestimated. In addition, in the case of the LAC scheme, the LWE instance selection strategy can be optimized to further improve the LWE-solving efficiency by 15% and 57%. Finally, some experiments are implemented to examine the effects of our strategies on Normal Form LWE problems, and the results demonstrate that the combined strategy is four times faster than that of NewHope.

Keywords: LWE, G6K, pump estimator, LWE instances selection strategy, dimension for free

Procedia PDF Downloads 60
1319 A Hyperexponential Approximation to Finite-Time and Infinite-Time Ruin Probabilities of Compound Poisson Processes

Authors: Amir T. Payandeh Najafabadi

Abstract:

This article considers the problem of evaluating the infinite-time (or finite-time) ruin probability under a given compound Poisson surplus process by approximating the claim size distribution by a finite mixture of exponentials, i.e., a hyperexponential distribution. It restates the infinite-time (or finite-time) ruin probability as a solvable ordinary differential equation (or a partial differential equation). An application of our findings is given through a simulation study.
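For comparison with such analytical approximations, the finite-time ruin probability of a compound Poisson surplus process can also be estimated by straightforward Monte Carlo simulation. The sketch below assumes exponential claim sizes and illustrative parameter values:

```python
import numpy as np

def finite_time_ruin_prob(u, c, lam, claim_mean, horizon, n_paths=4000, seed=0):
    """Monte Carlo estimate of the finite-time ruin probability of a
    compound Poisson surplus process U(t) = u + c*t - S(t), where claims
    arrive at rate lam and claim sizes are exponential with mean claim_mean.
    The surplus only drops at claim instants, so only those are checked."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, total_claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)  # next claim arrival time
            if t > horizon:
                break                         # survived the horizon
            total_claims += rng.exponential(claim_mean)
            if u + c * t - total_claims < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# Illustrative parameters with 20% premium loading (theta = c/(lam*mu) - 1 = 0.2)
psi = finite_time_ruin_prob(u=10.0, c=1.2, lam=1.0, claim_mean=1.0, horizon=100.0)
```

For exponential claims, the classical infinite-time result psi(u) = exp(-theta*u/((1+theta)*mu))/(1+theta) evaluates to about 0.157 for these parameters, and a long-horizon simulation should land close to (slightly below) that value.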

Keywords: ruin probability, compound Poisson processes, mixture exponential (hyperexponential) distribution, heavy-tailed distributions

Procedia PDF Downloads 341
1318 Using Indigenous Games to Demystify Probability Theorem in Ghanaian Classrooms: Mathematical Analysis of Ampe

Authors: Peter Akayuure, Michael Johnson Nabie

Abstract:

Similar to many colonized nations in the world, one indelible mark left by the colonial masters after Ghana’s independence in 1957 has been the fact that many contexts used to teach statistics and probability concepts are alien and do not resonate with the social domain of the indigenous Ghanaian child. This has seriously limited the understanding, discovery, and application of mathematics for national development. In view of the recent curriculum demand of making the Ghanaian child mathematically literate, this qualitative study involved video recordings and a mathematical analysis of play sessions of an indigenous girls' game called Ampe, with the aim of demystifying the concepts in probability theorems, which are applied in mathematics-related fields of study. The mathematical analysis shows that the game of Ampe, which is widely played by schoolgirls in Ghana, is suitable for learning concepts of the probability theorems. It was also revealed that, as a girls' game, the use of Ampe provides good lessons for educators, textbook writers, and teachers to rethink the selection of mathematics tasks and learning contexts that are sensitive to gender. As we undertake to transform teacher education and student learning, the use of indigenous games should be critically revisited.
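Under a deliberately simplified model of Ampe (an assumption for illustration, not the paper's actual analysis): if each of two players independently thrusts the left or right foot forward with equal probability, the leader's chance of a match is 1/2, which a quick simulation confirms:

```python
import numpy as np

rng = np.random.default_rng(4)
n_rounds = 100_000

# Simplified model (an assumption): each player independently shows
# the left (0) or right (1) foot with probability 1/2 in every round
leader = rng.integers(0, 2, n_rounds)
opponent = rng.integers(0, 2, n_rounds)

# One common scoring convention: the leader scores when both show the same foot
p_score = np.mean(leader == opponent)
```

Even this toy version gives pupils a familiar context for equally likely outcomes, independence, and the law of large numbers, which is the pedagogical point the study argues for.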

Keywords: Ampe, mathematical analysis, probability theorem, Ghanaian girl game

Procedia PDF Downloads 372