Search results for: Keyword's probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 585


525 Modeling of Random Variable with Digital Probability Hyper Digraph: Data-Oriented Approach

Authors: A. Habibizad Navin, M. Naghian Fesharaki, M. Mirnia, M. Kargar

Abstract:

In this paper, we introduce the Digital Probability Hyper Digraph for modeling a random variable as a hierarchical, data-oriented model.

Keywords: Data-Oriented Models, Data Structure, Digital Probability Hyper Digraph, Random Variable, Statistics and Probability.

524 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. During the experimental study, parallel capacitors with the calculated minimum capacitive value were connected across the terminals of the induction motor operated as a SEIG with unregulated shaft speed. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine to operate as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
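
The reliability indices named above can be illustrated with a short Monte Carlo sketch. This is not the authors' experimental analysis; it assumes Weibull-distributed failure times with hypothetical parameters and estimates the failure density, cumulative failure distribution, survivor function and hazard empirically:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Hypothetical Weibull model for SEIG excitation time-to-failure (shape, scale in hours).
shape, scale = 1.8, 5000.0
samples = weibull_min.rvs(shape, scale=scale, size=20000, random_state=rng)

# Empirical reliability indices on a common time grid.
t = np.linspace(0.0, 15000.0, 300)
f = np.array([np.mean((samples >= a) & (samples < b))
              for a, b in zip(t[:-1], t[1:])]) / np.diff(t)   # failure density
F = np.array([np.mean(samples <= ti) for ti in t])            # cumulative failure distribution
R = 1.0 - F                                                   # survivor function (probability of success)
h = f / np.clip(R[:-1], 1e-12, None)                          # hazard rate

print("P(failure before 2000 h) ~", round(F[np.searchsorted(t, 2000.0)], 3))
print("P(success beyond 8000 h) ~", round(R[np.searchsorted(t, 8000.0)], 3))
```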

Keywords: Residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation.

523 Determination of Severe Loading Condition at Critical System Cascading Collapse Considering the Effect of Protection System Hidden Failure

Authors: N. A. Salim, M. M. Othman, I. Musirin, M. S. Serwan

Abstract:

Hidden failure in a protection system has been recognized as one of the main causes of power system instability leading to a system cascading collapse. This paper presents a computationally systematic approach used to obtain the estimated average probability of a system cascading collapse by considering the effect of the probability of hidden failure in the protection system. The estimated average probability of a system cascading collapse is then used to determine the severe loading condition contributing to a higher risk of critical system cascading collapse. This information is essential to the system utility, since it will assist the operator in determining the highest system loading point that can be reached prior to a critical system cascading collapse.

Keywords: Critical system cascading collapse, protection system hidden failure, severe loading condition.

522 Kernel Matching versus Inverse Probability Weighting: A Comparative Study

Authors: Andy Handouyahia, Tony Haddad, Frank Eaton

Abstract:

Recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods for estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare pairs of 1,080 estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
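
For readers unfamiliar with IPW, the following sketch shows the estimator on synthetic data (not the HRSDC/EBSM data): a logistic propensity-score model is fitted with scikit-learn, and the average effect on participants is estimated by weighting the comparison group by e(x)/(1 - e(x)):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic observational data: one confounder x drives both participation and earnings.
n = 5000
x = rng.normal(size=n)
t = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * x - 0.2))))     # programme participation
y = 1000 + 300 * x + 500 * t + rng.normal(0, 200, size=n)   # outcome; true effect = 500

# Propensity scores from a logistic model.
ps = LogisticRegression().fit(x.reshape(-1, 1), t).predict_proba(x.reshape(-1, 1))[:, 1]

# IPW estimate of the average treatment effect on the treated (ATT):
# treated units get weight 1, comparison units get weight e(x) / (1 - e(x)).
w = ps / (1 - ps)
att = y[t == 1].mean() - np.average(y[t == 0], weights=w[t == 0])
print("IPW ATT estimate:", round(att, 1))
```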

Keywords: Treatment effect, causal inference, observational studies, Propensity score based matching, Kernel Matching, Inverse Probability Weighting, Estimation methods for incremental effect.

521 Reliability Analysis of Underground Pipelines Using Subset Simulation

Authors: Kong Fah Tee, Lutfor Rahman Khan, Hongshuang Li

Abstract:

An advanced Monte Carlo simulation method, called Subset Simulation (SS), for the time-dependent reliability prediction of underground pipelines is presented in this paper. SS provides better resolution for low failure probability levels and efficiently investigates the rare failure events commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute probabilistic performance through statistical variables. SS gains its efficiency by expressing a small-probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
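
The core Subset Simulation idea, expressing a rare event as a product of larger conditional probabilities estimated level by level with component-wise Metropolis resampling, can be sketched as follows. The limit state and all parameters are hypothetical and far simpler than a pipeline reliability model:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical performance function: "failure" when x1 + x2 >= 7
    # for independent standard normals (exact probability ~ 3.7e-7).
    return x[..., 0] + x[..., 1] - 7.0

def subset_simulation(n=1000, p0=0.1, max_levels=10):
    d = 2
    x = rng.standard_normal((n, d))
    y = g(x)
    prob = 1.0
    for _ in range(max_levels):
        b = np.quantile(y, 1.0 - p0)          # intermediate threshold
        if b >= 0.0:                          # failure region reached
            break
        prob *= p0
        idx = np.argsort(y)[-int(p0 * n):]    # seeds for the next conditional level
        seeds, seeds_y = x[idx], y[idx]
        xs, ys = [], []
        for s, sy in zip(seeds, seeds_y):
            cur, cur_y = s.copy(), sy
            for _ in range(int(1.0 / p0)):
                # Component-wise Metropolis step in standard normal space.
                cand = cur.copy()
                for j in range(d):
                    prop = cur[j] + rng.uniform(-1.0, 1.0)
                    if rng.uniform() < norm.pdf(prop) / norm.pdf(cur[j]):
                        cand[j] = prop
                cand_y = g(cand[None, :])[0]
                if cand_y >= b:               # stay inside the conditional region
                    cur, cur_y = cand, cand_y
                xs.append(cur.copy())
                ys.append(cur_y)
        x, y = np.array(xs), np.array(ys)
    return prob * np.mean(y >= 0.0)

print("Subset Simulation estimate:", subset_simulation())
print("Exact value:              ", norm.sf(7.0 / np.sqrt(2.0)))
```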

Keywords: Underground pipelines, Probability of failure, Reliability and Subset Simulation.

520 Ruin Probabilities with Dependent Rates of Interest and Autoregressive Moving Average Structures

Authors: Fenglong Guo, Dingcheng Wang

Abstract:

This paper studies ruin probabilities in two discrete-time risk models with premiums, claims and rates of interest modelled by three autoregressive moving average processes. Generalized Lundberg inequalities for ruin probabilities are derived by using a recursive technique. A numerical example is given to illustrate the applications of these probability inequalities.
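
A drastically simplified version of such a risk model can be simulated directly. The sketch below uses i.i.d. interest rates and claims rather than the ARMA structures studied in the paper, and estimates a finite-horizon ruin probability by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(2)

def ruin_probability(u0=10.0, horizon=50, n_paths=200000):
    """Finite-horizon ruin probability for U_t = U_{t-1}(1 + r_t) + premium - claim_t,
    with i.i.d. interest rates and claims (a simplification of the ARMA setting)."""
    u = np.full(n_paths, u0)
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(horizon):
        r = rng.normal(0.03, 0.01, n_paths)       # random rate of interest
        claims = rng.exponential(0.95, n_paths)   # mean claim slightly below the premium of 1
        u = u * (1 + r) + 1.0 - claims
        ruined |= (u < 0)
        u = np.where(ruined, 0.0, u)              # freeze already-ruined paths
    return ruined.mean()

print("Estimated finite-horizon ruin probability:", ruin_probability())
```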

Keywords: Lundberg inequality, NWUC, Renewal recursive technique, Ruin probability

519 Wind Fragility for Honeycomb Roof Cladding Panels Using Screw Pull-Out Capacity

Authors: Viriyavudh Sim, Woo Young Jung

Abstract:

The failure of roof cladding mostly occurs due to the failure of the connection between claddings and purlins, i.e. the pull-out of the screw connecting the two parts when the pull-out load, for example from a typhoon, exceeds the resistance of the connection screw. As typhoon disasters in Korea are constantly on the rise, probabilistic risk assessment (PRA) has become a vital tool to evaluate the performance of civil structures. In this study, we attempted to determine the fragility of roof cladding with screw connections. An experimental study was performed to evaluate the pull-out resistance of screw joints between honeycomb panels and back frames. Subsequently, by means of the Monte Carlo simulation method, the probability of failure for these types of roof cladding was determined. The results show that the failure of roof cladding depends on its location on the roof; for example, the edge-most panel has the highest probability of failure.
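
The fragility computation can be sketched as a load-versus-capacity Monte Carlo experiment. The capacity and demand models below are hypothetical placeholders, not the measured pull-out statistics of the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def fragility(wind_speeds, n_sim=100000):
    """P(pull-out demand > screw capacity) at each wind speed, by Monte Carlo."""
    probs = []
    for v in wind_speeds:
        # Hypothetical lognormal pull-out capacity (N) from test statistics.
        capacity = rng.lognormal(mean=np.log(1500.0), sigma=0.2, size=n_sim)
        # Hypothetical wind uplift demand, proportional to v^2, with model uncertainty.
        demand = 0.8 * v**2 * rng.lognormal(mean=0.0, sigma=0.15, size=n_sim)
        probs.append(np.mean(demand > capacity))
    return np.array(probs)

speeds = np.arange(20, 61, 5)                 # wind speed in m/s
for v, p in zip(speeds, fragility(speeds)):
    print(f"v = {v:2d} m/s -> probability of failure ~ {p:.3f}")
```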

Keywords: Monte Carlo Simulation, roof cladding, screw pull-out strength, wind fragility

518 Improved Segmentation of Speckled Images Using an Arithmetic-to-Geometric Mean Ratio Kernel

Authors: J. Daba, J. Dubois

Abstract:

In this work, we improve a previously developed segmentation scheme aimed at extracting edge information from speckled images using a maximum likelihood edge detector. The scheme was based on finding a threshold for the probability density function of a new kernel, defined as the arithmetic mean-to-geometric mean ratio field over a circular neighborhood set, and, in a general context, is founded on a likelihood random field model (LRFM). The segmentation algorithm was applied to discriminated speckle areas obtained using simple elliptic discriminant functions based on measures of the signal-to-noise ratio with fractional order moments. A rigorous stochastic analysis was used to derive an exact expression for the cumulative distribution function of the random field. Based on this, an accurate probability of error was derived and the performance of the scheme was analysed. The improved segmentation scheme performed well for both simulated and real images and showed superior results to those previously obtained using the original LRFM scheme and standard edge detection methods. In particular, the false alarm probability was markedly lower than that of the original LRFM method, with oversegmentation artifacts virtually eliminated. The importance of this work lies in the development of a stochastic-based segmentation allowing an accurate quantification of the probability of false detection. Non-visual quantification and misclassification in medical ultrasound speckled images are relatively new and of interest to clinicians.
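
The arithmetic-to-geometric mean ratio kernel itself is easy to reproduce; the sketch below uses a square (rather than circular) window and a synthetic speckled image, so it only illustrates the idea, not the authors' LRFM scheme:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def am_gm_ratio(image, size=7, eps=1e-12):
    """Arithmetic-to-geometric mean ratio computed over a local square window."""
    img = image.astype(float) + eps
    am = uniform_filter(img, size=size)                  # local arithmetic mean
    gm = np.exp(uniform_filter(np.log(img), size=size))  # local geometric mean
    return am / gm

# Edges in a speckled image correspond to large AM/GM ratios; thresholding the
# ratio field (here at its 99th percentile, an arbitrary choice) marks edges.
rng = np.random.default_rng(0)
scene = np.ones((128, 128)); scene[:, 64:] = 4.0          # two homogeneous regions
speckled = scene * rng.exponential(1.0, scene.shape)      # multiplicative speckle
ratio = am_gm_ratio(speckled)
edges = ratio > np.quantile(ratio, 0.99)
print("edge pixels flagged:", int(edges.sum()))
```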

Keywords: Discriminant function, false alarm, segmentation, signal-to-noise ratio, skewness, speckle.

517 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

The principal purpose of this paper is to determine the influence of the maximum fatigue load on the probabilistic aspect of the fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments are carried out in laboratory air under different maximum fatigue load conditions to obtain the data for the statistical analysis. In order to analyze the probabilistic aspect of the fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack is implemented using the Anderson-Darling test. A well-fitting probability distribution of the fatigue crack propagation life is also verified under the different maximum fatigue load conditions.
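
A goodness-of-fit check of this kind can be sketched with SciPy. Since scipy.stats.anderson supports only a fixed set of reference distributions, the example tests a lognormal hypothesis by applying the normality version of the Anderson-Darling test to log-transformed, simulated crack propagation lives (hypothetical data, not the study's measurements):

```python
import numpy as np
from scipy.stats import anderson, lognorm

rng = np.random.default_rng(4)

# Simulated fatigue crack propagation lives (cycles) at a specified crack length,
# drawn from a hypothetical lognormal model for illustration.
lives = lognorm.rvs(s=0.25, scale=2.0e5, size=60, random_state=rng)

# Anderson-Darling test of the lognormal hypothesis: test log(lives) for normality.
result = anderson(np.log(lives), dist='norm')
print("A-D statistic:", round(result.statistic, 3))
for sl, cv in zip(result.significance_level, result.critical_values):
    verdict = "reject" if result.statistic > cv else "do not reject"
    print(f"  {sl:4.1f}% level: critical value {cv:.3f} -> {verdict}")
```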

Keywords: Fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability.

516 Evolutionary Dynamics on Small-World Networks

Authors: Jan Rychtar, Brian Stadler

Abstract:

We study how the outcome of evolutionary dynamics on graphs depends on the randomness of the graph structure. We gradually change the underlying graph from completely regular (e.g. a square lattice) to completely random. We find that the fixation probability increases as the randomness increases; nevertheless, the increase is not significant, and thus the fixation probability can be estimated by the known formulas for the underlying regular graphs.
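
A rough simulation of this kind of experiment is sketched below: a birth-death Moran process for a single mutant on Watts-Strogatz graphs with increasing rewiring probability, compared against the classical well-mixed fixation probability. The population size, fitness and run counts are arbitrary illustrative choices:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

def fixation_probability(G, r=1.5, runs=1000):
    """Estimate the fixation probability of a single mutant with relative
    fitness r under the birth-death Moran process on graph G."""
    n = G.number_of_nodes()
    fixed = 0
    for _ in range(runs):
        mutant = np.zeros(n, dtype=bool)
        mutant[rng.integers(n)] = True           # one mutant at a random node
        while 0 < mutant.sum() < n:
            fitness = np.where(mutant, r, 1.0)
            birth = int(rng.choice(n, p=fitness / fitness.sum()))
            death = rng.choice(list(G.neighbors(birth)))
            mutant[death] = mutant[birth]        # offspring replaces a random neighbour
        fixed += int(mutant.all())
    return fixed / runs

n, k, r = 20, 4, 1.5
for p in (0.0, 0.1, 1.0):                        # regular -> small-world -> random
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=2)
    print(f"rewiring p = {p}: fixation probability ~ {fixation_probability(G, r):.3f}")
print("Well-mixed Moran value:", (1 - 1 / r) / (1 - 1 / r**n))
```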

Keywords: evolutionary dynamics, small-world networks

515 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computations and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified through comparisons with other readily available approximations and with oceanic data.

Keywords: Ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness.

514 Quantifying and Adjusting the Effects of Publication Bias in Continuous Meta-Analysis

Authors: N.R.N. Idris

Abstract:

This study uses simulated meta-analysis to assess the effects of publication bias on meta-analysis estimates and to evaluate the efficacy of the trim and fill method in adjusting for these biases. The estimated effect sizes and the standard errors were evaluated in terms of statistical bias and coverage probability. The results demonstrate that if publication bias is not adjusted for, it could lead to up to 40% bias in the treatment effect estimates. Utilization of the trim and fill method can reduce the bias in the overall estimate by more than half. The method is optimal in the presence of moderate underlying bias but has minimal effect in the presence of low or severe publication bias. Additionally, the trim and fill method improves the coverage probability by more than half when subjected to the same level of publication bias as the unadjusted data. The method, however, tends to produce false positive results and will incorrectly adjust the data for publication bias up to 45% of the time. Nonetheless, the bias introduced into the estimates by this adjustment is minimal.

Keywords: Publication bias, Trim and Fill method, percentage relative bias, coverage probability

513 Direct Sequence Spread Spectrum Technique with Residue Number System

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

In this paper, residue number arithmetic is used in a direct sequence spread spectrum system; this system is evaluated, and its bit error probability is compared to that of a non-residue-number system. The effects of channel bandwidth, PN sequences, multipath and modulation scheme are studied. A MATLAB program is developed to measure the signal-to-noise ratio (SNR) and the bit error probability for the various schemes.

Keywords: Spread Spectrum, Direct sequence, Bit error probability and Residue number system.

512 Data Oriented Modeling of Uniform Random Variable: Applied Approach

Authors: Ahmad Habibizad Navin, Mehdi Naghian Fesharaki, Mirkamal Mirnia, Mohamad Teshnelab, Ehsan Shahamatnia

Abstract:

In this paper, we introduce a new data-oriented model of the uniform random variable that is well matched with computing systems. Due to this conformity with the structure of current computers, the model can be used efficiently in statistical inference.

Keywords: Uniform random variable, Data oriented modeling, Statistical inference, Prodigraph, Statistically complete tree, Uniform digital probability digraph, Uniform n-complete probability tree.

511 Optimization for Reducing Handoff Latency and Utilization of Bandwidth in ATM Networks

Authors: Pooja, Megha Kulshrestha, V. K. Banga, Parvinder S. Sandhu

Abstract:

To support mobility in ATM networks, a number of technical challenges need to be resolved. The impact of handoff schemes in terms of service disruption, handoff latency, cost implications and excess resources required during handoffs needs to be addressed. In this paper, a one-phase handoff and route optimization solution using reserved PVCs between adjacent ATM switches to reroute connections during inter-switch handoff is studied. In the second phase, a distributed optimization process is initiated to optimally reroute handoff connections. The main objective is to find the optimal operating point at which to perform optimization subject to a cost constraint, with the purpose of reducing the blocking probability of inter-switch handoff calls for delay-tolerant traffic. We examine the relation between the required bandwidth resources and the optimization rate. We also calculate and study the handoff blocking probability due to lack of bandwidth for the resources reserved to facilitate rapid rerouting.

Keywords: Wireless ATM, Mobility, Latency, Optimization rate and Blocking Probability.

510 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly experimented with in Italian Alpine catchments, is based on determining threshold discharges from past events and on finding whether or not an oncoming flood has a magnitude greater than some critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods as, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable over a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived based on historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over any rainfall threshold, is used in computing the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed using the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we can conclude that the Bayesian approach to flash flood forecasting provides more realistic forecasts than the FFG.
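
The Bayesian update itself is compact; the sketch below uses entirely hypothetical counts and likelihoods (not the Posina data) to show how the prior and the likelihood of the observed AMC/rainfall conditions combine into a posterior probability of flooding:

```python
# A minimal sketch of the Bayesian update described above, with hypothetical
# historical counts (not the Posina data).
prior_flood = 30 / 400                     # prior probability of a flash flood on a rainy day

# Likelihoods of observing (wet AMC, rainfall above threshold) given flood / no flood.
p_obs_given_flood = 0.70
p_obs_given_no_flood = 0.10

evidence = (p_obs_given_flood * prior_flood
            + p_obs_given_no_flood * (1 - prior_flood))
posterior_flood = p_obs_given_flood * prior_flood / evidence
print("Posterior probability of flooding:", round(posterior_flood, 3))
```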

Keywords: Flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina.

509 Stochastic Risk Analysis Framework for Building Construction Projects

Authors: Abdulkadir Abu Lawal

Abstract:

The study was carried out to establish the probability density functions of some selected building construction projects of similar complexity delivered using the Bill of Quantities (BQ) and Lump Sum (LS) forms of contract, and to draw a reliability scenario for each form of contract. Thirty such delivered projects are analyzed for each of the contract forms using Weibull analysis, and their Weibull parameters (α and β) are determined based on their completion times. For the projects delivered under the BQ form of contract, α is calculated as 1.6737E20 and β as +0.0115, and for the LS form, α is found to be 5.6556E03 and β is determined as +0.4535. Using these values, the respective probability density functions are calculated and plotted as a handy tool for risk analysis of future projects of similar characteristics. By input of variables from other projects, decision-making processes can be carried out for a whole project or its components using EVM analysis in project evaluation and review techniques. This framework, as a quantitative approach, depends on the assumption of normality in project completion times; it can help greatly in determining the completion-time probability for comparable projects using either of the contract forms under consideration. Project aspects that are not amenable to measurement, on the other hand, can be analyzed using fuzzy sets and fuzzy logic. This scenario can be drawn for different types of building construction projects, using different suitable forms of contract for project delivery.
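
The Weibull analysis step can be sketched with SciPy on synthetic completion times (the parameter values below are illustrative, not the study's estimates):

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(5)

# Hypothetical completion times (weeks) for 30 projects delivered under one contract form.
completion = weibull_min.rvs(c=2.2, scale=60.0, size=30, random_state=rng)

# Weibull analysis: estimate the shape (beta) and scale (alpha) parameters.
beta, loc, alpha = weibull_min.fit(completion, floc=0.0)
print(f"shape beta ~ {beta:.2f}, scale alpha ~ {alpha:.1f} weeks")

# Completion-time probability for a target duration, from the fitted CDF.
target = 70.0
print("P(completion <= 70 weeks) ~",
      round(weibull_min.cdf(target, beta, loc=0, scale=alpha), 3))
```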

Keywords: Building construction, Projects, Forms of contract, Probability density function, Reliability scenario.

508 Performance of Soft Handover Algorithm in Varied Propagation Environments

Authors: N. P. Singh, Brahmjit Singh

Abstract:

CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhanced communication quality. Cellular networks support multimedia services under varied propagation environmental conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path loss exponent, the standard deviation of shadow fading and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. It is shown through numerical results that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
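
The outage probability under log-distance path loss with lognormal shadow fading reduces to a Gaussian Q-function of the fade margin. The sketch below uses hypothetical link parameters to show how the path loss exponent and shadowing standard deviation enter the calculation:

```python
import numpy as np
from scipy.stats import norm

def outage_probability(d, pt_dbm=30.0, pl0_db=40.0, d0=1.0,
                       path_loss_exp=3.5, sigma_sh_db=8.0, p_min_dbm=-100.0):
    """P(received power < receiver sensitivity) under log-distance path loss
    with lognormal shadowing of standard deviation sigma_sh_db (all values hypothetical)."""
    mean_pr = pt_dbm - (pl0_db + 10 * path_loss_exp * np.log10(d / d0))
    margin = mean_pr - p_min_dbm
    return norm.sf(margin / sigma_sh_db)     # Q(margin / sigma)

for d in (100, 300, 1000, 3000):             # distance in metres
    print(f"d = {d:4d} m -> outage probability ~ {outage_probability(d):.4f}")
```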

Keywords: CDMA, Correlation coefficient, Path loss exponent, Probability of outage, Soft handover.

507 Parametric Modeling Approach for Call Holding Times for IP based Public Safety Networks via EM Algorithm

Authors: Badarch Tuyatsetseg

Abstract:

This paper presents parametric probability density models for call holding times (CHTs) in an emergency call center, based on actual data collected over a week in the public Emergency Information Network (EIN) in Mongolia. When the set of chosen candidates from the Gamma distribution family is fitted to the call holding time data, it is observed that the CHT empirical histogram is underestimated overall, due to spikes of higher probability and long tails of lower probability in the histogram. Therefore, we provide a parametric model based on a mixture of lognormal distributions (a Gaussian mixture on the logarithmic scale), with explicit analytical expressions, for modeling the CHTs of PSNs. Finally, we show that the CHTs for PSNs are fitted reasonably well by a mixture of lognormal distributions via the expectation-maximization (EM) algorithm. This result is significant as it provides a useful mathematical tool, a mixture of lognormal distributions, in an explicit form.
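
Fitting a mixture of lognormal distributions by EM can be sketched by fitting a Gaussian mixture to log-durations (the two are equivalent). The call holding times below are synthetic, not the EIN data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)

# Synthetic call holding times (seconds): short routine calls plus a long-tailed component.
cht = np.concatenate([rng.lognormal(3.0, 0.4, 7000),     # ~20 s calls
                      rng.lognormal(5.0, 0.6, 3000)])    # ~150 s calls

# A mixture of lognormals for the durations is a Gaussian mixture for log-durations;
# GaussianMixture fits it with the EM algorithm.
gm = GaussianMixture(n_components=2, random_state=0).fit(np.log(cht).reshape(-1, 1))
means = gm.means_.ravel()                    # mu of each lognormal component
sigmas = np.sqrt(gm.covariances_.ravel())    # sigma of each lognormal component
for w, m, s in zip(gm.weights_, means, sigmas):
    print(f"weight {w:.2f}: lognormal(mu={m:.2f}, sigma={s:.2f})")
```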

Keywords: A mixture of lognormal distributions, modeling call holding times, public safety network.

506 Cognitive Relaying in Interference Limited Spectrum Sharing Environment: Outage Probability and Outage Capacity

Authors: Md Fazlul Kader, Soo Young Shin

Abstract:

In this paper, we consider a cognitive relay network (CRN) in which the primary receiver (PR) is protected by peak transmit power (P̄_ST) and/or peak interference power (Q) constraints. In addition, the interference effect from the primary transmitter (PT) is considered to show its impact on the performance of the CRN. We investigate the outage probability (OP) and outage capacity (OC) of the CRN by deriving closed-form expressions over the Rayleigh fading channel. Results show that both the OP and OC improve as the number of cooperative relay nodes increases, as well as when the PT is far away from the secondary receiver (SR).
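
For the basic dual-hop decode-and-forward case (ignoring the interference and power-constraint details of the paper), the outage probability over Rayleigh fading can be checked by Monte Carlo against the standard closed form, as sketched below with arbitrary average SNRs:

```python
import numpy as np

rng = np.random.default_rng(7)

# Average per-hop SNRs (linear) and the outage SNR threshold (arbitrary illustrative values).
g_sr, g_rd, g_th = 10.0, 8.0, 2.0

# Under Rayleigh fading, the instantaneous per-hop SNRs are exponential; for
# decode-and-forward, the end-to-end SNR is the minimum of the two hops.
n = 1_000_000
snr_sr = rng.exponential(g_sr, n)
snr_rd = rng.exponential(g_rd, n)
p_out_mc = np.mean(np.minimum(snr_sr, snr_rd) < g_th)

# Closed form: P_out = 1 - exp(-g_th/g_sr) * exp(-g_th/g_rd).
p_out_cf = 1.0 - np.exp(-g_th / g_sr) * np.exp(-g_th / g_rd)
print(f"Monte Carlo: {p_out_mc:.4f}, closed form: {p_out_cf:.4f}")
```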

Keywords: Cognitive relay, outage, interference limited, decode-and-forward (DF).

505 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, a possibility of determining a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. Afterwards, these models are compared and verified on a control sample with a view to choosing the best one. The second part of the paper is aimed at the application of the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. To this end, the values of particular indicators are sampled randomly and the distribution of the PDs is estimated, while it is assumed that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma and Normal Inverse Gaussian models, in particular). Although the obtained results show that all the banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability. This is indicated by the estimation of various quantiles in the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
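
The logit variant of such a credit-scoring model can be sketched on synthetic bank indicators (hypothetical ratios and coefficients, not the US bank sample used in the paper):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)

# Synthetic bank indicators: capital adequacy ratio and return on assets (percent).
n = 300
car = rng.normal(12.0, 3.0, n)
roa = rng.normal(1.0, 0.8, n)
logit = -1.0 - 0.35 * (car - 12.0) - 1.2 * roa          # weaker banks default more often
default = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([car, roa])
model = LogisticRegression().fit(X, default)

# Probability of default for a hypothetical bank with CAR 14% and ROA 1.2%.
pd_hat = model.predict_proba([[14.0, 1.2]])[0, 1]
print("Estimated PD:", round(pd_hat, 3))
```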

Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.

504 Probability Distribution of Rainfall Depth at Hourly Time-Scale

Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman

Abstract:

Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control as well as water quality control systems. This paper aims to study the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected, and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-squared tests, were used to evaluate their fitness. Results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast. However, the rainfall frequency is found to be irregular. Results from the goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
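
The distribution-fitting step can be sketched with SciPy on synthetic hourly rainfall depths; the example fits the three candidate distributions and applies the Kolmogorov-Smirnov test (the paper also uses the Anderson-Darling and Chi-squared tests):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Synthetic non-zero hourly rainfall depths (mm) for one station.
rain = stats.genpareto.rvs(c=0.15, scale=3.0, size=2000, random_state=rng)

candidates = {
    "Generalized Pareto": stats.genpareto,
    "Exponential": stats.expon,
    "Gamma": stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(rain, floc=0.0)            # fix the location at zero for depths
    # Note: K-S p-values are only approximate when the parameters are
    # estimated from the same sample being tested.
    ks_stat, p_value = stats.kstest(rain, dist.cdf, args=params)
    print(f"{name:18s} K-S statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
```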

Keywords: Goodness-of-fit test, Hourly rainfall, Malaysia, Probability distribution.

503 Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction

Authors: Mussa I. Mgwatu, Reuben R. M. Kainkwa

Abstract:

Wind is among the potential energy resources which can be harnessed to generate wind energy for conversion into electrical power. Due to the variability of wind speed with time and height, it becomes difficult to predict the generated wind energy optimally. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speeds and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), respectively, both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power law equation with an exponent of 0.47. Data were analysed using MINITAB statistical software to show the variability of wind speed with time and height and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
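
The extrapolation and Weibull-fitting steps can be sketched as follows, using synthetic 2 m wind speeds rather than the Makambako measurements:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(10)

# Synthetic wind speeds (m/s) measured at 2 m height.
v2 = weibull_min.rvs(c=2.0, scale=3.5, size=5000, random_state=rng)

# Power-law extrapolation to 10 m with the exponent used in the paper (0.47).
v10 = v2 * (10.0 / 2.0) ** 0.47

# Fit a Weibull distribution to the extrapolated speeds.
k, loc, c = weibull_min.fit(v10, floc=0.0)
print(f"Weibull shape k ~ {k:.2f}, scale c ~ {c:.2f} m/s")

# Mean wind power density (W/m^2) from the fitted model, assuming air density 1.225 kg/m^3.
rho = 1.225
mean_v3 = weibull_min.moment(3, k, loc=0, scale=c)
print("Mean power density ~", round(0.5 * rho * mean_v3, 1), "W/m^2")
```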

Keywords: Probabilistic models, wind speed, wind energy

502 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers

Authors: Quanru Pan

Abstract:

How a business should set its service rates to maximize profit is a problem worthy of study, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system on arrival; the mean number of customers who leave the system due to impatience; the loss probability for customers who do not join the queue because of the limited capacity of the system; and many other indicators. This paper also shows that the following claim is not correct: the more customers the business serves, the more profit it will make. Finally, the paper points out the appropriate service rates the business should maintain to maximize profit.
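
For orientation, the classical constant-rate M/M/1/N queue (a simplification of the variable-rate, impatient-customer model studied here) has a closed-form stationary distribution, from which the loss probability due to limited capacity follows directly:

```python
import numpy as np

def mm1n_stationary(lam, mu, N):
    """Stationary distribution of the classical M/M/1/N queue (constant rates)."""
    rho = lam / mu
    p = rho ** np.arange(N + 1)
    p /= p.sum()                       # normalisation also covers rho = 1
    return p

lam, mu, N = 4.0, 5.0, 10              # arrival rate, service rate, system capacity
p = mm1n_stationary(lam, mu, N)
print("Blocking (loss) probability p_N:", round(p[-1], 4))
print("Mean number in system:", round(np.dot(np.arange(N + 1), p), 3))
print("Effective throughput:", round(lam * (1 - p[-1]), 3))
```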

Keywords: variable input rates, impatient customer, variable service rates, profit maximization.

501 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study considering six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.

Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.

500 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error

Authors: Oscar Javier Herrera, Manuel Ángel Camacho

Abstract:

This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology began with characterization and diagnosis of the demand-planning process as part of production management; new ways to predict demand using probability techniques, and to calculate the associated error, were then investigated using numerical methods, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector, located in the city of Bogotá, Colombia. In conclusion, using this application it was possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase its client satisfaction rate, reduce stock which had not rotated for a long time, code its inventory, and plan reorder points for the replenishment of stock.
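
A minimal sketch of forecasting from an empirical distribution with error propagation is given below, using synthetic demand histories; for independent items, the variance of the total forecast is the sum of the item variances (a root-sum-of-squares standard error):

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic monthly demand history for three products of a service company.
history = {
    "A": rng.poisson(120, 36),
    "B": rng.poisson(45, 36),
    "C": rng.poisson(200, 36),
}

point, var = {}, {}
for item, demand in history.items():
    # Forecast from the empirical probability function: here its mean,
    # with the empirical standard error of that mean as the uncertainty.
    point[item] = demand.mean()
    var[item] = demand.var(ddof=1) / len(demand)

# Propagation of error for the total demand (independent items): variances add.
total = sum(point.values())
total_err = np.sqrt(sum(var.values()))
print(f"Total forecast ~ {total:.1f} +/- {total_err:.1f} units")
```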

Keywords: Demand Forecasting, Empirical Distribution, Propagation of Error.

499 An Extension of the Kratzel Function and Associated Inverse Gaussian Probability Distribution Occurring in Reliability Theory

Authors: R. K. Saxena, Ravi Saxena

Abstract:

In view of their importance and usefulness in reliability theory and probability distributions, several generalizations of the inverse Gaussian distribution and the Krätzel function have been investigated in recent years. This has motivated the authors to introduce and study a new generalization of the inverse Gaussian distribution and the Krätzel function, associated with the product of a Bessel function of the third kind K_Q(z) and a Fox-Wright generalized hypergeometric function introduced in this paper. The introduced function turns out to be a unified gamma-type function. Its incomplete forms are also discussed. Several properties of this gamma-type function are obtained. By means of this generalized function, we introduce a generalization of the inverse Gaussian distribution, which is useful in reliability analysis, diffusion processes, radio techniques, etc. The inverse Gaussian distribution thus introduced also provides a generalization of the Krätzel function. Some basic statistical functions associated with this probability density function, such as the moments, the Mellin transform, the moment generating function, the hazard rate function, and the mean residual life function, are also obtained.

Keywords: Fox-Wright function, Inverse Gaussian distribution, Krätzel function, Bessel function of the third kind.

498 Investigation and Calculation of Seismic Reliability of Structures

Authors: Panam. Zarfam, Mohsen. Javan Pour

Abstract:

Recently, the analysis and design of structures based on reliability theory have been the center of attention. The reason for this attention is the existence of natural, random structural parameters such as material specifications, external loads and geometric dimensions. By means of reliability theory, the uncertainties resulting from the statistical nature of the structural parameters can be turned into mathematical equations, and safety and operational considerations can be incorporated into the design process. According to this theory, it is possible to study the failure probability of not only a specific element but also the entire system. Therefore, after being assured of the safety of every element, their reciprocal effects on the safety of the entire system can be investigated.

Keywords: Probability, Reliability, Statistics, Uncertainty

497 Confidence Intervals for the Normal Mean with Known Coefficient of Variation

Authors: Suparat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals for the normal population mean with known coefficient of variation. This situation occurs commonly in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. The two new confidence intervals are based on the recent work of Searls [5] and on a new method proposed in this paper for the first time. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. Monte Carlo simulation is used to assess the performance of these intervals based on their expected lengths.
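
The kind of Monte Carlo assessment described, estimating coverage probability and expected length, can be sketched for one simple known-CV interval (not the intervals proposed in the paper) and compared with the usual t-interval:

```python
import numpy as np
from scipy.stats import norm, t as t_dist

rng = np.random.default_rng(12)

mu, tau, n, alpha, reps = 10.0, 0.3, 15, 0.05, 20000   # tau = known coefficient of variation
z = norm.ppf(1 - alpha / 2)
c = z * tau / np.sqrt(n)
tq = t_dist.ppf(1 - alpha / 2, n - 1)

cover_cv = cover_t = 0
len_cv = len_t = 0.0
for _ in range(reps):
    x = rng.normal(mu, tau * mu, n)
    xbar, s = x.mean(), x.std(ddof=1)
    # Interval exploiting the known CV: (xbar - mu) / (tau * mu / sqrt(n)) ~ N(0, 1).
    lo, hi = xbar / (1 + c), xbar / (1 - c)
    cover_cv += (lo <= mu <= hi); len_cv += hi - lo
    # Usual t-interval for comparison.
    half = tq * s / np.sqrt(n)
    cover_t += (xbar - half <= mu <= xbar + half); len_t += 2 * half

print(f"known-CV interval: coverage ~ {cover_cv/reps:.3f}, expected length ~ {len_cv/reps:.3f}")
print(f"t interval:        coverage ~ {cover_t/reps:.3f}, expected length ~ {len_t/reps:.3f}")
```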

Keywords: confidence interval, coverage probability, expected length, known coefficient of variation.

496 Performance Verification of Seismic Design Codes for RC Frames

Authors: Payam Asadi, Ali Bakhshi

Abstract:

In this study, a framework for the verification of well-known seismic design codes is utilized. To verify the performance of the seismic codes, the damage quantity of RC frames is compared with the target performance. Due to the random nature of seismic design and earthquake load excitation, fragility curves are developed in this paper. These diagrams are utilized to evaluate the performance level of structures designed according to the seismic codes. The diagrams further illustrate the effect of the load combination and reduction factors of the codes on the probability of damage exceedance. Two types of structures are designed according to different seismic codes: very high-importance structures with high ductility and medium-importance structures with intermediate ductility. The results reveal that a lower damage ratio usually generates a lower probability of exceedance. In addition, the findings indicate that there are buildings with a higher quantity of reinforcing bars that have a higher probability of damage exceedance. Life-cycle cost analysis is utilized for comparison and for the final decision-making process.
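
Fragility curves of the kind described are commonly parameterised as lognormal functions of the intensity measure; the sketch below evaluates the probability of damage exceedance for two hypothetical designs (the median capacities and dispersion are illustrative, not the study's values):

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median=0.6, beta=0.45):
    """Lognormal fragility: P(damage state exceeded | intensity measure im)."""
    return norm.cdf(np.log(im / median) / beta)

# Probability of damage exceedance at several spectral accelerations (g),
# for two hypothetical designs with different median capacities.
for im in (0.2, 0.4, 0.6, 0.8, 1.0):
    print(f"Sa = {im:.1f} g: ductile design {fragility(im, 0.8):.3f}, "
          f"intermediate design {fragility(im, 0.6):.3f}")
```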

Keywords: RC frame, fragility curve, performance-based design, life-cycle cost analysis, seismic design codes.
