Search results for: Weibull bi-parameter probability function
6004 Optimization of Reliability Test Plans: Increasing Wafer Fabrication Equipment Uptime
Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita
Abstract:
Semiconductor processing chambers tend to operate in controlled but aggressive operating conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of this equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers in terms of tool downtime (reduced throughput) and for equipment manufacturers in terms of high warranty costs and customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be completed before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with a proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that follow Weibull distributions with different beta values. Without an a priori Weibull beta for the failure mode under consideration, a test plan leads to over- or under-utilization of resources, which eventually ends up in false positive or false negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both (independent of the a priori Weibull beta). This methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease sample size/test duration.
Keywords: reliability, stochastics, preventive maintenance
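As an illustrative aside (not from the paper), the sensitivity the abstract describes can be seen in the standard zero-failure (success-run) demonstration relation, where the required sample size depends strongly on the assumed Weibull beta. The function name and numbers below are hypothetical:

```python
import numpy as np

def zero_failure_sample_size(reliability, confidence, duration_ratio, beta):
    """Sample size for a zero-failure (success-run) Weibull demonstration test.

    reliability    : target reliability R at the mission time
    confidence     : demonstration confidence level C
    duration_ratio : test duration / mission time (L)
    beta           : assumed Weibull shape parameter
    """
    return np.log(1.0 - confidence) / (duration_ratio ** beta * np.log(reliability))

# Sensitivity of the plan to the assumed beta (the issue the paper addresses):
for beta in (0.8, 1.0, 2.0, 3.0):
    n = zero_failure_sample_size(0.90, 0.90, duration_ratio=2.0, beta=beta)
    print(f"beta={beta}: test {np.ceil(n):.0f} units for 2x the mission time")
```

With these inputs the required sample size ranges from 13 units (beta = 0.8) down to 3 units (beta = 3), which is why a plan designed without knowing beta over- or under-utilizes resources.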
Procedia PDF Downloads 15
6003 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. The critical-value table was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strengths of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the expansion to the two-dimensional case has been completed, which allows testing up to five parameters jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
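As a hedged illustration of the central step (the absolute density difference rewritten through CDFs), the sketch below computes that measure for two univariate normals. For normal densities the integrand changes sign only at the (at most two) density crossings, so the integral reduces to |F1 - F2| increments between crossings. The function name and parameter values are ours, not the paper's:

```python
import numpy as np
from scipy.stats import norm

def density_difference_measure(mu1, s1, mu2, s2):
    """Integral of |f1 - f2| over R, expressed through the CDFs F1, F2.
    Assumes the two normal distributions are not identical."""
    # density crossings: log f1(x) = log f2(x) is a quadratic in x
    a = 0.5 / s2**2 - 0.5 / s1**2
    b = mu1 / s1**2 - mu2 / s2**2
    c = mu2**2 / (2 * s2**2) - mu1**2 / (2 * s1**2) + np.log(s2 / s1)
    roots = np.roots([a, b, c]) if abs(a) > 1e-15 else np.array([-c / b])
    xs = np.sort(roots[np.isreal(roots)].real)
    G = lambda x: norm.cdf(x, mu1, s1) - norm.cdf(x, mu2, s2)
    pts = np.concatenate(([-np.inf], xs, [np.inf]))  # G vanishes at +/- infinity
    return sum(abs(G(pts[i + 1]) - G(pts[i])) for i in range(len(pts) - 1))

# equal variances: single crossing at 0.5, measure = 2*(2*Phi(0.5) - 1) ~ 0.766
print(density_difference_measure(0.0, 1.0, 1.0, 1.0))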
Procedia PDF Downloads 174
6002 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm
Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh
Abstract:
In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are stochastic parameters following arbitrary probability distributions. To do so, we use dependent chance programming (DCP), which maximizes the probability of the event that the total purchasing cost, total late deliveries, and total rejected items are less than or equal to pre-determined values given by the decision maker. After transforming the above-mentioned stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter single-objective problem. The employed genetic algorithm performs a simulation process in order to calculate the stochastic objective function as its fitness function. Finally, we explore the impact of the stochastic parameters on the given solution via a sensitivity analysis exploiting the coefficient of variation. The results show that as the stochastic parameters have greater coefficients of variation, the value of the objective function in the stochastic single-objective programming problem worsens.
Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection
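A minimal sketch (our illustration, with made-up price distributions, not the paper's model) of the simulation-based fitness evaluation the abstract describes: for one candidate order allocation, the dependent-chance objective P(total cost <= budget) is estimated by sampling the stochastic parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def chance_fitness(order_qty, budget, n_sim=5000):
    """Dependent-chance fitness: estimate P(total purchasing cost <= budget)
    by simulating the stochastic unit prices (fitness of one GA chromosome)."""
    q = np.asarray(order_qty, dtype=float)
    unit_cost = rng.normal(10.0, 2.0, size=(n_sim, q.size))  # stochastic prices
    return np.mean(unit_cost @ q <= budget)

# one candidate allocation across three suppliers
print(chance_fitness([30, 50, 20], budget=1050.0))
```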
Procedia PDF Downloads 256
6001 COVID-19 Teaches Probability Risk Assessment
Authors: Sean Sloan
Abstract:
Probability Risk Assessment (PRA) can be a difficult concept for students to grasp. In searching for different ways to describe PRA and relate it to students' lives, COVID-19 came up, and the parallels are striking. Soon students began analyzing acceptable risk with the virus, which helped them quantify just how dangerous 'dangerous' is. The original lesson was set aside, and for the remainder of the period the probability and lethality of risk became the topic. Spreading events, such as a COVID carrier on an airliner, became analogous to single-fault casualties such as a tsunami. Odds of spreading became odds of backup-diesel-generator failure, as at Fukushima Daiichi. Fatalities from the disease became expected fatalities due to radiation spread. Quantification took the discussion from hyperbole and emotion to a footing on which guidelines could be rationally based. It has been one of the most effective educational devices observed.
Keywords: COVID, education, probability, risk
Procedia PDF Downloads 152
6000 Saliency Detection Using a Background Probability Model
Authors: Junling Li, Fang Meng, Yichun Zhang
Abstract:
Image saliency detection has long been studied, while several challenging problems remain unsolved, such as inaccurate detection in complex scenes or suppression of salient objects at the image borders. In this paper, we propose a new saliency detection algorithm to solve these problems. We represent the image as a graph with superpixels as nodes. By considering the appearance similarity between the boundary and the background, the proposed method chooses non-salient boundary nodes as background priors to construct the background probability model. The probability that each node belongs to the model is computed, which measures its similarity with the background. We can thus calculate saliency using the transformed probability as a metric. We compare our algorithm with ten state-of-the-art saliency detection methods on a public database. Experimental results show that our simple and effective approach can tackle the challenging problems that have long hindered image saliency detection.
Keywords: visual saliency, background probability, boundary knowledge, background priors
Procedia PDF Downloads 429
5999 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change
Authors: Ali Razmi, Saeed Golian
Abstract:
Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data from the Central Park and Battery Park stations in a coastal area of Manhattan, New York City, to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal, and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600, and 1000 years was determined and compared with the severity of individual events. The trend analysis results showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
Keywords: climate change, climate variables, copula, joint probability
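As a hedged sketch of the copula step (the abstract does not specify which copula family was selected; the Gaussian copula and the parameter values below are our assumptions for illustration), the joint exceedance probability of two margins follows from the survival relation P(U > u, V > v) = 1 - u - v + C(u, v):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def joint_exceedance(u, v, rho):
    """P(U > u, V > v) for marginal non-exceedance probabilities u, v
    under a Gaussian copula with correlation rho."""
    z = norm.ppf([u, v])
    C = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).cdf(z)
    return 1 - u - v + C

# joint event where rainfall and surge each sit at their 100-year level;
# independence would give a 10,000-year joint return period instead
p = joint_exceedance(0.99, 0.99, rho=0.6)
print("joint return period:", 1 / p, "years")
```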
Procedia PDF Downloads 360
5998 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category
Authors: Viriyavudh Sim, WooYoung Jung
Abstract:
Window systems in high-rise buildings are occasionally subjected to excessive wind intensity, particularly during typhoons. The failure of a window system does not affect the overall safety of the structural performance; however, it can endanger the safety of the residents. In this paper, fragility curves for the window systems of two residential buildings were compared. The probability of failure for each individual window was determined with the Monte Carlo simulation method, and a lognormal cumulative distribution function was then used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load and that the probability of failure for each window panel increased at higher floors.
Keywords: wind fragility, window system, high rise building, wind disaster
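A hedged sketch of the two-step procedure described above (Monte Carlo failure probabilities per wind speed, then a lognormal fragility fit); the resistance and pressure-coefficient distributions are invented for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def failure_prob(wind_speed, n_sim=10_000):
    """MC estimate of window-panel failure probability at one wind speed (m/s)."""
    resistance = rng.lognormal(np.log(3.0), 0.2, n_sim)             # kPa, assumed
    demand = 0.5 * 1.225e-3 * wind_speed**2 * rng.normal(1.8, 0.3, n_sim)  # kPa
    return np.mean(demand >= resistance)

speeds = np.arange(20, 91, 5)
pf = np.array([failure_prob(v) for v in speeds])

# lognormal fragility Pf(v) = Phi((ln v - ln median) / beta), via probit regression
mask = (pf > 0) & (pf < 1)
A = np.vstack([np.log(speeds[mask]), np.ones(mask.sum())]).T
slope, intercept = np.linalg.lstsq(A, norm.ppf(pf[mask]), rcond=None)[0]
beta, median = 1 / slope, np.exp(-intercept / slope)
print(f"fragility median = {median:.1f} m/s, dispersion beta = {beta:.2f}")
```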
Procedia PDF Downloads 314
5997 Sufficient Conditions for Exponential Stability of Stochastic Differential Equations with Nontrivial Solutions
Authors: Fakhreddin Abedi, Wah June Leong
Abstract:
Exponential stability of stochastic differential equations with nontrivial solutions is provided in terms of Lyapunov functions. The main result of this paper establishes that, under certain hypotheses on the dynamics f(.) and g(.), practical exponential stability in probability in a small neighborhood of the origin is equivalent to the existence of an appropriate Lyapunov function. Indeed, we establish exponential stability of a stochastic differential equation when almost all state trajectories are bounded and approach a sufficiently small neighborhood of the origin. We derive sufficient conditions for exponential stability of stochastic differential equations. Finally, we give a numerical example illustrating our results.
Keywords: exponential stability in probability, stochastic differential equations, Lyapunov technique, Itô's formula
Procedia PDF Downloads 52
5996 Robust Noisy Speech Identification Using Frame Classifier Derived Features
Authors: Punnoose A. K.
Abstract:
This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probabilities of clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent score is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to separate noisy speech recordings from clean ones. The approach is benchmarked on standard databases, with a focus on precision.
Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering
Procedia PDF Downloads 127
5995 Optimal Scheduling for Energy Storage System Considering Reliability Constraints
Authors: Wook-Won Kim, Je-Seok Shin, Jin-O Kim
Abstract:
This paper proposes a method for the optimal scheduling of a battery energy storage system subject to a reliability constraint. The optimal scheduling problem is solved by dynamic programming with the proposed transition matrix. The proposed method guarantees the minimum fuel cost within a specified reliability constraint. To evaluate the proposed method, a time-dependent capacity outage probability table (COPT) is used, calculated by convolving the outage probability mass functions of the individual generators. The study presents the resulting optimal schedule of the energy storage system.
Keywords: energy storage system (ESS), optimal scheduling, dynamic programming, reliability constraints
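The COPT construction mentioned above is a standard convolution; a minimal sketch (the unit capacities and forced outage rates are invented for illustration):

```python
import numpy as np

def copt(units):
    """Capacity outage probability table by convolving each generator's
    two-point outage pmf; units = [(capacity_MW, forced_outage_rate), ...]."""
    step = np.gcd.reduce([c for c, _ in units])
    total = sum(c for c, _ in units)
    pmf = np.zeros(total // step + 1)
    pmf[0] = 1.0
    for cap, q in units:
        shifted = np.zeros_like(pmf)
        shifted[cap // step:] = pmf[:len(pmf) - cap // step]
        pmf = (1 - q) * pmf + q * shifted   # unit in service / unit on outage
    return np.arange(0, total + step, step), pmf

cap_out, prob = copt([(100, 0.02), (100, 0.02), (50, 0.04)])
cum = np.cumsum(prob[::-1])[::-1]           # P(capacity outage >= X)
for c, p in zip(cap_out, cum):
    print(f"outage >= {c:3d} MW : {p:.6f}")
```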
Procedia PDF Downloads 406
5994 The Implementation of the Secant Method for Finding the Root of an Interpolation Function
Authors: Nur Rokhman
Abstract:
A mathematical function gives the relationship between the variables composing it. Interpolation can be viewed as the process of finding a mathematical function that passes through some specified points. There are many interpolation methods, such as the Lagrange, Newton, and spline methods. Under some conditions, such as a large number of interpolation points, the interpolation function cannot be written explicitly; such a function consists only of computational steps. Solving an equation involving the interpolation function is therefore a nonlinear root-finding problem. Newton's method will not work on such an interpolation function, because its derivative cannot be written explicitly. This paper shows the use of the secant method to determine the numerical solution of equations involving the interpolation function. The experiments show that the secant method works better than Newton's method in finding the root of a Lagrange interpolation function.
Keywords: secant method, interpolation, nonlinear function, numerical solution
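A minimal sketch of the combination the abstract describes: a Lagrange interpolant available only as computational steps, with a derivative-free secant iteration finding its root (the sample points are invented):

```python
def lagrange(xs, ys):
    """Return the Lagrange interpolant through the points (xs, ys) as a callable."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return f

def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Secant iteration: derivative-free, so it works on functions
    that exist only as computational steps."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

f = lagrange([0.0, 1.0, 2.0, 3.0], [-1.0, 0.5, 0.2, 2.0])
print("root between 0 and 1:", secant(f, 0.0, 1.0))
```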
Procedia PDF Downloads 379
5993 Examining the Relationship between Chi-Square Test Statistics and Skewness of Weibull Distribution: Simulation Study
Authors: Rafida M. Elobaid
Abstract:
Most of the literature on goodness-of-fit tests tries to provide a theoretical basis for studying empirical distribution functions; such goodness-of-fit tests include the Kolmogorov-Smirnov and Cramér-von Mises type tests. However, most of the literature has not focused in detail on the relationship between the values of the test statistics and skewness or kurtosis. The aim of this study is to investigate the behavior of the χ² test statistic as the skewness of a right-skewed distribution varies. A simulation study is conducted in which random numbers are generated from the Weibull distribution. For fixed sample sizes, different levels of skewness are considered, and the corresponding values of the χ² test statistic are calculated. Across the sample sizes used, the results show an inverse relationship between the value of the χ² test statistic and the level of skewness for the Weibull distribution, i.e., the value of the χ² test statistic decreases as the skewness increases. The results also show that with large values of skewness, we are more confident that the data follow the assumed distribution. The nonparametric Kendall τ test is used to confirm these results.
Keywords: goodness-of-fit test, chi-square test, simulation, continuous right skewed distributions
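A skeleton of this kind of simulation (our construction: the binning scheme, sample size, and replication counts are arbitrary choices, and the sign of the association can depend on how the bins are formed):

```python
import numpy as np
from scipy.stats import weibull_min, chisquare, kendalltau

rng = np.random.default_rng(7)
n, bins, reps = 500, 12, 50
skews, mean_stats = [], []
for shape in np.linspace(0.9, 3.5, 14):   # smaller shape -> larger right skew
    vals = []
    for _ in range(reps):
        x = weibull_min.rvs(shape, size=n, random_state=rng)
        edges = np.linspace(0.0, x.max(), bins + 1)          # equal-width bins
        observed, _ = np.histogram(x, edges)
        expected = n * np.diff(weibull_min.cdf(edges, shape))
        expected *= observed.sum() / expected.sum()          # match totals
        vals.append(chisquare(observed, expected).statistic)
    skews.append(float(weibull_min.stats(shape, moments="s")))
    mean_stats.append(float(np.mean(vals)))

tau, p = kendalltau(skews, mean_stats)
print(f"Kendall tau(skewness, mean chi-square) = {tau:.2f}, p = {p:.4f}")
```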
Procedia PDF Downloads 420
5992 Pairwise Relative Primality of Integers and Independent Sets of Graphs
Authors: Jerry Hu
Abstract:
Let G = (V, E) with V = {1, 2, ..., k} be a graph. The k positive integers a₁, a₂, ..., aₖ are G-wise relatively prime if (aᵢ, aⱼ) = 1 for every {i, j} ∈ E. We use an inductive approach to give an asymptotic formula for the number of k-tuples of integers that are G-wise relatively prime. An exact formula is obtained for the probability that k positive integers are G-wise relatively prime. As a corollary, we also provide an exact formula for the probability that k positive integers have exactly r relatively prime pairs.
Keywords: graph, independent set, G-wise relatively prime, probability
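A quick Monte Carlo check of the notion (ours; the paper derives exact formulas, whereas this only estimates the probability empirically):

```python
import math
import random

def gwise_coprime_prob(edges, k, trials=100_000, N=10**6):
    """Monte Carlo estimate of P(k random integers are G-wise relatively prime):
    gcd(a_i, a_j) = 1 is required only for the graph's edges."""
    hits = 0
    for _ in range(trials):
        a = [random.randint(1, N) for _ in range(k)]
        if all(math.gcd(a[i], a[j]) == 1 for i, j in edges):
            hits += 1
    return hits / trials

# path graph 1-2-3 versus the classic pairwise case (triangle, ~0.2867);
# a single edge recovers the familiar 6/pi^2 ~ 0.6079
print("path:    ", gwise_coprime_prob([(0, 1), (1, 2)], k=3))
print("triangle:", gwise_coprime_prob([(0, 1), (1, 2), (0, 2)], k=3))
```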
Procedia PDF Downloads 92
5991 Statistical Analysis of Cables in Long-Span Cable-Stayed Bridges
Authors: Ceshi Sun, Yueyu Zhao, Yaobing Zhao, Zhiqiang Wang, Jian Peng, Pengxin Guo
Abstract:
With the rapid development of transportation, there are more than 100 cable-stayed bridges with main spans larger than 300 m in China. In order to ascertain the statistical relationships among the design parameters of stay cables and their distribution characteristics, 1500 cables were selected from 25 existing long-span cable-stayed bridges. A new relationship between the first-order frequency and the length of the cable was found by curve fitting; based on this relationship, other interesting relationships were deduced. Several probability density functions (PDFs) were used to investigate the distributions of the first-order frequency, the stress level, and the Irvine parameter. It was found that these parameters obey the lognormal, Weibull, and generalized Pareto distributions, respectively. Scatter diagrams of the three parameters were plotted, and their 95% confidence intervals were also investigated.
Keywords: cable, cable-stayed bridge, long-span, statistical analysis
Procedia PDF Downloads 633
5990 Decision Making, Reward Processing and Response Selection
Authors: Benmansour Nassima, Benmansour Souheyla
Abstract:
The appropriate integration of reward processing and decision making provided by the environment is vital for behavioural success and individuals' well-being in everyday life. Functional neurological investigation has already provided a comprehensive picture of affective and emotional (motivational) processing in the healthy human brain and has recently also focused on assessing brain function in anxious and depressed individuals. This article offers an overview of the theoretical approaches that relate emotion and decision-making, and spotlights investigations with anxious or depressed individuals that reveal how emotions can interfere with decision-making. This research aims at incorporating the emotional structure, based on response and stimulation, into a Bayesian approach to decision-making in terms of probability and value processing. It seeks to show how studies of individuals with emotional dysfunctions bear out that alterations of decision-making can be considered in terms of altered probability and value computation. The ultimate objective is to determine critically whether the probabilistic representation of belief could be a fruitful approach for scrutinizing alterations in probability and value representation in subjects with anxiety and depression, and to outline the general implications of this approach.
Keywords: decision-making, motivation, alteration, reward processing, response selection
Procedia PDF Downloads 477
5989 Personality Traits, Probability of Marital Infidelity and Risk of Divorce
Authors: Bahareh Zare
Abstract:
The investment model of dating infidelity maintains that loyalty is an essential force within romantic relationships: loyalty signifies both the motivation and the psychological attachment to maintain a relationship. This study examined the relationship between the Big Five personality factors (extraversion, neuroticism, openness, conscientiousness, and agreeableness), the probability of marital infidelity, and the risk of divorce. The participants completed the NEO-FFI and the INFQ (infidelity questionnaire) and were interviewed using the OHI (Oral History Interview). The results demonstrated that the extraversion and agreeableness traits were significant predictors of both the probability of infidelity and the risk of divorce. In addition, conscientiousness predicted the probability of infidelity, while neuroticism predicted the risk of divorce.
Keywords: five-factor personality, infidelity, risk of divorce, investment theory
Procedia PDF Downloads 93
5988 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
Fatigue crack growth is stochastic because fatigue behavior involves uncertainty and randomness. It is therefore necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life, both for structural maintenance and for reliability estimation. The essential purpose of this study is to present the probability distribution that best fits the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments were carried out in laboratory air under three specimen thickness conditions using AZ31 to investigate stochastic crack growth behavior. The goodness of fit of candidate probability distributions for the grown crack size under the different specimen thickness conditions is assessed with the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
Keywords: crack size, fatigue crack propagation, magnesium alloys, probability distribution, specimen thickness
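A hedged sketch of the distribution-fitting step, on synthetic crack sizes (a lognormal candidate is checked via the Anderson-Darling normality test applied to the log data, since scipy's anderson does not support the lognormal directly):

```python
import numpy as np
from scipy.stats import anderson, lognorm

rng = np.random.default_rng(3)
# synthetic grown-crack sizes (mm) for two thickness conditions, for illustration
crack_thin = lognorm.rvs(s=0.20, scale=2.0, size=60, random_state=rng)
crack_thick = lognorm.rvs(s=0.35, scale=2.2, size=60, random_state=rng)

for label, sample in (("thin", crack_thin), ("thick", crack_thick)):
    # lognormal fit is accepted iff log(data) passes the normality AD test
    res = anderson(np.log(sample), dist="norm")
    print(label, "A2 =", round(res.statistic, 3),
          "5% critical value =", res.critical_values[2])
```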
Procedia PDF Downloads 499
5987 Throughput of Point Coordination Function (PCF)
Authors: Faisel Eltuhami Alzaalik, Omar Imhemed Alramli, Ahmed Mohamed Elaieb
Abstract:
IEEE 802.11 defines two MAC modes: the distributed coordination function (DCF) and the point coordination function (PCF). The first sub-layer of the MAC is the DCF, in which a contention algorithm is used to provide access for all traffic. The PCF is the second sub-layer, used to provide contention-free service; it sits above the DCF and uses DCF features to establish guaranteed access for its users. This paper reviews published work on this technology and briefly discusses the DCF. The PCF was simulated using the network simulator (NS2), and the throughput of a transmitter system using this function was determined.
Keywords: DCF, PCF, throughput, NS2
Procedia PDF Downloads 577
5986 Daily Probability Model of Storm Events in Peninsular Malaysia
Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain
Abstract:
Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset, and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms (short, intermediate, long, and very long) are introduced based on the length of storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and by applying linear regression to the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia experience a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.
Keywords: daily probability model, monsoon seasons, regions, storm events
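A minimal sketch of the fitting step described above: least-squares regression of a 0/1 daily storm indicator (Bernoulli outcomes) on the first Fourier harmonic. The synthetic indicator below is ours, shaped to peak near year end:

```python
import numpy as np

def fit_daily_storm_probability(occurred):
    """Fit p(t) = a0 + a1*cos(2*pi*t/365.25) + b1*sin(2*pi*t/365.25) by least
    squares, where `occurred` is a daily 0/1 storm-occurrence indicator."""
    t = np.arange(len(occurred))
    w = 2 * np.pi * t / 365.25
    X = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w)])
    coef, *_ = np.linalg.lstsq(X, occurred, rcond=None)
    return coef, np.clip(X @ coef, 0, 1)

rng = np.random.default_rng(5)
t = np.arange(3 * 365)
true_p = 0.3 + 0.2 * np.cos(2 * np.pi * (t - 350) / 365.25)  # monsoon-like peak
occurred = (rng.random(t.size) < true_p).astype(float)
coef, p_hat = fit_daily_storm_probability(occurred)
print("fitted harmonic coefficients:", np.round(coef, 3))
```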
Procedia PDF Downloads 343
5985 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy
Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu
Abstract:
The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root-mean-square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and by identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since either can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors and to ascertain accurately the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and the wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis
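A hedged sketch of the Laplace trend test used above, in its standard time-truncated form (the failure epochs below are invented):

```python
import numpy as np
from scipy.stats import norm

def laplace_trend(failure_times, T):
    """Centered Laplace statistic for event times on (0, T].
    U ~ N(0,1) under a homogeneous Poisson process; U < 0 suggests
    reliability growth (infant mortality dying out), U > 0 wear-out."""
    n = len(failure_times)
    U = (np.mean(failure_times) - T / 2) / (T * np.sqrt(1 / (12 * n)))
    return U, 2 * norm.sf(abs(U))

# failure epochs (months) concentrated early -> improving trend
times = np.array([1.2, 2.0, 2.8, 4.5, 7.0, 11.0, 19.0, 30.0])
U, p = laplace_trend(times, T=36.0)
print(f"U = {U:.2f}, two-sided p = {p:.3f}")
```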
Procedia PDF Downloads 65
5984 Comparison of Receiver Operating Characteristic Curve Smoothing Methods
Authors: D. Sigirli
Abstract:
The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal results, which aim to predict the probability of the presence or absence of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values that discriminate the subjects as diseased or non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, a non-parametric estimator of the ROC curve, is robust and represents the data accurately. However, especially for small sample sizes, it suffers from variability, and since it is a step function, there can be different false positive rates for a single true positive rate value and vice versa. Moreover, because the estimated ROC curve is jagged while the true ROC curve is assumed smooth, the empirical curve underestimates the true one. For this reason, several smoothing methods have been explored. These include kernel estimates, log-concave densities, maximum-likelihood fitting of a specified univariate density to the data, and smooth versions of the empirical distribution functions. In the present paper, we propose a smooth ROC curve estimate based on a boundary-corrected kernel function and compare the performance of ROC curve smoothing methods for diagnostic test results coming from different distributions and different sample sizes. We performed a simulation study with 1000 repetitions to compare the performance of the different methods under various scenarios. The performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than that of the binormal model when the underlying samples were in fact generated from the normal distribution.
Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve
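A minimal sketch of kernel-based ROC smoothing (a plain Gaussian kernel with Silverman bandwidths, without the boundary correction the paper proposes; the data are synthetic):

```python
import numpy as np
from scipy.stats import norm

def smooth_roc(neg, pos, bandwidth=None):
    """Kernel-smoothed ROC: replace each empirical CDF with a Gaussian-kernel
    CDF estimate, then trace TPR against FPR over a grid of thresholds."""
    def kcdf(sample, grid, h):
        return norm.cdf((grid[:, None] - sample[None, :]) / h).mean(axis=1)
    h_neg = bandwidth or 1.06 * neg.std() * len(neg) ** -0.2  # Silverman's rule
    h_pos = bandwidth or 1.06 * pos.std() * len(pos) ** -0.2
    grid = np.linspace(min(neg.min(), pos.min()) - 3,
                       max(neg.max(), pos.max()) + 3, 500)
    fpr = 1 - kcdf(neg, grid, h_neg)
    tpr = 1 - kcdf(pos, grid, h_pos)
    return fpr, tpr

rng = np.random.default_rng(2)
neg, pos = rng.normal(0, 1, 40), rng.normal(1.2, 1, 40)
fpr, tpr = smooth_roc(neg, pos)
f, t = fpr[::-1], tpr[::-1]                      # ascending fpr
auc = np.sum(np.diff(f) * (t[1:] + t[:-1]) / 2)  # trapezoidal AUC
print(f"smoothed AUC ~ {auc:.3f}")
```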
Procedia PDF Downloads 152
5983 On Coverage Probability of Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Authors: Suparat Niwitpong, Sa-aat Niwitpong
Abstract:
Statistical inference for the normal mean with a known coefficient of variation has been investigated recently. This situation occurs naturally in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment in advance. In this paper, we construct new confidence intervals for the normal population mean with known coefficient of variation. We also derive analytic expressions for the coverage probability of each confidence interval. To confirm the theoretical results, Monte Carlo simulation is used to assess the performance of these intervals based on their coverage probabilities.
Keywords: confidence interval, coverage probability, expected length, known coefficient of variation
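The paper's intervals are not reproduced here, but a hedged sketch of the Monte Carlo coverage check is straightforward for a naive interval that plugs the known coefficient of variation in place of the unknown sigma:

```python
import numpy as np
from scipy.stats import norm

def coverage(mu, tau, n, conf=0.95, reps=100_000, seed=0):
    """Empirical coverage of the interval  xbar +/- z * tau*|xbar|/sqrt(n),
    which substitutes the known coefficient of variation tau = sigma/mu
    for the unknown sigma in the usual z-interval."""
    rng = np.random.default_rng(seed)
    z = norm.ppf(0.5 + conf / 2)
    # draw xbar directly from its sampling distribution N(mu, (tau*mu)^2/n)
    xbar = rng.normal(mu, tau * mu / np.sqrt(n), size=reps)
    half = z * tau * np.abs(xbar) / np.sqrt(n)
    return np.mean((xbar - half <= mu) & (mu <= xbar + half))

for n in (5, 10, 30, 100):
    print(n, f"{coverage(mu=10.0, tau=0.3, n=n):.4f}")
```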
Procedia PDF Downloads 392
5982 Time-Dependent Reliability Analysis of Corrosion Affected Cast Iron Pipes with Mixed Mode Fracture
Authors: Chun-Qing Li, Guoyang Fu, Wei Yang
Abstract:
A significant portion of current water networks is made of cast iron pipes. Due to aging and deterioration, with corrosion being the most predominant mechanism, the failure rate of cast iron pipes is very high. Although considerable research has been carried out in the past few decades, most of it concerns the effect of corrosion on the structural capacity of pipes, using strength theory as the failure criterion. This paper presents a reliability-based methodology for assessing cracking failures of corrosion-affected cast iron pipes. A nonlinear limit state function taking into account all three fracture modes is proposed for brittle metal pipes with mixed-mode fracture. A stochastic model of the load effect is developed, and a time-dependent reliability method is employed to quantify the probability of failure and predict the remaining service life. A case study is carried out using the proposed methodology, followed by a sensitivity analysis to investigate the effects of the random variables on the probability of failure. It has been found that the larger the inclination angle or the Mode I fracture toughness, the smaller the probability of pipe failure. It has also been found that the multiplying and exponential coefficients k and n in the power-law corrosion model and the internal pressure have the most influence on the probability of failure of cast iron pipes. The methodology presented in this paper can assist pipe engineers and asset managers in developing a risk-informed and cost-effective strategy for better management of corrosion-affected pipelines.
Keywords: corrosion, inclined surface cracks, pressurized cast iron pipes, stress intensity
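As an illustration of time-dependent reliability with a power-law corrosion model d(t) = k·tⁿ (using a much simpler thin-wall hoop-stress limit state than the paper's mixed-mode fracture criterion; every parameter value below is invented):

```python
import numpy as np

rng = np.random.default_rng(4)

def failure_probability(years, n_sim=100_000):
    """MC estimate of P(failure) over time as corrosion thins the pipe wall."""
    wall = rng.normal(12.0, 0.6, n_sim)              # mm, initial thickness
    k = rng.lognormal(np.log(0.3), 0.3, n_sim)       # mm / year^n
    n = rng.normal(0.6, 0.05, n_sim)                 # corrosion exponent
    capacity = rng.normal(120.0, 20.0, n_sim)        # MPa, material resistance
    p, D = 5.0, 300.0                                # MPa pressure, mm diameter
    pf = []
    for t in years:
        remaining = np.clip(wall - k * t**n, 1e-6, None)
        hoop = p * D / (2 * remaining)               # thin-wall hoop stress
        pf.append(np.mean(hoop >= capacity))
    return np.array(pf)

years = np.arange(0, 101, 10)
print(dict(zip(years.tolist(), failure_probability(years).round(4))))
```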
Procedia PDF Downloads 321
5981 Effect of Load Ratio on Probability Distribution of Fatigue Crack Propagation Life in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
It is necessary to predict the fatigue crack propagation life in order to estimate structural integrity. Because of the uncertainty and randomness of structural behavior, it is also necessary to analyze the stochastic characteristics of the fatigue crack propagation life at a specified fatigue crack size. The essential purpose of this study is to present the probability distribution that best fits the fatigue crack propagation life at a specified fatigue crack size in magnesium alloys under various fatigue load ratio conditions. To investigate stochastic crack growth behavior, fatigue crack propagation experiments were performed in laboratory air under several fatigue load ratio conditions using AZ31. A goodness-of-fit test for the probability distribution of the fatigue crack propagation life was performed using the Anderson-Darling test, and the best-fitting probability distribution is presented. The effect of the load ratio on the variability of the fatigue crack propagation life is also investigated.
Keywords: fatigue crack propagation life, load ratio, magnesium alloys, probability distribution
Procedia PDF Downloads 649
5980 Stochastic Repair and Replacement with a Single Repair Channel
Authors: Mohammed A. Hajeeh
Abstract:
This paper examines the behavior of a system which, upon failure, is either replaced with a certain probability p or imperfectly repaired with probability q. The system is analyzed using the method of Kolmogorov's forward equations; the analytical expression for the steady-state availability is derived as an indicator of the system's performance. It is found that the analysis becomes more complex as the number of imperfect repairs increases. It is also observed that the availability increases as the number of states and the replacement probability increase. Using such an approach in more complex configurations and in dynamic systems is cumbersome; therefore, it is advisable to resort to simulation or heuristics. An example is provided for demonstration.
Keywords: repairable models, imperfect repair, availability, exponential distribution
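A hedged numerical sketch of the same idea (a small continuous-time Markov chain where each repair completion is a replacement with probability p or an imperfect repair with probability q that raises the failure rate; all rates are invented), solving the stationary equations πQ = 0 instead of deriving the closed form:

```python
import numpy as np

# states: U0..Um (up, after i imperfect repairs) followed by D0..Dm (down)
m, lam0, theta, mu, p = 3, 0.1, 0.5, 1.0, 0.4
q = 1 - p
n = m + 1
Q = np.zeros((2 * n, 2 * n))
for i in range(n):
    lam = lam0 * (1 + theta) ** i        # failure rate grows with each imperfect repair
    Q[i, n + i] += lam
    Q[i, i] -= lam                       # U_i -> D_i
    Q[n + i, 0] += p * mu                # completion as replacement -> U_0
    Q[n + i, min(i + 1, m)] += q * mu    # imperfect repair -> U_{i+1}
    Q[n + i, n + i] -= mu

# stationary distribution: pi Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(2 * n)])
b = np.append(np.zeros(2 * n), 1.0)
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print("steady-state availability:", pi[:n].sum())
```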
Procedia PDF Downloads 287
5979 Probabilistic Health Risk Assessment of Polycyclic Aromatic Hydrocarbons in Repeatedly Used Edible Oils and Finger Foods
Authors: Suraj Sam Issaka, Anita Asamoah, Abass Gibrilla, Joseph Richmond Fianko
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are a group of organic compounds that can form in edible oils during repeated frying and accumulate in fried foods. This study assesses the probability of health risks (carcinogenic and non-carcinogenic) due to PAH levels in popular finger foods (bean cakes, plantain chips, doughnuts) fried in edible oils (mixed vegetable, sunflower, soybean) from the Ghanaian market. Employing a probabilistic health risk assessment that considers variability and uncertainty in exposure and risk estimates provides a more realistic representation of potential health risks. Monte Carlo simulations with 10,000 iterations were used to estimate carcinogenic, mutagenic, and non-carcinogenic risks for different age groups (A: 6-10 years, B: 11-20 years, C: 20-70 years), food types (bean cake, plantain chips, doughnut), oil types (soybean, mixed vegetable, sunflower), and oil re-use frequencies (once, twice, three times). Our results suggest that, for age group A, doughnuts posed the highest probability of carcinogenic risk exceeding the acceptable threshold (91.55%), followed by bean cakes (43.87%) and plantain chips (7.72%), as well as the highest probability of unacceptable mutagenic risk (89.2%), followed by bean cakes (40.32%). For age group B, doughnuts again had the highest probability of exceeding the carcinogenic risk limit (51.16%) and the mutagenic risk limit (44.27%), while plantain chips exhibited the highest maximum carcinogenic risk. For adults in age group C, bean cakes had the highest probability of unacceptable carcinogenic (50.88%) and mutagenic (46.44%) risks, though plantain chips showed the highest maximum values for both carcinogenic and mutagenic risks in this age group. Regarding non-carcinogenic risks across the age groups, children in age group A who consumed doughnuts had a 68.16% probability of a hazard quotient (HQ) greater than 1, suggesting potential cognitive impairment and lower IQ scores due to early PAH exposure; this group also faced risks from consuming plantain chips and bean cakes. For age group B, the consumption of plantain chips was associated with a 36.98% probability of HQ greater than 1, indicating a potential risk of reduced lung function. In age group C, the consumption of plantain chips was linked to a 35.70% probability of HQ greater than 1, suggesting a potential risk of cardiovascular diseases.
Keywords: PAHs, fried foods, carcinogenic risk, non-carcinogenic risk, Monte Carlo simulations
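A hedged sketch of the kind of incremental lifetime cancer risk (ILCR) Monte Carlo calculation described above. The exposure distributions below are invented, not the study's measurements; the oral slope factor 7.3 (mg/kg/day)⁻¹ is the standard USEPA value for benzo[a]pyrene:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

conc = rng.lognormal(np.log(50.0), 0.8, n)   # ug BaP-eq per kg food (assumed)
intake = rng.normal(0.08, 0.02, n).clip(0)   # kg food per day (assumed)
bw = rng.normal(25.0, 4.0, n)                # kg body weight, child group (assumed)
ed, ef, at = 5, 350, 70 * 365                # exposure yrs, days/yr, averaging days
sf = 7.3                                     # (mg/kg/day)^-1, BaP oral slope factor

# chronic daily intake in mg/kg/day, then incremental lifetime cancer risk
cdi = conc * 1e-3 * intake * ef * ed / (bw * at)
ilcr = cdi * sf
print("median ILCR:      ", np.median(ilcr))
print("P(ILCR > 1e-4):   ", np.mean(ilcr > 1e-4))
```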
Procedia PDF Downloads 12
5978 Constructions of Linear and Robust Codes Based on Wavelet Decompositions
Authors: Alla Levina, Sergey Taranov
Abstract:
The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient algorithms for encoding and decoding information, but these codes concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors with a predetermined probability. This is accomplished by using perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; applications include cleaning signals of noise, data compression, and spectral analysis of signal components. This article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: in the first class, the redundancy is based on the multiplicative inverse in a finite field; in the second, the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability
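A toy sketch of the first robust-code idea: redundancy equal to the multiplicative inverse of the information part, here over GF(2⁴) with the primitive polynomial x⁴ + x + 1 (the field size and the handling of the zero word are our illustrative choices, not the paper's construction):

```python
def gf_mul(a, b, poly=0b10011, m=4):
    """Multiply in GF(2^4), reducing by the primitive polynomial x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << m):
            a ^= poly
        b >>= 1
    return r

def gf_inv(a):
    """Multiplicative inverse in GF(2^4): a^14 = a^(-1), since a^15 = 1."""
    r = 1
    for _ in range(14):
        r = gf_mul(r, a)
    return r

def encode(x):
    """Robust codeword (x, x^{-1}): the redundancy is a nonlinear function of
    the information part, so no error pattern is masked for all codewords."""
    return (x, gf_inv(x)) if x else (0, 0)   # zero word handled separately

x = 9
cw = encode(x)
print(cw, "valid:", gf_mul(*cw) == 1)
```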
Procedia PDF Downloads 489
5977 Conservativeness of Probabilistic Constrained Optimal Control Method for Unknown Probability Distribution
Authors: Tomoaki Hashimoto
Abstract:
In recent decades, probabilistic constrained optimal control problems have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in an optimization problem, several tractable methods have been proposed to handle them. In most methods, probabilistic constraints are reduced to deterministic constraints that are tractable in an optimization problem. However, there is a gap between the transformed deterministic constraints obtained for a known versus an unknown probability distribution. This paper examines the conservativeness of the probabilistic constrained optimization method when the probability distribution is unknown. The objective of this paper is to provide a quantitative assessment of the conservatism of the tractable constraints in probabilistic constrained optimization with an unknown probability distribution.
Keywords: optimal control, stochastic systems, discrete time systems, probabilistic constraints
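The gap in question can be illustrated with a single scalar chance constraint (our example, not the paper's formulation): if the disturbance is known to be Gaussian, the constraint tightening uses the exact quantile; if only the mean and variance are known, a distribution-free one-sided Chebyshev (Cantelli) bound must be used, and the required margin grows:

```python
import numpy as np
from scipy.stats import norm

# chance constraint P(w <= c) >= 1 - eps, zero-mean disturbance w, std sigma
eps, sigma = 0.05, 1.0
known = norm.ppf(1 - eps) * sigma            # distribution known (Gaussian quantile)
unknown = sigma * np.sqrt((1 - eps) / eps)   # only mean/variance known (Cantelli)
print(f"required margin, known distribution:   {known:.2f}")
print(f"required margin, unknown distribution: {unknown:.2f} "
      f"({unknown / known:.1f}x more conservative)")
```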
Procedia PDF Downloads 580
5976 The Falling Point of Lubricant
Authors: Arafat Husain
Abstract:
Lubricants are among the most used resources in today's world, and many countries depend on them to function. To verify that lubricants are not adulterated, efficient methods are needed to determine which fluid has been added to the lubricant. To detect such malpractice, we propose the following method: an elastic ball is thrown with a fixed force at a probability circle submerged in the lubricant, and the pitching distance and the point of fall are observed. The ratio of the falling distance to the pitching distance is then taken: if the measured ratio is greater than one, the fluid is less viscous, and if it is less than one, the lubricant is more viscous. The falling point of the pure lubricant is first measured at the fixed force; every pure lubricant has a fixed falling point. The lubricant is then adulterated and the falling point noted: if the falling point is less than the standard value, the adulterant is a solid, and if the adulterant is a liquid, the falling point will be greater than the standard value. Hence, comparison with the standard falling point gives the quality of the lubricant.
Keywords: falling point of lubricant, falling point ratios, probability circle, octane number
Procedia PDF Downloads 495
5975 Chlorine Pretreatment Effect on Mechanical Properties of Optical Fiber Glass
Authors: Abhinav Srivastava, Hima Harode, Chandan Kumar Saha
Abstract:
The principal ingredient of an optical fiber is quartz glass. The quality of the optical fiber decreases if impure foreign substances become attached to the preform surface. If the residual strain inside a preform is significant, the preform cracks under a small impact during drawing or transport. Furthermore, damage and unevenness on the surface of an optical fiber base material can break the fiber during drawing. The present work shows that chlorine pretreatment enhances the mechanical properties of optical fiber glass. FTIR (Fourier-transform infrared spectroscopy) results show that chlorine gas chemically modifies the structure of the silica cladding; chlorine is known to soften glass. Metallic impurities on the preform surface likely formed volatile metal chlorides during the chlorine pretreatment at elevated temperature. Chlorine also acts as a drying agent, so the treated preform surface is expected to be water-deficient, which supposedly prevents particle adhesion on the glass surface. Weibull analysis of the long-length tensile strength shows a substantial shift in the knee of the distribution. The higher dynamic fatigue n-value also indicates surface crack healing.
Keywords: mechanical strength, optical fiber glass, FTIR, Weibull analysis
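A minimal sketch of the Weibull strength analysis referenced above, using median-rank regression on synthetic tensile strengths (all values are illustrative, not measured fiber data):

```python
import numpy as np

def weibull_fit(strengths):
    """Median-rank (Benard) regression on the Weibull plot:
    ln(-ln(1 - F)) vs ln(strength); the slope is the Weibull modulus m."""
    x = np.sort(strengths)
    n = len(x)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Benard's approximation
    X = np.log(x)
    Y = np.log(-np.log(1 - F))
    m, c = np.polyfit(X, Y, 1)
    return m, np.exp(-c / m)                      # modulus, 63.2% characteristic value

rng = np.random.default_rng(8)
gpa = 5.0 * rng.weibull(20.0, 30)                 # synthetic tensile strengths, GPa
m, eta = weibull_fit(gpa)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {eta:.2f} GPa")
```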
Procedia PDF Downloads 176