Search results for: probabilistic sampling

3062 Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data

Authors: Maysam Shahsavari, Seyed Jamalaldin Haddadi

Abstract:

There are several methods to localize a mobile robot, such as relative, absolute, and probabilistic methods. In this paper, the particle filter is used because of its simple implementation and because it does not need to know the starting position. This method estimates the position of the mobile robot using a probability distribution, relying on a known map of the environment instead of predicting it. Afterwards, it updates this estimate by reading input sensors and control commands. To receive information from the surrounding world, such as the distance to obstacles, a Kinect is used, which is much cheaper than a laser range finder. Finally, after explaining the Adaptive Particle Filter method and its implementation in detail, we compare this method with the dead reckoning method and show that it is much more suitable for situations in which a map of the environment is available.
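
As a rough illustration of the Monte Carlo localization loop described above (not the authors' implementation), the sketch below performs one predict-update-resample cycle. The map lookup function, the noise levels, and the omission of the adaptive (KLD) particle-count adjustment are all assumptions made for the example.

```python
import numpy as np

def particle_filter_step(particles, weights, control, measurement, map_lookup,
                         motion_noise=(0.05, 0.05, 0.02), meas_sigma=0.1):
    """One predict-update-resample cycle of Monte Carlo localization.

    particles: (N, 3) array of [x, y, theta] pose hypotheses
    control:   (dx, dy, dtheta) odometry increment
    measurement: observed distance to the nearest obstacle (e.g., from a Kinect)
    map_lookup: function (x, y, theta) -> expected distance from the known map
    """
    n = len(particles)
    # Predict: apply the control command plus motion noise to every particle.
    particles = particles + control + np.random.normal(0, motion_noise, size=(n, 3))
    # Update: reweight each particle by the likelihood of the measurement.
    expected = np.array([map_lookup(*p) for p in particles])
    weights = weights * np.exp(-0.5 * ((measurement - expected) / meas_sigma) ** 2)
    weights /= weights.sum()
    # Resample: draw particles proportionally to their weights.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

# Toy demo with a made-up map in which the expected range depends only on x.
rng = np.random.default_rng(0)
parts = rng.uniform(0, 5, size=(500, 3))
w = np.full(500, 1 / 500)
parts, w = particle_filter_step(parts, w, control=(0.1, 0.0, 0.0),
                                measurement=2.0, map_lookup=lambda x, y, th: 5.0 - x)
print("pose estimate:", parts.mean(axis=0).round(2))
```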

Keywords: particle filter, localization, methods, odometry, Kinect

Procedia PDF Downloads 240
3061 The Establishment of Probabilistic Risk Assessment Analysis Methodology for Dry Storage Concrete Casks Using SAPHIRE 8

Authors: J. R. Wang, W. Y. Cheng, J. S. Yeh, S. W. Chen, Y. M. Ferng, J. H. Yang, W. S. Hsu, C. Shih

Abstract:

To understand the risk for dry storage concrete casks in the cask loading, transfer, and storage phases, the purpose of this research is to establish a probabilistic risk assessment (PRA) analysis methodology for dry storage concrete casks by using the SAPHIRE 8 code. This analysis methodology is used to study the dry storage systems of Taiwan's nuclear power plants (NPPs). The research process has three steps. First, the data on the concrete casks and Taiwan NPPs are collected. Second, the PRA analysis methodology is developed by using SAPHIRE 8. Third, the PRA analysis is performed by using this methodology. According to the analysis results, the maximum risk is the multipurpose canister (MPC) drop case.
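
The arithmetic behind such a PRA can be illustrated, independently of SAPHIRE, with a minimal cut-set calculation; the basic events, cut sets, and probabilities below are hypothetical placeholders, not values from this study.

```python
# Hypothetical minimal cut sets for a cask-handling drop event; the basic-event
# probabilities are illustrative placeholders only.
basic_events = {"crane_failure": 1e-4, "sling_failure": 5e-5,
                "operator_error": 1e-3, "interlock_failure": 2e-4}

cut_sets = [("crane_failure",),                      # single-event cut set
            ("operator_error", "interlock_failure")]  # requires both to fail

def cut_set_probability(cut_set, p):
    prob = 1.0
    for event in cut_set:
        prob *= p[event]
    return prob

# Rare-event approximation: top-event probability ~ sum of cut-set probabilities.
p_top = sum(cut_set_probability(cs, basic_events) for cs in cut_sets)
print(f"Approximate drop probability per demand: {p_top:.2e}")
```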

Keywords: PRA, dry storage, concrete cask, SAPHIRE

Procedia PDF Downloads 188
3060 Numerical Simulations on Feasibility of Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Taiki Baba, Tomoaki Hashimoto

Abstract:

The random dither quantization method enables us to achieve much better performance than the simple uniform quantization method in the design of quantized control systems. Motivated by this fact, a stochastic model predictive control method, in which a performance index is minimized subject to probabilistic constraints imposed on the state variables of the system, has been proposed for linear feedback control systems with random dither quantization. In other words, a method for solving optimal control problems subject to probabilistic state constraints for linear discrete-time control systems with random dither quantization has already been established. To the best of our knowledge, however, the feasibility of this kind of optimal control problem has not yet been studied. Our objective in this paper is to investigate the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization. To this end, we provide the results of numerical simulations that verify the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization.
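
A short sketch of the random dither quantization idea itself, assuming non-subtractive uniform dither with step size delta (an illustration of the quantizer, not the controller in the paper):

```python
import numpy as np

def uniform_quantize(u, delta):
    """Mid-tread uniform quantizer with step size delta."""
    return delta * np.round(u / delta)

def random_dither_quantize(u, delta, rng):
    """Non-subtractive random dither: add uniform noise on [-delta/2, delta/2)
    before quantizing, which removes the signal-dependent bias of the error."""
    dither = rng.uniform(-delta / 2, delta / 2, size=np.shape(u))
    return uniform_quantize(u + dither, delta)

rng = np.random.default_rng(0)
delta = 0.5
u = np.full(100_000, 0.2)                   # a constant control input, off the grid
err_plain = uniform_quantize(u, delta) - u
err_dither = random_dither_quantize(u, delta, rng) - u
# Without dither the error is a deterministic bias (-0.2 here); with dither its
# mean is ~0, so the quantizer can be modelled as an additive stochastic noise.
print(err_plain.mean().round(3), err_dither.mean().round(4))
```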

Keywords: model predictive control, stochastic systems, probabilistic constraints, random dither quantization

Procedia PDF Downloads 253
3059 Seismic Directionality Effects on In-Structure Response Spectra in Seismic Probabilistic Risk Assessment

Authors: Sittipong Jarernprasert, Enrique Bazan-Zurita, Paul C. Rizzo

Abstract:

Currently, seismic probabilistic risk assessments (SPRA) for nuclear facilities use In-Structure Response Spectra (ISRS) in the calculation of fragilities for systems and components. ISRS are calculated via dynamic analyses of the host building subjected to two orthogonal components of horizontal ground motion. Each component is defined as the median motion in any horizontal direction. Structural engineers apply the components along selected X and Y Cartesian axes, and the ISRS at different locations in the building are also calculated in the X and Y directions. The choice of the X and Y directions is not specified by the ground motion model with respect to geographic coordinates and is rather arbitrarily selected by the structural engineer. Normally, X and Y coincide with the “principal” axes of the building, in the understanding that this practice is generally conservative. For SPRA purposes, however, it is desirable to remove any conservatism in the estimates of median ISRS. This paper examines the effects of the direction of horizontal seismic motion on the ISRS of a typical nuclear structure. We also evaluate the variability of ISRS calculated along different horizontal directions. Our results indicate that some central measures of the ISRS provide robust estimates that are practically independent of the selection of the directions of the horizontal Cartesian axes.
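
A simplified numeric illustration of the directionality question (using synthetic components and peak values rather than full ISRS; all numbers are assumptions): rotate a pair of horizontal components through the azimuths and compare the direction-dependent peaks with a central measure such as the median over azimuth.

```python
import numpy as np

rng = np.random.default_rng(1)
ax = rng.normal(size=2000)          # stand-ins for two recorded horizontal
ay = 0.8 * rng.normal(size=2000)    # acceleration components (arbitrary units)

angles = np.radians(np.arange(0, 180, 1))
peaks = np.array([np.max(np.abs(ax * np.cos(t) + ay * np.sin(t))) for t in angles])

print("min / median / max peak over azimuth:",
      peaks.min().round(3), np.median(peaks).round(3), peaks.max().round(3))
# The median over azimuth (a RotD50-type measure) is far less sensitive to the
# arbitrary choice of the X and Y axes than any single-direction peak.
```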

Keywords: seismic, directionality, in-structure response spectra, probabilistic risk assessment

Procedia PDF Downloads 391
3058 Automated Detection of Related Software Changes by Probabilistic Neural Networks Model

Authors: Yuan Huang, Xiangping Chen, Xiaonan Luo

Abstract:

Modern software is continuously updated. The change between two versions usually involves multiple program entities (e.g., packages, classes, methods, attributes) with multiple purposes (e.g., changed requirements, bug fixing). It is hard for developers to understand which changes are made for the same purpose, and whether two changes are related is not decided by the relationship between these two entities in the program. In this paper, we summarize 4 coupling rules (16 instances) and 4 state-combination types at the class, method, and attribute levels for software changes. A Related Change Vector (RCV) is defined based on the coupling rules and state-combination types and is applied to classify related software changes by using a probabilistic neural network during software updating.
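
A minimal probabilistic neural network (Parzen-window classifier) over made-up related-change vectors; the RCV features, labels, and smoothing parameter below are hypothetical, not the paper's data.

```python
import numpy as np

def pnn_predict(x, X_train, y_train, sigma=0.5):
    """Probabilistic neural network: the class score is the average Gaussian
    kernel between the query vector and the training patterns of that class."""
    scores = {}
    for label in np.unique(y_train):
        members = X_train[y_train == label]
        d2 = np.sum((members - x) ** 2, axis=1)
        scores[label] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Hypothetical RCV features (e.g., coupling-rule hits, state-combination flags).
X = np.array([[1, 0, 1, 0], [1, 1, 1, 0], [0, 0, 0, 1], [0, 1, 0, 1]], float)
y = np.array([1, 1, 0, 0])          # 1 = related change pair, 0 = unrelated
print(pnn_predict(np.array([1, 0, 1, 1], float), X, y))
```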

Keywords: PNN, related change, state-combination, logical coupling, software entity

Procedia PDF Downloads 412
3057 Dynamic Response Analysis of Structure with Random Parameters

Authors: Ahmed Guerine, Ali El Hafidi, Bruno Martin, Philippe Leclaire

Abstract:

In this paper, we propose a method for the dynamic response of multi-storey structures with uncertain-but-bounded parameters. The effectiveness of the proposed method is demonstrated by a numerical example of a three-storey structure. The equation of motion is integrated numerically using Newmark's method, and the numerical results are obtained by the proposed method. The results of the interval analysis method are compared with those of a probabilistic approach. The interval analysis method provides a mean curve that lies between the upper and lower bounds obtained from the probabilistic approach.
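
A compact sketch for a single-degree-of-freedom surrogate of the multi-storey problem: Newmark's average-acceleration method integrates the equation of motion, and an interval (bound) treatment of an uncertain stiffness is compared with a probabilistic (sampled) treatment. The load, parameter values, and uniform sampling are illustrative assumptions.

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, beta=0.25, gamma=0.5):
    """Newmark average-acceleration integration of m*u'' + c*u' + k*u = f(t)."""
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (f[0] - c * v[0] - k * u[0]) / m
    keff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    for i in range(n - 1):
        dp = (f[i + 1] - f[i]
              + (m / (beta * dt) + gamma / beta * c) * v[i]
              + (m / (2 * beta) + dt * (gamma / (2 * beta) - 1) * c) * a[i])
        du = dp / keff
        dv = gamma / (beta * dt) * du - gamma / beta * v[i] + dt * (1 - gamma / (2 * beta)) * a[i]
        da = du / (beta * dt ** 2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u

dt = 0.01
t = np.arange(0, 5, dt)
f = 100 * np.sin(2 * np.pi * 1.5 * t)           # harmonic load (illustrative)
m, c = 1000.0, 200.0
k_lo, k_hi = 0.9e5, 1.1e5                       # uncertain-but-bounded stiffness

# Interval approach: evaluate the peak response at the parameter bounds.
peaks_interval = [np.max(np.abs(newmark_sdof(m, c, k, f, dt))) for k in (k_lo, k_hi)]
# Probabilistic approach: sample the stiffness (here uniformly) and take statistics.
ks = np.random.default_rng(0).uniform(k_lo, k_hi, 200)
peaks_mc = [np.max(np.abs(newmark_sdof(m, c, k, f, dt))) for k in ks]
print(min(peaks_interval), max(peaks_interval), np.mean(peaks_mc))
```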

Keywords: multi-storey structure, dynamic response, interval analysis method, random parameters

Procedia PDF Downloads 162
3056 Temporal Variation of PM10-Bound Benzo(a)Pyrene Concentration in an Urban and a Rural Site of Northwestern Hungary

Authors: Zs. Csanádi, A. Szabó Nagy, J. Szabó, J. Erdős

Abstract:

The main objective of this study was to assess the annual concentration and seasonal variation of benzo(a)pyrene (BaP) associated with PM10 at an urban site in Győr and a rural site in Sarród over the sampling period 2008–2012. A total of 280 PM10 aerosol samples were collected at each sampling site and analyzed for BaP by gas chromatography. The BaP concentrations ranged from undetected to 8 ng/m³ with a mean value of 1.01 ng/m³ at the Győr sampling site, and from undetected to 4.07 ng/m³ with a mean value of 0.52 ng/m³ at the Sarród sampling site. Relatively higher concentrations of BaP were detected in samples collected at both sampling sites during the heating seasons compared with the non-heating periods. The annual mean BaP concentrations were comparable with published data from other Hungarian sites.

Keywords: air quality, benzo(a)pyrene, PAHs, polycyclic aromatic hydrocarbons

Procedia PDF Downloads 367
3055 A Double Acceptance Sampling Plan for Truncated Life Test Having Exponentiated Transmuted Weibull Distribution

Authors: A. D. Abdellatif, A. N. Ahmed, M. E. Abdelaziz

Abstract:

The main purpose of this paper is to design a double acceptance sampling plan under the time truncated life test when the product lifetime follows an exponentiated transmuted Weibull distribution. Here, the motive is to meet both the consumer’s risk and producer’s risk simultaneously at the specified quality levels, while the termination time is specified. A comparison between the results of the double and single acceptance sampling plans is conducted. We demonstrate the applicability of our results to real data sets.
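
The operating-characteristic arithmetic behind a double acceptance sampling plan can be sketched as follows. The failure probability p before the truncation time would, in the paper, come from the exponentiated transmuted Weibull CDF; here it is a placeholder, and the plan parameters (n1, n2, c1, c2) are illustrative assumptions.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def accept_prob_double(p, n1, n2, c1, c2):
    """Lot acceptance probability for a double sampling plan:
    accept if d1 <= c1; take a second sample when c1 < d1 <= c2 and
    accept if d1 + d2 <= c2; otherwise reject."""
    pa = sum(binom_pmf(d1, n1, p) for d1 in range(c1 + 1))
    for d1 in range(c1 + 1, c2 + 1):
        p2 = sum(binom_pmf(d2, n2, p) for d2 in range(c2 - d1 + 1))
        pa += binom_pmf(d1, n1, p) * p2
    return pa

# p = probability an item fails before the truncation time t0 (placeholder values).
for p in (0.02, 0.05, 0.10):
    print(p, round(accept_prob_double(p, n1=20, n2=20, c1=1, c2=3), 4))
```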

Keywords: double sampling plan, single sampling plan, producer’s risk, consumer’s risk, exponentiated transmuted Weibull distribution, time truncated experiment, Marshall-Olkin

Procedia PDF Downloads 456
3054 Probabilistic-Based Design of Bridges under Multiple Hazards: Floods and Earthquakes

Authors: Kuo-Wei Liao, Jessica Gitomarsono

Abstract:

Bridge reliability against natural hazards such as floods or earthquakes is an interdisciplinary problem that involves a wide range of knowledge. Moreover, due to global climate change, engineers have to design structures against multi-hazard threats. Currently, few practical design guidelines have included such a concept. Bridge foundations in Taiwan often do not have a uniform width, yet few studies have focused on the safety evaluation of bridges with complex piers, and investigation of the scouring depth in such situations is very important. Thus, this study first focuses on investigating and improving the scour prediction formula for a bridge with a complicated foundation via experiments and artificial intelligence. Secondly, a probabilistic design procedure using the established prediction formula is proposed for practicing engineers under multi-hazard attacks.

Keywords: bridge, reliability, multi-hazards, scour

Procedia PDF Downloads 345
3053 A Comparative Study on Sampling Techniques of Polynomial Regression Model Based Stochastic Free Vibration of Composite Plates

Authors: S. Dey, T. Mukhopadhyay, S. Adhikari

Abstract:

This paper presents an exhaustive comparative investigation of sampling techniques for polynomial regression model based stochastic natural frequency analysis of composite plates. Both individual and combined variations of input parameters are considered to map the computational time and accuracy of each modelling technique. The finite element formulation of composites is capable of dealing with both correlated and uncorrelated random input variables, such as fibre parameters and material properties. The results obtained by polynomial regression (PR) using different sampling techniques are compared, and the suitability of sampling techniques such as 2^k factorial design, central composite design, A-optimal, I-optimal, and D-optimal designs, Taguchi's orthogonal array design, Box-Behnken design, Latin hypercube sampling, and the Sobol sequence is illustrated. A statistical analysis of the first three natural frequencies is presented to compare the results and the performance of each technique.
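
A minimal example contrasting two of the listed sampling schemes (Latin hypercube versus plain random sampling, assuming scipy >= 1.7 is available) as designs of experiments for a polynomial regression surrogate; the closed-form test function below stands in for the composite-plate finite element model.

```python
import numpy as np
from scipy.stats import qmc

def fe_model(x):
    """Placeholder for the FE solver: a smooth 'natural frequency' response
    over two normalized inputs in [0, 1]^2 (purely illustrative)."""
    return (50 + 10 * x[:, 0] + 6 * x[:, 1] - 4 * x[:, 0] * x[:, 1]
            + 2 * x[:, 0] ** 2 + 1.5 * np.sin(3 * x[:, 0]))

def quadratic_design_matrix(x):
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

def surrogate_rmse(x_train, x_test):
    # Fit a quadratic polynomial regression surrogate on the training design.
    coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(x_train),
                                 fe_model(x_train), rcond=None)
    pred = quadratic_design_matrix(x_test) @ coeffs
    return np.sqrt(np.mean((pred - fe_model(x_test)) ** 2))

rng = np.random.default_rng(0)
x_test = rng.random((500, 2))
x_lhs = qmc.LatinHypercube(d=2, seed=0).random(20)   # space-filling design
x_mc = rng.random((20, 2))                           # plain random design
print("LHS design RMSE:", surrogate_rmse(x_lhs, x_test))
print("MC  design RMSE:", surrogate_rmse(x_mc, x_test))
```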

Keywords: composite plate, natural frequency, polynomial regression model, sampling technique, uncertainty quantification

Procedia PDF Downloads 483
3052 Probabilistic Analysis of Fiber-Reinforced Infinite Slopes

Authors: Assile Abou Diab, Shadi Najjar

Abstract:

Fiber-reinforcement is an effective soil improvement technique for applications involving the prevention of shallow failures on the slope face and the repair of existing slope failures. A typical application is the stabilization of cohesionless infinite slopes. The objective of this paper is to present a probabilistic, reliability-based methodology (based on Monte Carlo simulations) for the design of a practical fiber-reinforced cohesionless infinite slope, taking into consideration the impact of various sources of uncertainty. Recommendations are made regarding the required factors of safety that need to be used to achieve a given target reliability level. These factors of safety could differ from the traditional deterministic factor of safety.
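
A Monte Carlo sketch of the reliability check described above, assuming a simplified infinite-slope factor of safety in which the fibers contribute an equivalent cohesion; the distributions and numbers are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

beta = np.radians(30)                     # slope angle
z, gamma = 1.5, 18.0                      # failure-plane depth (m), unit weight (kN/m3)
phi = np.radians(rng.normal(33, 2, n))            # uncertain friction angle
c_fiber = np.maximum(rng.normal(5, 1.5, n), 0)    # uncertain fiber-induced cohesion (kPa)

# Simplified infinite-slope factor of safety with an equivalent cohesion term.
fs = c_fiber / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

pf = np.mean(fs < 1.0)                    # probability of failure
print(f"P(failure) = {pf:.4f}, mean FS = {fs.mean():.2f}")
```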

Keywords: factor of safety, fiber reinforcement, infinite slope, reliability-based design, uncertainty

Procedia PDF Downloads 338
3051 Efficient High Fidelity Signal Reconstruction Based on Level Crossing Sampling

Authors: Negar Riazifar, Nigel G. Stocks

Abstract:

This paper proposes strategies in level crossing (LC) sampling and reconstruction that provide high-fidelity signal reconstruction for speech signals; these strategies circumvent the problem of an exponentially increasing number of samples as the bit depth is increased and hence are highly efficient. Specifically, the results indicate that the distribution of the intervals between samples is one of the key factors in the quality of signal reconstruction; including samples with short intervals does not improve the accuracy of the signal reconstruction, whilst samples with large intervals lead to numerical instability. The proposed sampling method, termed reduced conventional level crossing (RCLC) sampling, exploits redundancy between samples to improve the efficiency of the sampling without compromising performance. A reconstruction technique is also proposed that enhances the numerical stability through linear interpolation of samples separated by large intervals. Interpolation is demonstrated to improve the accuracy of the signal reconstruction in addition to the numerical stability. We further demonstrate that the RCLC and interpolation methods can give useful levels of signal recovery even if the average sampling rate is less than the Nyquist rate.
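
A small sketch of basic level-crossing sampling followed by linear-interpolation reconstruction, assuming a uniform grid of quantizer levels and a toy two-tone "speech" signal; it illustrates the idea rather than the proposed RCLC scheme.

```python
import numpy as np

def level_crossing_sample(t, x, levels):
    """Record a (time, level) sample whenever the signal crosses a level."""
    ts, xs = [t[0]], [x[0]]
    for i in range(1, len(x)):
        lo, hi = sorted((x[i - 1], x[i]))
        for lv in levels[(levels > lo) & (levels <= hi)]:
            # Linearly locate the crossing instant between the two raw samples.
            frac = (lv - x[i - 1]) / (x[i] - x[i - 1])
            ts.append(t[i - 1] + frac * (t[i] - t[i - 1]))
            xs.append(lv)
    order = np.argsort(ts)
    return np.array(ts)[order], np.array(xs)[order]

fs = 8000.0
t = np.arange(0, 0.05, 1 / fs)
x = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 700 * t)  # toy signal
levels = np.linspace(-1.3, 1.3, 2 ** 4)          # 4-bit level grid (assumption)

ts, xs = level_crossing_sample(t, x, levels)
x_rec = np.interp(t, ts, xs)                     # reconstruction by linear interpolation
snr = 10 * np.log10(np.sum(x ** 2) / np.sum((x - x_rec) ** 2))
print(f"{len(ts)} LC samples, reconstruction SNR ~ {snr:.1f} dB")
```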

Keywords: level crossing sampling, numerical stability, speech processing, trigonometric polynomial

Procedia PDF Downloads 128
3050 Comparative Study of Estimators of Population Means in Two Phase Sampling in the Presence of Non-Response

Authors: Syed Ali Taqi, Muhammad Ismail

Abstract:

A comparative study of estimators of population means in two-phase sampling in the presence of non-response is made for the situation in which the population means of the auxiliary variable(s) are unknown and the information on the study variable y, as well as on the auxiliary variable(s), is incomplete. Three real data sets, on university students, a hospital, and unemployment, are used to compare all the available two-phase sampling techniques in the presence of non-response with the newly proposed generalized ratio estimators.
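
A small numeric sketch of the classical two-phase (double sampling) ratio estimator that such generalized estimators build on; the data are synthetic and the non-response adjustment is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000
x = rng.gamma(4.0, 10.0, N)                 # auxiliary variable (population)
y = 2.5 * x + rng.normal(0, 15, N)          # study variable, correlated with x

# Phase 1: a large sample where only the cheap auxiliary variable is measured.
s1 = rng.choice(N, 800, replace=False)
# Phase 2: a subsample of phase 1 where the study variable is also measured.
s2 = rng.choice(s1, 120, replace=False)

ybar2, xbar2, xbar1 = y[s2].mean(), x[s2].mean(), x[s1].mean()
ratio_estimate = ybar2 * xbar1 / xbar2       # two-phase ratio estimator of mean(y)
print(round(ratio_estimate, 2), "vs true mean", round(y.mean(), 2),
      "vs plain subsample mean", round(ybar2, 2))
```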

Keywords: two-phase sampling, ratio estimator, product estimator, generalized estimators

Procedia PDF Downloads 207
3049 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help to find the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on finding whether or not an oncoming flood will exceed critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods as, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty leads to the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach in flash flood forecasting. A prior probability of flooding is derived based on historical data. Additional information, such as the antecedent moisture condition (AMC) and the rainfall amount over any rainfall threshold, is used in computing the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed using the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach in flash flood forecasting provides more realistic forecasts than the FFG.
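
A hedged sketch of the Bayesian update described above: a prior flooding probability from historical frequency is combined with the likelihoods of the observed AMC class and of rainfall exceeding the threshold, conditioned on flood and no-flood. All numbers are placeholders and the evidence items are treated as independent.

```python
def posterior_flood_probability(prior, likelihood_flood, likelihood_no_flood):
    """Bayes' rule: P(flood | evidence) for independent evidence items."""
    num = prior
    den_no = 1.0 - prior
    for lf, ln in zip(likelihood_flood, likelihood_no_flood):
        num *= lf          # P(evidence item | flood)
        den_no *= ln       # P(evidence item | no flood)
    return num / (num + den_no)

# Placeholder values: prior from historical flood frequency; likelihoods of a wet
# AMC class and of rainfall exceeding the threshold, given flood / no flood.
prior = 0.05
evidence_given_flood = [0.7, 0.8]
evidence_given_no_flood = [0.3, 0.1]

print(round(posterior_flood_probability(prior, evidence_given_flood,
                                         evidence_given_no_flood), 3))
```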

Keywords: flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina

Procedia PDF Downloads 111
3048 Relationship between Functionality and Cognitive Impairment in Older Adult Women from the Southeast of Mexico

Authors: Estrella C. Damaris, Ingrid A. Olais, Gloria P. Uicab

Abstract:

This study explores the relationship between the level of functionality and cognitive impairment in older adult women from the southeast of Mexico. It is a descriptive, cross-sectional study performed with a total of 172 participants who attended a health institute and live in Merida, Yucatan, Mexico. After non-probabilistic sampling, the Barthel and Pfeiffer scales were applied. The results show a statistically significant correlation between cognitive impairment (Pfeiffer) and the levels of independence and function (Barthel) (r = 0.489; p = 0.001). Both indicate a level of dependence, meaning the participants need either a little or a lot of help. Society needs older women to be healthy, and mental health professionals should develop prevention and rehabilitation activities, because cognitive impairment and functionality are directly related to quality of life.

Keywords: functionality, cognition, routine activities, cognitive impairment

Procedia PDF Downloads 263
3047 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess the safety of slopes under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but the treatment of uncertainties is introduced, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is the rainfall in the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by placing a monetary value on the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the measure leading to the optimum (minimum life-cycle cost). For the example, the optimal mitigation measure is a reduction of the slope inclination angle.
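
A Monte Carlo sketch of that formulation: rainfall intensity and duration are exponentially distributed, the failure probability is P(SF < 1), and the expected life-cycle cost is the mitigation cost plus the failure probability times a consequence cost. The SF model, distributions, costs, and mitigation alternatives are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

def expected_life_cycle_cost(slope_angle_deg, mitigation_cost,
                             failure_cost=5e6, storms=20):
    """E[LCC] = mitigation cost + P(at least one failure in life) * failure cost."""
    intensity = rng.exponential(20.0, n)     # storm rainfall intensity (mm/h)
    duration = rng.exponential(6.0, n)       # storm duration (h)
    rainfall = intensity * duration          # total storm rainfall (mm)
    # Illustrative SF model: a dry safety factor that decreases with slope angle,
    # degraded by infiltrated rainfall.
    sf_dry = 3.0 - slope_angle_deg / 22.5
    sf = sf_dry * np.exp(-rainfall / 1500.0)
    pf_storm = np.mean(sf < 1.0)             # P(SF < 1) per major storm
    pf_life = 1.0 - (1.0 - pf_storm) ** storms
    return mitigation_cost + pf_life * failure_cost

for angle, cost in [(40, 0.0), (35, 2e5), (30, 6e5)]:   # mitigation alternatives
    print(f"{angle} deg: E[LCC] ~ {expected_life_cycle_cost(angle, cost):,.0f}")
```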

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 132
3046 Estimation of Population Mean under Random Non-Response in Two-Phase Successive Sampling

Authors: M. Khalid, G. N. Singh

Abstract:

In this paper, we have considered the problem of estimating the population mean on the current (second) occasion in the presence of random non-response in two-occasion successive sampling under a two-phase set-up. Modified exponential type estimators have been proposed, and their properties are studied under the assumption that the number of sampling units follows a distribution arising from random non-response. The performance of the proposed estimators is compared with linear combinations of two estimators: (a) the sample mean estimator for the fresh sample and (b) the ratio estimator for the matched sample under complete response. Results are demonstrated through empirical studies, which show the effectiveness of the proposed estimators. Suitable recommendations have been made to survey practitioners.

Keywords: successive sampling, random non-response, auxiliary variable, bias, mean square error

Procedia PDF Downloads 493
3045 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants for tracking changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized the spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. Location information on restaurants provides an opportunity for quantitatively defining random sampling within non-governmental units (e.g., 240 kilometers around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytic and processing platform helped us handle the spatial random sampling challenges. Our method fits the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
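
A small sketch of the spatial constraint described: keep only establishments within a given radius (the 240 km mentioned above) of a data collector, then draw a simple random sample from that subset. The haversine distance is used and the coordinates are made up.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

rng = np.random.default_rng(5)
# Made-up restaurant coordinates and one data collector's home base.
restaurants = np.column_stack([rng.uniform(35, 45, 500), rng.uniform(-90, -75, 500)])
collector = (40.0, -83.0)

d = haversine_km(restaurants[:, 0], restaurants[:, 1], *collector)
eligible = np.nonzero(d <= 240.0)[0]            # within 240 km of the collector
sample = rng.choice(eligible, size=min(30, len(eligible)), replace=False)
print(f"{len(eligible)} eligible establishments, sampled {len(sample)}")
```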

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 324
3044 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds

Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine

Abstract:

The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities that were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables allowing the establishment of the probabilistic graph representing the different causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low pressure levels were read from probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), covering 33 macroinvertebrate taxa present in all seasons and at all sampling points, and measurements of 14 environmental parameters were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics, as well as of the reference thresholds, for ecological assessment despite the small sample size suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data.

Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits

Procedia PDF Downloads 58
3043 Sampling and Characterization of Fines Created during the Shredding of Non Hazardous Waste

Authors: Soukaina Oujana, Peggy Zwolinski

Abstract:

Fines are heterogeneous residues created during the shredding of non-hazardous waste. They are one of the most challenging issues faced by recyclers because they are at present considered non-sortable and non-reusable mixtures destined for landfill. However, fines contain a large amount of recoverable materials that could be recycled or reused for the production of solid recovered fuel. This research is conducted within a project named ValoRABES. The aim is to characterize fines and establish a suitable sorting process in order to extract the materials contained in the mixture and define suitable recovery paths for them. This paper highlights the importance of good sampling and proposes a sampling methodology for the characterization of fines. First results of the characterization are also presented.

Keywords: fines, non-hazardous waste, recovery, shredding residues, waste characterization, waste sampling

Procedia PDF Downloads 169
3042 Rational Probabilistic Method for Calculating Thermal Cracking Risk of Mass Concrete Structures

Authors: Naoyuki Sugihashi, Toshiharu Kishi

Abstract:

The probability of occurrence of thermal cracks in mass concrete in Japan is evaluated with the cracking probability diagram, which represents the relationship between the thermal cracking index and the probability of occurrence of cracks in the actual structure. In this paper, we propose a method to calculate the cracking probability directly, following probabilistic theory, by modeling the variance of the tensile stress and tensile strength. In this method, the relationship between the variance of tensile stress and tensile strength, the thermal cracking index, and the cracking probability is formulated and presented. In addition, the standard deviations of the tensile stress and tensile strength were identified, and the method of calculating the cracking probability in a generally controlled construction environment was also demonstrated.
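
The kind of relationship formulated above can be illustrated with the classical stress-strength model: if the tensile stress S and tensile strength R are treated as independent normal variables, the cracking probability is P(R - S < 0) = Phi(-beta) with beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), and the thermal cracking index corresponds to the ratio mu_R / mu_S. The sketch below uses illustrative means and coefficients of variation, not the paper's identified values.

```python
from math import sqrt
from statistics import NormalDist

def cracking_probability(mean_strength, cov_strength, mean_stress, cov_stress):
    """P(crack) = P(strength - stress < 0) for independent normal variables."""
    sd_r = mean_strength * cov_strength
    sd_s = mean_stress * cov_stress
    beta = (mean_strength - mean_stress) / sqrt(sd_r ** 2 + sd_s ** 2)
    return NormalDist().cdf(-beta)

# Thermal cracking index = mean tensile strength / mean tensile stress.
for index in (1.0, 1.2, 1.5, 1.85):
    p = cracking_probability(mean_strength=2.9 * index, cov_strength=0.15,
                             mean_stress=2.9, cov_stress=0.20)
    print(f"index {index:.2f}: P(crack) ~ {p:.2%}")
```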

Keywords: thermal crack control, mass concrete, thermal cracking probability, durability of concrete, calculating method of cracking probability

Procedia PDF Downloads 303
3041 Estimation of Population Mean under Random Non-Response in Two-Occasion Successive Sampling

Authors: M. Khalid, G. N. Singh

Abstract:

In this paper, we have considered the problem of estimating the population mean on the current (second) occasion in two-occasion successive sampling under random non-response situations. Some modified exponential type estimators have been proposed, and their properties are studied under the assumption that the number of sampling units follows a discrete distribution due to random non-response. The performance of the proposed estimators is compared with linear combinations of two estimators: (a) the sample mean estimator for the fresh sample and (b) the ratio estimator for the matched sample under complete response. Results are demonstrated through empirical studies, which show the effectiveness of the proposed estimators. Suitable recommendations have been made to survey practitioners.

Keywords: modified exponential estimator, successive sampling, random non-response, auxiliary variable, bias, mean square error

Procedia PDF Downloads 330
3040 Probability Sampling in Matched Case-Control Study in Drug Abuse

Authors: Surya R. Niraula, Devendra B Chhetry, Girish K. Singh, S. Nagesh, Frederick A. Connell

Abstract:

Background: Although random sampling is generally considered to be the gold standard for population-based research, the majority of drug abuse research is based on non-random sampling despite the well-known limitations of this kind of sampling. Method: We compared the statistical properties of two surveys of drug abuse in the same community: one using snowball sampling of drug users who then identified “friend controls” and the other using a random sample of non-drug users (controls) who then identified “friend cases.” Models to predict drug abuse based on risk factors were developed for each data set using conditional logistic regression. We compared the precision of each model using the bootstrap method and the predictive properties of each model using receiver operating characteristic (ROC) curves. Results: Analysis of 100 random bootstrap samples drawn from the snowball-sample data set showed a wide variation in the standard errors of the beta coefficients of the predictive model, none of which achieved statistical significance. On the other hand, bootstrap analysis of the random-sample data set showed less variation and did not change the significance of the predictors at the 5% level when compared to the non-bootstrap analysis. The area under the ROC curve of the model derived from the random-sample data set was similar when the model was fitted to either data set (0.93 for the random-sample data vs. 0.91 for the snowball-sample data, p = 0.35); however, when the model derived from the snowball-sample data set was fitted to each of the data sets, the areas under the curve were significantly different (0.98 vs. 0.83, p < .001). Conclusion: The proposed method of random sampling of controls appears to be superior from a statistical perspective to snowball sampling and may represent a viable alternative to snowball sampling.
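
A hedged sketch of the two comparisons described (bootstrap variability of coefficients and ROC performance), using ordinary logistic regression from scikit-learn as a stand-in for the conditional logistic regression of the matched design; the data are simulated, not the survey data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 3))                      # three simulated risk factors
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1]            # assumed true effects
y = rng.random(n) < 1 / (1 + np.exp(-logit))     # simulated drug-abuse outcome

# Bootstrap the coefficient estimates to gauge their precision.
betas = []
for _ in range(100):
    idx = rng.integers(0, n, n)
    betas.append(LogisticRegression().fit(X[idx], y[idx]).coef_[0])
print("bootstrap SE of coefficients:", np.array(betas).std(axis=0).round(3))

# Predictive performance of the fitted model via the area under the ROC curve.
model = LogisticRegression().fit(X, y)
print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
```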

Keywords: drug abuse, matched case-control study, non-probability sampling, probability sampling

Procedia PDF Downloads 471
3039 A Novel Probabilistic Spatial Locality of Reference Technique for Automatic Cleansing of Digital Maps

Authors: A. Abdullah, S. Abushalmat, A. Bakshwain, A. Basuhail, A. Aslam

Abstract:

GIS (Geographic Information System) applications require geo-referenced data. These data may be available as databases or in the form of digital or hard-copy agro-meteorological maps. Such parameter maps are color-coded, with different regions corresponding to different parameter values, and converting these maps into a database is not very difficult. However, the text and various planimetric elements overlaid on these maps make an accurate image-to-database conversion a challenging problem, because it is almost impossible to exactly restore what was underneath the text or icons; this points to the need for inpainting. In this paper, we propose a probabilistic inpainting approach that uses the probability of spatial locality of colors in the map to replace overlaid elements with the underlying color. We tested the limits of our proposed technique using non-textual simulated data and compared text-removal results with those of a popular image editing tool on public-domain data, with promising results.
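
A compact sketch of the core idea: for each masked (text or icon) pixel, estimate the probability of each color from the labeled pixels in a spatial neighborhood and assign the most probable one. It uses a toy two-region label image and is an illustration, not the authors' full method.

```python
import numpy as np
from collections import Counter

def probabilistic_inpaint(labels, mask, radius=2, passes=5):
    """Replace masked pixels with the most probable label among their unmasked
    spatial neighbours (probability = local relative frequency)."""
    labels, mask = labels.copy(), mask.copy()
    h, w = labels.shape
    for _ in range(passes):
        for i, j in zip(*np.nonzero(mask)):
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            window = labels[i0:i1, j0:j1][~mask[i0:i1, j0:j1]]
            if window.size:                         # neighbours with known color
                counts = Counter(window.tolist())
                labels[i, j] = counts.most_common(1)[0][0]
                mask[i, j] = False
    return labels

# Toy parameter map: two color-coded regions, with a block of "text" masked out.
region = np.zeros((20, 20), dtype=int)
region[:, 10:] = 1
mask = np.zeros_like(region, dtype=bool)
mask[8:12, 7:13] = True                             # overlaid text footprint
restored = probabilistic_inpaint(region, mask)
print(np.array_equal(restored, region))             # True for this simple case
```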

Keywords: noise, image, GIS, digital map, inpainting

Procedia PDF Downloads 324
3038 Risk and Reliability Based Probabilistic Structural Analysis of Railroad Subgrade Using Finite Element Analysis

Authors: Asif Arshid, Ying Huang, Denver Tolliver

Abstract:

The finite element (FE) method, coupled with ever-increasing computational power, has substantially advanced the reliability of deterministic three-dimensional structural analyses of structures with uniform material properties. However, a railway trackbed is made up of a diverse group of materials, including steel, wood, rock, and soil, each with its own varying level of heterogeneity and imperfection. The application of probabilistic methods to trackbed structural analysis that incorporate material and geometric variability remains deeply underworked. The authors developed and validated a three-dimensional FE-based numerical trackbed model, and in this study they investigate the influence of variability in the Young's modulus and thicknesses of the granular layers (ballast and subgrade) on the reliability index (β-index) of the subgrade layer. The influence of these factors is accounted for by changing their coefficients of variation (COV) while keeping their means constant. These variations are formulated using the Gaussian normal distribution. Two failure mechanisms in the subgrade, namely progressive shear failure and excessive plastic deformation, are examined. Preliminary results of the risk-based probabilistic analysis for progressive shear failure revealed that variations in ballast depth are the most influential factor for the vertical stress at the top of the subgrade surface. In the case of excessive plastic deformation in the subgrade layer, in contrast, variations in its own depth and Young's modulus proved to be most important, while the ballast properties remained almost irrelevant. For both failure modes, it is also observed that the reliability index for subgrade failure increases with the increase in the COV of ballast depth and subgrade Young's modulus. The findings of this work are of particular significance in studying the combined effect of construction imperfections and variations in ground conditions on the structural performance of railroad trackbeds and in evaluating the associated risk. In addition, the work provides an additional tool to supplement deterministic analysis procedures and decision making for railroad maintenance.

Keywords: finite element analysis, numerical modeling, probabilistic methods, risk and reliability analysis, subgrade

Procedia PDF Downloads 109
3037 Finite State Markov Chain Model of Pollutants from Service Stations

Authors: Amina Boukelkoul, Rahil Boukelkoul, Leila Maachia

Abstract:

The cumulative vapors emitted from service stations may represent a hazard to the environment and the population. In addition, fuel spills and their penetration into deep soil layers are the main contributors to soil and groundwater contamination in the vicinity of petrol stations. The amount of effluent from a service station depends on the maintenance strategy and the policy adopted by management to reduce pollution. One key idea of the proposed approach is that the effluents from service stations can be managed via a finite-state Markov chain. Such a model can be embedded within a probabilistic operation and maintenance simulation reflecting the action to be taken. In this paper, an approach for estimating a probabilistic percentage of the amount of emitted pollutants is presented. The finite-state Markov model is used for decision problems with a determined number of periods (life cycle) to predict this amount under various operating options.
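
A minimal finite-state Markov chain sketch for the emission condition of a station under a given operation and maintenance policy; the states, per-period emissions, and transition probabilities are hypothetical placeholders, and the expected pollutant amount is accumulated over the planning periods.

```python
import numpy as np

states = ["low", "moderate", "high"]              # emission condition of the station
emission_per_period = np.array([1.0, 4.0, 9.0])   # tonnes of vapors (illustrative)

# Transition matrix under a given operation/maintenance policy (rows sum to 1);
# the maintenance action tends to pull the "high" state back down.
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.70, 0.20],
              [0.30, 0.20, 0.50]])

pi = np.array([1.0, 0.0, 0.0])       # the station starts in the "low" state
life_cycle_periods = 20
expected_total = 0.0
for _ in range(life_cycle_periods):
    expected_total += pi @ emission_per_period   # expected emission this period
    pi = pi @ P                                  # propagate the state distribution

print(f"Expected emissions over the life cycle: {expected_total:.1f} t")
```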

Keywords: environment, Markov modeling, pollution, service station

Procedia PDF Downloads 444
3036 Expert Supporting System for Diagnosing Lymphoid Neoplasms Using Probabilistic Decision Tree Algorithm and Immunohistochemistry Profile Database

Authors: Yosep Chong, Yejin Kim, Jingyun Choi, Hwanjo Yu, Eun Jung Lee, Chang Suk Kang

Abstract:

For the past decades, immunohistochemistry (IHC) has played an important role in the diagnosis of human neoplasms by helping pathologists make clearer decisions on differential diagnosis, subtyping, personalized treatment planning, and, finally, prognosis prediction. However, the IHC performed on the various tumors of daily practice often shows conflicting results that are very challenging to interpret, and even a comprehensive diagnosis synthesizing clinical, histologic, and immunohistochemical findings can be unhelpful in some convoluted cases. Another important issue is that IHC data are increasing exponentially, and more and more information has to be taken into account. For these reasons, we conceived the idea of developing an expert supporting system to help pathologists make better decisions in diagnosing human neoplasms with IHC results. We devised a probabilistic decision tree algorithm and tested it with real case data on lymphoid neoplasms, in which the IHC profile is more important for making a proper diagnosis than in other human neoplasms. We designed the probabilistic decision tree based on Bayes' theorem, programmed the computational process using MATLAB (The MathWorks, Inc., USA), and prepared an IHC profile database (about 104 disease categories and 88 IHC antibodies) based on the WHO classification by reviewing the literature. The initial probability of each neoplasm was set using epidemiologic data on lymphoid neoplasms in Korea. With the IHC results of 131 sequentially selected patients, the top three presumptive diagnoses for each case were made and compared with the original diagnoses. After review of the data, 124 out of 131 cases were used for the final analysis. As a result, the presumptive diagnoses were concordant with the original diagnoses in 118 cases (93.7%). The major reason for discordance was the similarity of the IHC profiles of two or three different neoplasms. The expert supporting system algorithm presented in this study is at an elementary stage and needs more optimization using more advanced technology, such as deep learning with real case data, especially for differentiating T-cell lymphomas. Although it needs more refinement, it may be used to aid pathological decision making in the future. A further application to determine the IHC antibodies needed for a certain subset of differential diagnoses might be possible in the near future.
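
A toy version of the Bayesian update at the heart of such a decision tree: epidemiologic priors are multiplied by the probability of each IHC result given each diagnosis, and the posteriors are renormalized to rank presumptive diagnoses. The diagnoses, antibodies, and all probabilities below are illustrative placeholders, not the study's database.

```python
priors = {"DLBCL": 0.40, "Follicular lymphoma": 0.25, "Hodgkin lymphoma": 0.35}

# P(antibody positive | diagnosis); illustrative values only.
p_positive = {
    "CD20": {"DLBCL": 0.95, "Follicular lymphoma": 0.95, "Hodgkin lymphoma": 0.20},
    "CD30": {"DLBCL": 0.15, "Follicular lymphoma": 0.05, "Hodgkin lymphoma": 0.95},
    "CD10": {"DLBCL": 0.40, "Follicular lymphoma": 0.85, "Hodgkin lymphoma": 0.05},
}

def rank_diagnoses(ihc_results, priors, p_positive):
    """Naive-Bayes-style update: multiply the prior by P(result | diagnosis)
    for every stained antibody, then renormalize."""
    post = dict(priors)
    for antibody, positive in ihc_results.items():
        for dx in post:
            p = p_positive[antibody][dx]
            post[dx] *= p if positive else (1.0 - p)
    total = sum(post.values())
    return sorted(((dx, v / total) for dx, v in post.items()),
                  key=lambda kv: kv[1], reverse=True)

case = {"CD20": True, "CD30": False, "CD10": True}
for dx, prob in rank_diagnoses(case, priors, p_positive)[:3]:
    print(f"{dx}: {prob:.2f}")
```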

Keywords: database, expert supporting system, immunohistochemistry, probabilistic decision tree

Procedia PDF Downloads 207
3035 Democratic Political Socialization of the 5th and 6th Graders under the Authority of Dusit District Office, Bangkok

Authors: Mathinee Khongsatid, Phusit Phukamchanoad, Sakapas Saengchai

Abstract:

This research aims to study the democratic political socialization of 5th and 6th graders under the authority of the Dusit District Office, Bangkok, by using stratified sampling as the probability sampling method and purposive sampling as the non-probability sampling method to collect data through the distribution of questionnaires to 300 respondents. This covers all of the schools under the authority of the Dusit District Office. The researcher analyzed the data using descriptive statistics, including the arithmetic mean and standard deviation. The results show that 5th and 6th graders under the authority of the Dusit District Office, Bangkok, have displayed some characteristics of democratic political socialization both inside and outside the classroom, as well as outside school. However, democratic political socialization in the classroom, through grouping and class participation, is much more emphasized.

Keywords: democratic, political socialization, students grades 5-6, descriptive statistics

Procedia PDF Downloads 257
3034 The Probability Foundation of Fundamental Theoretical Physics

Authors: Quznetsov Gunn

Abstract:

In the study of the logical foundations of probability theory, it was found that the terms and equations of fundamental theoretical physics represent terms and theorems of classical probability theory, more precisely, of that part of the theory which considers the probability of dot events in 3 + 1 space-time. In particular, the masses, moments, energies, spins, etc., turn out to be parameters of the probability distributions of such events. The terms and equations of the electroweak and quark-gluon theories turn out to be probability-theoretic terms and theorems. Here the relation of a neutrino to its lepton becomes clear, the W and Z boson masses turn out to be dynamic, and the cause of the asymmetry between particles and antiparticles is the impossibility of the birth of single antiparticles. In addition, phenomena such as confinement and asymptotic freedom receive a probabilistic explanation. And here we have the logical foundations of the theory of gravity with the phenomena of dark energy and dark matter.

Keywords: classical theory of probability, logical foundation of fundamental theoretical physics, masses, moments, energies, spins

Procedia PDF Downloads 274
3033 Assessment and Control for Oil Aerosol

Authors: Chane-Yu Lai, Xiang-Yu Huang

Abstract:

This study assessed the sampling performance of a newly developed rotation filtration device (RFD) filled with porous media filters, integrating the method of cyclonic centrifugal spinning. The testing system established for the experiment used corn oil and potassium sodium tartrate tetrahydrate (PST) as challenge aerosols, which were produced by an ultrasonic atomizing nozzle, a syringe pump, and a Collison nebulizer. The collection efficiency of the RFD for oil aerosols was assessed by using an Aerodynamic Particle Sizer (APS) and a Fidas® Frog. The results of the RFD for the liquid-particle condition indicated that the cutoff size was 1.65 µm and 1.02 µm for rotation speeds of 0 rpm and 9000 rpm, respectively, under an 80 PPI (pores per inch) foam with a thickness of 80 mm and a sampling velocity of 13.5 cm/s. As the foam thickness of the RFD was increased, the cutoff size decreased from 1.62 µm to 1.02 µm. When the foam porosity of the RFD was increased, the cutoff size decreased from 1.26 µm to 0.96 µm. Moreover, as the sampling velocity of the RFD was increased, the cutoff size decreased from 1.02 µm to 0.76 µm. These differences in the cutoff sizes of the RFD were all statistically significant (P < 0.05). The cutoff size of the RFD for the three experimental conditions of liquid oil particles, solid PST particles, or both liquid oil and solid PST particles was 1.03 µm, 1.02 µm, or 0.99 µm, respectively, under an 80 PPI foam with a thickness of 80 mm, a rotation speed of 9000 rpm, and a sampling velocity of 13.5 cm/s. In addition, under the best condition of the experiment, with two hours of sampling loading, the RFD had better collection efficiency for particle diameters greater than 0.45 µm under a 94 PPI nickel mesh with a thickness of 68 mm, a rotation speed of 9000 rpm, and a sampling velocity of 108.3 cm/s. The experiment concluded that increasing the thickness of the porous media, the face velocity, and the porosity of the porous media of the RFD could increase the collection efficiency for sampling oil particles. Moreover, increasing the rotation speed of the RFD also increased the collection efficiency for sampling oil particles. Further investigation of the above operating parameters of the RFD is required in the future.
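
As a small illustration of how a cutoff size like those reported above is obtained, the sketch below interpolates a measured collection efficiency curve to the 50% point; the diameter and efficiency values are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical measured collection efficiencies versus particle aerodynamic
# diameter (µm); the cutoff size is where efficiency reaches 50%.
diameter = np.array([0.3, 0.5, 0.8, 1.0, 1.5, 2.0, 3.0])
efficiency = np.array([0.05, 0.15, 0.35, 0.48, 0.72, 0.88, 0.97])

cutoff = np.interp(0.5, efficiency, diameter)   # efficiency must be increasing
print(f"Cutoff (50% collection) size ~ {cutoff:.2f} µm")
```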

Keywords: oil aerosol, porous media filter, rotation, filtration

Procedia PDF Downloads 374