Search results for: probability distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5874

5754 Teleconnection between El Nino-Southern Oscillation and Seasonal Flow of the Surma River and Possibilities of Long Range Flood Forecasting

Authors: Monika Saha, A. T. M. Hasan Zobeyer, Nasreen Jahan

Abstract:

El Nino-Southern Oscillation (ENSO) is the interaction between the atmosphere and the ocean in the tropical Pacific, which causes alternating warm and cold conditions in the central and eastern tropical Pacific Ocean. ENSO events have become stronger in recent times due to climate change, so it is important to account for the influence of ENSO in climate studies. Bangladesh, lying in a low deltaic floodplain, experiences the worst consequences of flooding every year. To reduce the catastrophe of severe flooding events, non-structural measures such as flood forecasting can be helpful in taking adequate precautions and steps. Forecasting seasonal floods with a lead time of several months is a key component of flood damage control and water management. The objective of this research is to identify the possible strength of the teleconnection between ENSO and the flow of the Surma River, and to examine the potential for long-lead flood forecasting in the wet season. The Surma is one of the major rivers of Bangladesh and is part of the Surma-Meghna river system. In this research, sea surface temperature (SST) is used as the ENSO index, and the lead time is at least a few months, which is greater than the basin response time. The teleconnection was assessed by correlation analysis between the July-August-September (JAS) flow of the Surma and the SST of the Nino 4 region for the corresponding months. The cumulative frequency distribution of the standardized JAS flow of the Surma was also determined as part of assessing the possible teleconnection. Discharge data of the Surma River from 1975 to 2015 were used in this analysis, and a marked increase in the correlation coefficient between flow and ENSO has been observed since 1985. From the cumulative frequency distribution of the standardized JAS flow, it was found that in any year the JAS flow has approximately a 50% probability of exceeding the long-term average JAS flow. During an El Nino year (warm episode of ENSO) this probability of exceedance drops to 23%, while in a La Nina year (cold episode of ENSO) it increases to 78%. Discriminant analysis, also known as 'categoric prediction', was performed to identify the possibilities of long-lead flood forecasting. It categorizes the flow data (high, average, and low) based on the classification of predicted SST (warm, normal, and cold). From the discriminant analysis, it was found that for the Surma River, the probability of a high flood in the cold period is 75% and the probability of a low flood in the warm period is 33%. A synoptic parameter, the forecasting index (FI), was also calculated to judge forecast skill and to compare different forecasts. This study will help the concerned authorities and stakeholders to take long-term water resources decisions and formulate policies on river basin management, which will reduce possible damage to life, agriculture, and property.
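
As a rough illustration of the exceedance-probability calculation described above, the sketch below standardizes a JAS flow series and estimates the probability of exceeding the long-term mean by ENSO phase. The flow values and phase labels are randomly generated stand-ins, not the actual 1975-2015 Surma record.

```python
import numpy as np

# Illustrative stand-ins for the 41-year JAS discharge record and the
# Nino 4 SST phase classification (all values are invented).
rng = np.random.default_rng(0)
jas_flow = rng.gamma(shape=8.0, scale=500.0, size=41)        # hypothetical m^3/s
phase = rng.choice(["El Nino", "Neutral", "La Nina"], size=41)

z = (jas_flow - jas_flow.mean()) / jas_flow.std()            # standardized flow

print(f"overall P(exceed mean) = {np.mean(z > 0):.2f}")
for p in ["El Nino", "Neutral", "La Nina"]:
    sel = phase == p
    print(f"{p:8s} P(exceed mean) = {np.mean(z[sel] > 0):.2f}")
# With the real record the paper reports ~50% overall, dropping to 23% in
# El Nino years and rising to 78% in La Nina years.
```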

Keywords: El Nino-Southern Oscillation, sea surface temperature, Surma River, teleconnection, cumulative frequency distribution, discriminant analysis, forecasting index

Procedia PDF Downloads 111
5753 Optimized Dynamic Bayesian Networks and Neural Verifier Test Applied to On-Line Isolated Character Recognition

Authors: Redouane Tlemsani, Belkacem Kouninef, Abdelkader Benyettou

Abstract:

In this paper, our system is a Markovian system that can be viewed as a Dynamic Bayesian Network. A major advantage of these systems is that both the topology and the parameters of the model can be learned entirely from training data. Bayesian Networks represent uncertain knowledge about complex phenomena. They combine probability theory and graph theory to provide effective tools for representing a joint probability distribution over a set of random variables. Knowledge is represented by describing, through graphs, the causal relations existing between the variables that define the field of study. Dynamic Bayesian Networks generalize Bayesian networks to dynamic processes. Our objective amounts to finding the structure that best represents the relationships (dependencies) between the variables of a Dynamic Bayesian Network. In pattern recognition applications, the structure is usually fixed in advance, which forces us to admit some strong assumptions (for example, independence between some variables).
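
To make the factorization concrete, here is a minimal sketch of how a Bayesian Network represents a joint probability distribution as a product of local conditional probabilities; a Dynamic Bayesian Network unrolls the same idea over time slices. The three-node network and all CPT values are invented for illustration and are not the paper's character model.

```python
# Joint distribution of a Bayesian Network: P(x1..xn) = prod_i P(xi | parents(xi)).
cpt = {
    # P(stroke)
    "stroke": {(): {True: 0.3, False: 0.7}},
    # P(loop | stroke)
    "loop": {(True,): {True: 0.8, False: 0.2},
             (False,): {True: 0.1, False: 0.9}},
    # P(char_A | loop)
    "char_A": {(True,): {True: 0.6, False: 0.4},
               (False,): {True: 0.05, False: 0.95}},
}
parents = {"stroke": (), "loop": ("stroke",), "char_A": ("loop",)}

def joint(assignment):
    """P(assignment) as the product of the local conditional probabilities."""
    p = 1.0
    for var in ("stroke", "loop", "char_A"):
        key = tuple(assignment[pa] for pa in parents[var])
        p *= cpt[var][key][assignment[var]]
    return p

print(joint({"stroke": True, "loop": True, "char_A": True}))  # 0.3*0.8*0.6 = 0.144
```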

Keywords: Arabic online character recognition, dynamic Bayesian network, pattern recognition, networks

Procedia PDF Downloads 579
5752 An Analysis of a Queueing System with Heterogeneous Servers Subject to Catastrophes

Authors: M. Reni Sagayaraj, S. Anand Gnana Selvam, R. Reynald Susainathan

Abstract:

This study analyzed a queueing system with blocking and no waiting line. Customers arrive according to a Poisson process, and the service times follow an exponential distribution. There are two non-identical servers in the system. The queue discipline is FCFS, and customers select the servers on a fastest-server-first (FSF) basis. The service times are exponentially distributed with parameters μ1 and μ2 at servers I and II, respectively. In addition, catastrophes occur in a Poisson manner with rate γ. When server I is busy or blocked, a customer who arrives in the system leaves without being served; such customers are called lost customers. The probability of losing a customer was computed for the system. The explicit time-dependent probabilities of system size are obtained, and a numerical example is presented in order to show the managerial insights of the model. Finally, the probability that an arriving customer finds the system busy and the average number of busy servers in steady state are obtained numerically.
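
A rough Monte Carlo sketch of the system described above is given below: two heterogeneous exponential servers, Poisson arrivals and catastrophes, FSF selection, and blocking. It assumes a customer is lost only when both servers are occupied, and all rate values are illustrative; the paper itself derives these probabilities analytically.

```python
import random

def simulate(lam=1.0, mu1=2.0, mu2=1.0, gamma=0.1, n_events=10**6, seed=42):
    """Estimate the loss probability of the two-server loss system with
    catastrophes via the embedded jump chain (illustrative parameters)."""
    rng = random.Random(seed)
    busy1 = busy2 = False
    arrivals = lost = 0
    for _ in range(n_events):
        # rates of all events that can currently fire
        rates = {'arr': lam, 'cat': gamma}
        if busy1: rates['dep1'] = mu1
        if busy2: rates['dep2'] = mu2
        total = sum(rates.values())
        # pick the next event with probability proportional to its rate
        u, acc, ev = rng.random() * total, 0.0, None
        for k, r in rates.items():
            acc += r
            if u <= acc:
                ev = k
                break
        if ev == 'arr':
            arrivals += 1
            if not busy1:        # fastest-server-first: server I preferred
                busy1 = True
            elif not busy2:
                busy2 = True
            else:
                lost += 1        # blocked: customer leaves unserved
        elif ev == 'dep1':
            busy1 = False
        elif ev == 'dep2':
            busy2 = False
        else:                    # catastrophe: all customers removed
            busy1 = busy2 = False
    return lost / arrivals

print("estimated loss probability:", simulate())
```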

Keywords: queueing system, blocking, Poisson process, heterogeneous servers, FCFS queue discipline, busy period

Procedia PDF Downloads 476
5751 Probability of Passing the Brake Test at Ministry of Transport Facilities in Each City of the Alicante Region, Spain

Authors: Carolina Senabre Blanes, Sergio Valero Verdú, Emilio Velasco Sánchez

Abstract:

The objective of this research is to obtain a percentage of success for the Ministry of Transport (MOT) facilities of each city of the Alicante region of the Comunidad Valenciana, Spain, by comparing results obtained with different brake testers. The types of brake tester currently used in each city were surveyed. Since different types of brake tester are in use, the mechanical engineering staff of the Miguel Hernández University studied the differences between all of them and obtained measurements from each type. A probability of success is given for each MOT station when attempting to pass the test with the same car, with the same characteristics and the same wheels. In other words, the parameters of the vehicle were controlled to be the same in all tests; therefore, the variability in brake measurements is due to the type of tester used at the MOT station. A probability of passing the brake test in each city is given by comparing the test results.

Keywords: brake tester, MOT station, probability of passing the test, brake tester characteristics

Procedia PDF Downloads 262
5750 Fault Location Detection in Active Distribution System

Authors: R. Rezaeipour, A. R. Mehrabi

Abstract:

The recent increase of DGs and microgrids in distribution systems disturbs the traditional structure of the system. Coordination between protection devices in such a system becomes a concern for network operators. This paper presents a new method for fault location detection in active distribution networks, independent of the fault type or its resistance. The method uses synchronized voltage and current measurements at the interconnection of DG units and is able to adapt to changes in the topology of the system. The method has been tested on a 38-bus distribution system, with very encouraging results.

Keywords: fault location detection, active distribution system, microgrids, network operators

Procedia PDF Downloads 748
5749 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
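
The quantile computation described above can be sketched as follows: given an (assumed) asymptotic mean vector and covariance matrix for the GIC statistics of the candidate models, the upper quantile of their minimum is obtained from multivariate Gaussian integration, here approximated by simulation rather than the deterministic quadrature of the R package "mvtnorm". All numeric values are illustrative.

```python
import numpy as np

# Hypothetical asymptotic mean vector and covariance of the GIC statistics
# for four candidate models (values are illustrative only).
mu = np.array([0.0, 0.5, 1.0, 1.5])
Sigma = 0.5 * np.ones((4, 4)) + 0.5 * np.eye(4)

# Distribution of the minimum: P(min_i Z_i > q) = P(Z_1 > q, ..., Z_k > q),
# an orthant probability computable by multivariate Gaussian integration;
# here it is approximated by plain simulation.
rng = np.random.default_rng(0)
z = rng.multivariate_normal(mu, Sigma, size=200_000)
m = z.min(axis=1)

alpha = 0.05
q_upper = np.quantile(m, 1 - alpha)   # upper quantile of the minimum GIC
print(f"{1 - alpha:.0%} upper quantile of min GIC: {q_upper:.3f}")
```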

Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory

Procedia PDF Downloads 57
5748 Multivariate Control Chart to Determine Efficiency Measurements in Industrial Processes

Authors: J. J. Vargas, N. Prieto, L. A. Toro

Abstract:

Control charts are commonly used to monitor processes involving either variable or attribute quality characteristics, and determining the control limits is a critical task for quality engineers seeking to improve processes. Nonetheless, in some applications it is necessary to include an estimation of efficiency. In this paper, the ability to define the efficiency of an industrial process was added to a control chart by incorporating a data envelopment analysis (DEA) approach. Specifically, Bayesian estimation was performed to calculate the posterior probability distribution of parameters such as the means and the variance-covariance matrix. This technique allows the data set to be analysed without relying on the hypothetical large sample implied in the problem, and can be treated as an approximation to the finite-sample distribution. A rejection simulation method was carried out to generate random variables from the parameter functions. Each resulting vector was used by the stochastic DEA model over several cycles to establish the distribution of the efficiency measure for each DMU (decision-making unit). A control limit was calculated with the resulting model; if a DMU presents a low level of efficiency, the system efficiency is out of control. A global optimum was reached in the efficiency calculation, which ensures model reliability.
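
The "rejection simulation method" mentioned above is standard rejection sampling; a minimal sketch follows, drawing from a posterior-like density known up to a constant. The Beta(3, 5) kernel and the uniform proposal are illustrative choices, not the paper's posterior.

```python
import numpy as np

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M, n, seed=0):
    """Minimal rejection sampler: draws from target_pdf using a proposal
    that dominates it, i.e. target_pdf(x) <= M * proposal_pdf(x)."""
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n:
        x = proposal_sample(rng)
        if rng.random() * M * proposal_pdf(x) <= target_pdf(x):
            out.append(x)
    return np.array(out)

# Illustrative target: a Beta(3, 5) kernel known only up to a constant,
# sampled with a uniform proposal on [0, 1]; its maximum is ~0.022 < M.
target = lambda x: x**2 * (1 - x)**4
samples = rejection_sample(target,
                           proposal_sample=lambda rng: rng.random(),
                           proposal_pdf=lambda x: 1.0,
                           M=0.06, n=10_000)
print("posterior mean estimate:", samples.mean())   # Beta(3,5) mean = 3/8
```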

Keywords: data envelopment analysis (DEA), multivariate control chart, rejection simulation method

Procedia PDF Downloads 348
5747 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of uncertainties into account in developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error; therefore, it is chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method can perform well in probability-based damage detection of structures with less computational effort compared to the direct finite element model.
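
A minimal sketch of the Monte Carlo-based probability of damage existence (PDE) idea: propagate measurement uncertainty through repeated evaluations of a damage indicator and take the fraction of samples in the damaged state. The normal distribution and threshold below are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

n_mc = 10_000
# Hypothetical surrogate predictions of a stiffness-reduction index for one
# element, with measurement noise propagated through the surrogate:
index_samples = rng.normal(loc=0.08, scale=0.05, size=n_mc)

threshold = 0.0   # the element counts as "damaged" if its index exceeds zero
pde = np.mean(index_samples > threshold)
print(f"probability of damage existence: {pde:.3f}")
```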

Keywords: probability-based damage detection (PBDD), Kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)

Procedia PDF Downloads 206
5746 A Prediction Model of Tornado and Its Impact on Architecture Design

Authors: Jialin Wu, Zhiwei Lian, Jieyu Tang, Jingyun Shen

Abstract:

Tornadoes are serious and unpredictable natural disasters that have an important impact on people's production and life. The probability of being hit by tornadoes in China was analyzed considering the principles of tornado formation. Some suggestions on the layout and shapes of newly built buildings were then provided, combined with the characteristics of tornado wind fields. Fuzzy clustering and inverse closeness methods were used to evaluate the probability levels of tornado risk in various provinces based on classification and ranking, and GIS was adopted to display the results. Finally, the wind field of a single-vortex tornado was studied to discuss the optimized design of rural low-rise houses, with Yancheng, Jiangsu as an example. This paper may provide enough data to support building and urban design in some specific regions.

Keywords: tornado probability, computational fluid dynamics, fuzzy mathematics, optimal design

Procedia PDF Downloads 98
5745 The Modeling and Effectiveness Evaluation for Vessel Evasion to Acoustic Homing Torpedo

Authors: Li Minghui, Min Shaorong, Zhang Jun

Abstract:

This paper studies the operational efficiency of a surface warship's motorized evasion of an acoustic homing torpedo. It develops, in order, a trajectory model, a self-guidance detection model, a vessel evasion model, and an anti-torpedo error model in three-dimensional space, to make up for the deficiency of previous research that analyzed confrontation models two-dimensionally. Then, using the Monte Carlo method, it simulates the evasion confrontation process in MATLAB. Finally, it quantitatively analyzes the main factors that determine the vessel's survival probability. The results show that the evasion relative bearing and speed affect the vessel's survival probability significantly. Thus, choosing an appropriate evasion relative bearing and speed according to the torpedo's alarm range and alarm relative bearing, improving the alarm range and positioning accuracy, and reducing the response time against the torpedo will improve the vessel's survival probability significantly.
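
A toy Monte Carlo sketch in the spirit of this evaluation is shown below: sample the uncertain alarm range and torpedo endurance, compute the time to impact for a chosen evasion course, and count survivals. The planar geometry and every numeric value are illustrative assumptions; the actual study uses full 3-D trajectory, self-guidance, and error models.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

alarm_range = rng.normal(4000.0, 300.0, n)           # m, distance at alarm
torpedo_speed = 45.0 * 0.5144                        # 45 kn in m/s
vessel_speed = 14.0 * 0.5144                         # 14 kn in m/s
evasion_bearing = np.deg2rad(150.0)                  # evasion course vs. threat

# closing speed along the torpedo-vessel line for the chosen evasion course
closing = torpedo_speed + vessel_speed * np.cos(evasion_bearing)
time_to_impact = alarm_range / closing
torpedo_endurance = rng.normal(240.0, 30.0, n)       # s of remaining run time

# the vessel survives if the torpedo exhausts its run before impact
survival = np.mean(time_to_impact > torpedo_endurance)
print(f"estimated survival probability: {survival:.3f}")
```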

Keywords: acoustic homing torpedo, vessel evasion, Monte Carlo method, torpedo defense, vessel's survival probability

Procedia PDF Downloads 418
5744 Determinants of Probability Weighting and Probability Neglect: An Experimental Study of the Role of Emotions, Risk Perception, and Personality in Flood Insurance Demand

Authors: Peter J. Robinson, W. J. Wouter Botzen

Abstract:

Individuals often over-weight low probabilities and under-weight moderate to high probabilities; however, very low probabilities are either significantly over-weighted or neglected. Little is known about factors affecting probability weighting in Prospect Theory related to emotions specific to risk (anticipatory and anticipated emotions), the threshold of concern, and personality traits like locus of control. This study provides these insights by examining factors that influence probability weighting in the context of flood insurance demand in an economic experiment. In particular, we focus on the determinants of flood probability neglect to provide recommendations for improved risk management. In addition, results obtained with real incentives are compared to results obtained without performance-based payments in the experiment with high experimental outcomes. Based on data collected from 1,041 Dutch homeowners, we find that flood probability neglect is related to anticipated regret, worry, and the threshold of concern. Moreover, locus of control and regret affect probabilistic pessimism. Nevertheless, we do not observe strong evidence that incentives influence flood probability neglect or probability weighting. The results show that low, moderate, and high flood probabilities are under-weighted, which is related to framing in the flooding context and to the degree of realism respondents attach to high-probability property damages. We suggest several policies to overcome psychological factors related to under-weighting flood probabilities in order to improve flood preparations. These include policies that promote better risk communication to enhance insurance decisions for individuals with a high threshold of concern, and education and information provision to change the behaviour of internal-locus-of-control types as well as people who see insurance as an investment. Multi-year flood insurance may also prevent short-sighted behaviour of people who have a tendency to regret paying for insurance. Moreover, bundling low-probability/high-impact risks with more immediate risks may achieve an overall covered risk that is less likely to be judged as falling below thresholds of concern. These measures could aid the development of a flood insurance market in the Netherlands, for which we find there to be demand.
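
For readers unfamiliar with probability weighting, the sketch below evaluates the standard Tversky-Kahneman (1992) weighting function, which over-weights small probabilities and under-weights moderate-to-high ones. The paper estimates weighting empirically rather than assuming this closed form, so the function serves only as a reference shape, and gamma = 0.61 is the Tversky-Kahneman median estimate for gains, not a value from this study.

```python
import numpy as np

def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in [0.001, 0.01, 0.1, 0.5, 0.9]:
    print(f"p = {p:<6} decision weight = {tk_weight(p):.4f}")
# Small p receives a weight above p (over-weighting); moderate-to-high p
# receives a weight below p (under-weighting) -- unless the individual
# neglects the probability entirely, i.e. treats w(p) as 0.
```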

Keywords: flood insurance demand, prospect theory, risk perceptions, risk preferences

Procedia PDF Downloads 242
5743 Simulation Study on Particle Fluidization and Drying in a Spray Fluidized Bed

Authors: Jinnan Guo, Daoyin Liu

Abstract:

The quality of the final product in the coating process significantly depends on particle fluidization and drying in the spray fluidized bed. In this study, the fluidizing gas temperature and velocity are varied, and their effects on particle flow, moisture content, and heat transfer in a spray fluidized bed are investigated with a coupled computational fluid dynamics-discrete element method (CFD-DEM) model. The gas velocity distribution in the fluidized bed is symmetrical, with high velocity in the middle and low velocity on both sides. During the heating process, the particles inside the central tube and at the bottom of the bed are heated rapidly, while particles circulating in the annular area are heated slowly and remain at a lower temperature. This inconsistency in particle circulation results in two peaks in the probability density distribution of the particle temperature during heating, after which the overall temperature of the particles increases uniformly. During the drying process, the distribution of particle moisture transitions from the initial uniform moisture to two peaks, and then the number of completely dried (zero moisture content) particles gradually increases. Increasing the fluidizing gas temperature and velocity improves particle circulation, drying, and heat transfer in the bed. The current study provides an effective method for studying the hydrodynamics of spray fluidized beds with simultaneous heating and particle fluidization.

Keywords: heat transfer, CFD-DEM, spray fluidized bed, drying

Procedia PDF Downloads 18
5742 A Stochastic Approach to Extreme Wind Speeds Conditions on a Small Axial Wind Turbine

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, to model a real-life wind turbine, a probabilistic approach is proposed to model the dynamics of the blade elements of a small axial wind turbine under extreme stochastic wind speed conditions. It was found that the power and torque probability density functions decrease at these extreme wind speeds but are not infinite. Moreover, we also found that it is possible to stabilize the power coefficient (stabilizing the output power) above rated wind speeds by tuning some control parameters. This method helps to explain the effect of turbulence on the quality and quantity of the harnessed power and aerodynamic torque.
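
The "stabilized power coefficient" idea can be illustrated with the standard actuator-disc relation P = 0.5 * rho * A * Cp * v^3: above rated wind speed, the effective Cp is reduced so the output power stays at its rated value. All turbine parameters below are illustrative assumptions, not those of the paper's turbine.

```python
import numpy as np

rho, radius, cp_max = 1.225, 1.5, 0.40          # air density, blade radius (m), peak Cp
area = np.pi * radius**2
v_rated = 10.0                                  # m/s, assumed rated wind speed
p_rated = 0.5 * rho * area * cp_max * v_rated**3

def power(v):
    if v <= v_rated:
        return 0.5 * rho * area * cp_max * v**3
    # above rated: hold output power constant by letting Cp fall as 1/v^3
    return p_rated

for v in [5, 10, 15, 20]:
    cp = power(v) / (0.5 * rho * area * v**3)   # effective power coefficient
    print(f"v = {v:2d} m/s  P = {power(v)/1e3:6.2f} kW  effective Cp = {cp:.3f}")
```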

Keywords: probability, probability density function, stochastic, turbulence

Procedia PDF Downloads 545
5741 The Effect of Human Capital and Oil Revenue on Income Distribution in Real Sample

Authors: Marjan Majdi, MohammadAli Moradi, Elham Samarikhalaj

Abstract:

Income distribution is one of the most important topics in macroeconomic theory. Many economic categories, such as income distribution, are strongly influenced by economic policies. Human capital has an impact on economic growth, and it has a significant effect on income distribution. The results of this study confirm that the effects of oil revenue and human capital on income distribution are negative and significant, but the values of the estimated coefficients are very small in a real sample over the period 1969-2006.

Keywords: Gini coefficient, human capital, income distribution, oil revenue

Procedia PDF Downloads 593
5740 Reliability Analysis in Power Distribution System

Authors: R. A. Deshpande, P. Chandhra Sekhar, V. Sankar

Abstract:

In this paper, we discuss the basic reliability evaluation techniques needed to evaluate the reliability of distribution systems, which are applied in distribution system planning and operation. A reliability study can also help to predict the reliability performance of the system after quantifying the impact of adding new components. The number and locations of new components needed to improve the reliability indices to certain limits are identified and studied.
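
As a concrete example of the kind of indices such a study evaluates, the sketch below computes SAIFI, SAIDI, and CAIDI (the standard IEEE 1366 definitions) from a few feeder outage records; the records and customer count are invented for illustration.

```python
outages = [
    # (customers interrupted, outage duration in hours)
    (120, 1.5),
    (450, 0.5),
    (80,  4.0),
]
customers_served = 2000

saifi = sum(n for n, _ in outages) / customers_served        # interruptions/customer/yr
saidi = sum(n * d for n, d in outages) / customers_served    # hours/customer/yr
caidi = saidi / saifi                                        # hours per interruption
print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.3f} h, CAIDI = {caidi:.2f} h")
```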

Keywords: distribution system, reliability indices, urban feeder, rural feeder

Procedia PDF Downloads 741
5739 'Call Drop': A Problem for Handover, Minimizing the Call Drop Probability Using Analytical and Statistical Methods

Authors: Anshul Gupta, T. Shankar

Abstract:

In this paper, we analyze call drops in order to provide a good quality of service to the user. By minimizing them, we can increase the coverage area and also reduce the interference and congestion created in a network. Handover is the transfer of a call from one cell site to another during the call. Here we analyze the whole network by two methods: a statistical model and an analytical model. In the statistical model, we collected network data during the busy hour and over a normal 24-hour period; in the analytical model, we use the equation from which the call drop probability is found. By avoiding unnecessary handovers, we can increase the number of calls per hour. The most important parameter is the coefficient of variation, on which the whole paper is based.
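
The paper's central statistic, the coefficient of variation, is simply the standard deviation divided by the mean; a minimal sketch on invented hourly drop counts follows (the counts are not data from the paper).

```python
import statistics

drops_per_hour = [12, 9, 15, 11, 8, 14, 10, 13]   # illustrative hourly counts

mean = statistics.mean(drops_per_hour)
sd = statistics.stdev(drops_per_hour)              # sample standard deviation
cv = sd / mean
print(f"mean = {mean:.2f}, std dev = {sd:.2f}, CV = {cv:.3f}")
# A lower CV indicates more stable drop behaviour across hours; comparing CV
# between busy-hour and 24-hour data is one way to flag handover problems.
```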

Keywords: coefficient of variation, mean, standard deviation, call drop probability, handover

Procedia PDF Downloads 455
5738 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana

Abstract:

A computer-aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis from digital chest X-rays using MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases - Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray level co-occurrence matrix (GLCM) method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-sample T-test for PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at the 95% confidence level and therefore could be used in the classification of chest radiographs for PTB diagnosis. With the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images. If the texture feature values of the quadrant image being tested fall within the defined region, it is identified as a PTBP (abnormal) quadrant and labeled 'Abnormal' in red, with its border highlighted in red; if the values fall outside the defined range, it is identified as PTBN (normal) and labeled 'Normal' in blue, with no changes to the image outline. The developed classification algorithm showed a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
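
A sketch of the GLCM feature extraction step, translated to Python with scikit-image (the study itself used MATLAB): contrast, correlation, homogeneity, and energy come from graycoprops, while entropy and maximum probability are computed directly from the normalized co-occurrence matrix. The input quadrant here is random stand-in data, not a radiograph, and the distance/angle choices are assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # skimage >= 0.19 spelling

# Random 8-bit stand-in for one segmented lung quadrant.
quadrant = np.random.default_rng(0).integers(0, 256, (128, 128), dtype=np.uint8)

glcm = graycomatrix(quadrant, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop)[0, 0]
            for prop in ("contrast", "correlation", "homogeneity", "energy")}
p = glcm[:, :, 0, 0]                       # normalized co-occurrence matrix
features["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
features["max probability"] = p.max()

for name, value in features.items():
    print(f"{name:16s} {value:.4f}")
```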

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 89
5737 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
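
For orientation, the sketch below reproduces the classical isotropic Clarke model that the paper generalizes: with a uniform arrival angle, the Doppler shift f = f_d cos(theta) has the familiar U-shaped density 1/(pi * sqrt(f_d^2 - f^2)) on (-f_d, f_d). The maximum Doppler shift chosen is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
f_d = 100.0                                   # maximum Doppler shift, Hz (assumed)
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)
f = f_d * np.cos(theta)                       # Doppler shift per arrival

# compare the empirical density against the analytic PDF at a few points
hist, edges = np.histogram(f, bins=200, range=(-f_d, f_d), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = 1.0 / (np.pi * np.sqrt(f_d**2 - centers**2))
for i in [20, 100, 180]:
    print(f"f = {centers[i]:7.1f} Hz  empirical {hist[i]:.5f}  analytic {analytic[i]:.5f}")
```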

Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution

Procedia PDF Downloads 340
5736 Modeling the Current and Future Distribution of Anthus Pratensis under Climate Change

Authors: Zahira Belkacemi

Abstract:

One of the most important tools in conservation biology is information on the geographic distribution of species and the variables determining those patterns. In this study, we used maximum-entropy niche modeling (Maxent) to predict the current and future distribution of Anthus pratensis using climatic variables. The results showed that the species would not be highly affected by climate change in shifting its distribution; however, the results of this study should be improved by taking other predictors into account. The NATURA 2000 protected sites were found to be 42% efficient in protecting the species.

Keywords: Anthus pratensis, climate change, Europe, species distribution model

Procedia PDF Downloads 100
5735 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data

Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa

Abstract:

A generalized log-logistic distribution with variable shapes of the hazard rate was introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and leading to greater flexibility in analysing and modeling various data types. The proposed distribution has a large number of well-known lifetime special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties were derived. The method of maximum likelihood was adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study was carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped, or reversed-'bathtub'-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other 3-parameter parametric survival distributions, such as the exponentiated Weibull distribution, the 3-parameter lognormal distribution, the 3-parameter gamma distribution, the 3-parameter Weibull distribution, and the 3-parameter log-logistic (also known as shifted log-logistic) distribution. The proposed distribution provided a better fit than all of the competing distributions based on the goodness-of-fit tests, the log-likelihood, and the information criterion values. Finally, Bayesian analysis and the performance of Gibbs sampling for the data set are also carried out.
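
A sketch of the sub-model-comparison step, assuming SciPy: stats.fisk is the standard log-logistic, fitted by maximum likelihood and compared with the Weibull by AIC. The paper's generalized log-logistic adds an extra shape parameter that SciPy does not provide, so only the classical sub-models appear here, and the simulated lifetimes are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# illustrative survival times drawn from a standard log-logistic (fisk)
times = stats.fisk.rvs(c=2.5, scale=10.0, size=500, random_state=rng)

for name, dist in [("log-logistic", stats.fisk), ("Weibull", stats.weibull_min)]:
    params = dist.fit(times, floc=0)          # fix location at zero for lifetimes
    loglik = np.sum(dist.logpdf(times, *params))
    k = len(params) - 1                       # loc fixed, not estimated
    print(f"{name:12s} log-lik = {loglik:9.2f}  AIC = {2*k - 2*loglik:9.2f}")
```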

Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation

Procedia PDF Downloads 165
5734 Probability Model of Motorcyclist Accidents Based on Driver's Personality

Authors: Margareth E. Bolla, Ludfi Djakfar, Achmad Wicaksono

Abstract:

The increase in the number of motorcycle users in Indonesia is in line with the increase in accidents involving motorcycles. Several previous studies have shown that humans are the biggest factor causing accidents, and that the driver's personality affects his behavior on the road. This study was conducted to see how a person's personality traits affect the probability of having an accident while driving. The Big Five Inventory (BFI) questionnaire and the Honda Riding Trainer (HRT) simulator were used as measuring tools, and logistic regression analysis was carried out. The descriptive analysis of the respondents' personalities based on the BFI shows that the majority of drivers have neuroticism as their dominant character (34%), while the smallest group consists of drivers with openness as the dominant character (6%). The percentage of motorists who were not involved in an accident was 54%. The logistic regression analysis yields the following mathematical model: Y = -3.852 - 0.288 X1 + 0.596 X2 + 0.429 X3 - 0.386 X4 - 0.094 X5 + 0.436 X6 + 0.162 X7, where the results of hypothesis testing indicate that the variables openness, conscientiousness, extraversion, agreeableness, neuroticism, history of traffic accidents, and age at starting driving did not have a significant effect on the probability of a motorcyclist being involved in an accident.
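
The fitted model in the abstract can be read off directly: Y is the linear predictor of a logit model, so the accident probability is 1/(1 + exp(-Y)). The sketch below shows the mechanics with a made-up rider profile; mapping X1..X7 to openness, conscientiousness, extraversion, agreeableness, neuroticism, accident history, and age at starting driving follows the abstract's listing order and is an assumption.

```python
import math

coef = [-0.288, 0.596, 0.429, -0.386, -0.094, 0.436, 0.162]
intercept = -3.852

x = [3.0, 2.5, 3.2, 3.1, 3.8, 1.0, 17.0]   # hypothetical rider profile (X1..X7)
y = intercept + sum(b * xi for b, xi in zip(coef, x))
prob = 1.0 / (1.0 + math.exp(-y))           # logistic transform of the predictor
print(f"linear predictor Y = {y:.3f}, accident probability = {prob:.3f}")
```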

Keywords: accidents, BFI, probability, simulator

Procedia PDF Downloads 114
5733 Application of the Reliability Method for the Analysis of the Stability Limit States of Large Concrete Dams

Authors: Mustapha Kamel Mihoubi, Essadik Kerkar, Abdelhamid Hebbouche

Abstract:

Given the randomness of most of the factors affecting the stability of a gravity dam, probability theory is generally used to assess the risk of failure; there is no sharp transition from the stable state to the failed state, so the stability failure process is considered a probabilistic event. Controlling the risk of failure is of capital importance, based on a cross-analysis of the severity of the consequences and the probability of occurrence of identified major accidents, which can pose a significant risk to concrete dam structures. Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of such works, including when calculating the stability of large structures exposed to major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods used in engineering, in our case level II methods via the study of limit states. Hence, the probability of failure is estimated by analytical methods of the FORM (First Order Reliability Method) and SORM (Second Order Reliability Method) type. By way of comparison, a level III method was also used, which performs a full analysis of the problem, involving integration of the joint probability density function of the random variables over the safety domain, by means of Monte Carlo simulation. Taking into account the change in stress under the load combinations acting on the dam (normal, exceptional, and extreme), the calculation results provide acceptable failure probability values that largely corroborate the theory; in fact, the probability of failure tends to increase with increasing load intensity, causing a significant decrease in safety, especially in the presence of exceptional and extreme load combinations. Shear forces then induce sliding that threatens the reliability of the structure through intolerable failure probability values, especially in the case of increased uplift under a hypothetical failure of the drainage system.
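
A minimal level II vs. level III comparison can be sketched on the simplest linear limit state g = R - S with independent normal variables: for this special case FORM is exact, with beta = (muR - muS)/sqrt(sR^2 + sS^2) and Pf = Phi(-beta), which crude Monte Carlo should reproduce. All moments below are illustrative, not the dam study's variables.

```python
import numpy as np
from scipy.stats import norm

muR, sR = 10.0, 1.5      # resistance mean and std dev (illustrative)
muS, sS = 6.0, 1.0       # load-effect mean and std dev (illustrative)

# level II: Hasofer-Lind index, exact for a linear limit state in normals
beta = (muR - muS) / np.hypot(sR, sS)
pf_form = norm.cdf(-beta)

# level III: crude Monte Carlo integration of P(g <= 0)
rng = np.random.default_rng(11)
n = 2_000_000
g = rng.normal(muR, sR, n) - rng.normal(muS, sS, n)
pf_mc = np.mean(g <= 0)

print(f"beta = {beta:.3f}, Pf (FORM) = {pf_form:.3e}, Pf (MC) = {pf_mc:.3e}")
```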

Keywords: dam, failure, limit state, Monte Carlo, reliability, probability, sliding, Taylor

Procedia PDF Downloads 293
5732 Applying the Crystal Model Approach on Light Nuclei for Calculating Radii and Density Distribution

Authors: A. Amar

Abstract:

A new model, namely the crystal model, has been developed to calculate the radius and density distribution of light nuclei up to ⁸Be. The crystal model is adapted from solid-state physics, using the analogy between the distribution of nucleons and the distribution of atoms in a crystal. The model admits an analytical treatment for calculating the radius, while the density distribution of light nuclei is obtained from the crystal-lattice analogy. The distribution of nucleons over the crystal is discussed in a general form. The equation used to calculate the binding energy was taken from the solid-state model of repulsive and attractive forces: the number of protons was taken to control the repulsive force, while the atomic number was responsible for the attractive force. The parameter calculated from the crystal model was found to be proportional to the radius of the nucleus. The density distribution of light nuclei was taken as a summation of two cluster distributions, as in the ⁶Li = alpha + deuteron configuration. A test was performed on the obtained radius and density distributions using double folding for d + ⁶,⁷Li with the M3Y nucleon-nucleon interaction, and good agreement was obtained for both the radius and the density distribution of light nuclei. The model failed to calculate the radius of ⁹Be, so modifications are needed to overcome this discrepancy.

Keywords: nuclear physics, nuclear lattice, nucleus as crystal, light nuclei up to ⁸Be

Procedia PDF Downloads 140
5731 Evaluation of DNA Paternity Testing Accuracy of Child Trafficking Cases

Authors: Wing Kam Fung, Kexin Yu

Abstract:

Child trafficking has been a serious problem in modern China. The Chinese government has established a national anti-trafficking DNA database to help reunite missing children with their families. The database collects DNA information from missing children's parents and from trafficked and homeless children, and then conducts paternity tests to find matched pairs. This paper considers the matching accuracy in such cases by examining the exclusion probability in paternity testing. First, the situation of child trafficking in China is introduced. Next, derivations of the exclusion probability for both the one-parent and two-parent cases are given, followed by an extension allowing for one or two mutations. The accuracy of paternity testing in child trafficking cases is then assessed using the exclusion probabilities and available data. Finally, the number of loci that should be used to ensure a correct match is investigated.
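
The single-locus exclusion probability can be estimated by simulation: in the one-parent ('motherless') case, a random unrelated man is excluded if he shares no allele with the child. The sketch below assumes Hardy-Weinberg random mating and an illustrative four-allele frequency vector; the paper derives these probabilities analytically and extends them to allow for mutations.

```python
import numpy as np

rng = np.random.default_rng(13)
freqs = np.array([0.4, 0.3, 0.2, 0.1])     # illustrative allele frequencies
alleles = np.arange(len(freqs))

n = 500_000
draw = lambda k: rng.choice(alleles, size=(n, k), p=freqs)
child = draw(2)                            # child genotype (two alleles)
man = draw(2)                              # random unrelated man

# excluded when the man carries neither of the child's alleles
shares = (man[:, :, None] == child[:, None, :]).any(axis=(1, 2))
pe_locus = 1.0 - shares.mean()
print(f"single-locus exclusion probability: {pe_locus:.4f}")
# Combining L independent loci: PE_total = 1 - prod_l (1 - PE_l); allowing
# for mutations, as the paper does, lowers each PE_l.
```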

Keywords: child trafficking, DNA database, exclusion probability, paternity testing

Procedia PDF Downloads 422
5730 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information

Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach

Abstract:

Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions in MI. To address this, this paper presents a new adaptive bin-number selection approach, named the hybrid EMPCA-Scott approach, which combines expectation maximization principal component analysis (EMPCA) with a modified Scott's rule. The proposed approach solves the binning problem arising from the various intensity values in medical images. Experimental results show lower registration errors compared to other adaptive binning approaches.
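
Scott's rule chooses the bin width h = 3.49 * sigma * n^(-1/3); the sketch below applies the plain rule to a histogram-based MI estimate on correlated synthetic data (the true MI for this bivariate normal is -0.5 * ln(1 - rho^2), about 0.511 nats with rho = 0.8). The paper's hybrid approach instead obtains the scale via EMPCA and modifies the rule, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(17)
n = 50_000
x = rng.normal(size=n)
y = 0.8 * x + 0.6 * rng.normal(size=n)     # correlated "image intensities"

def scott_bins(v):
    """Number of bins implied by Scott's rule, h = 3.49 * sigma * n^(-1/3)."""
    h = 3.49 * v.std() * len(v) ** (-1 / 3)
    return max(1, int(np.ceil((v.max() - v.min()) / h)))

bins = (scott_bins(x), scott_bins(y))
pxy, _, _ = np.histogram2d(x, y, bins=bins)
pxy = pxy / pxy.sum()                      # joint probability estimate
px, py = pxy.sum(axis=1), pxy.sum(axis=0)  # marginals

mask = pxy > 0
mi = np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask]))
print(f"bins = {bins}, estimated MI = {mi:.4f} nats")
```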

Keywords: mutual information, EMPCA, Scott, probability distributions

Procedia PDF Downloads 218
5729 An Exploratory Study on the 'Sub-Region Life Circle' in Chinese Big Cities Based on Human High-Probability Daily Activity: Characteristics and Formation Mechanism, with Wuhan as a Case

Authors: Zhuoran Shan, Li Wan, Xianchun Zhang

Abstract:

With the increasing trend of regionalization and polycentricity in contemporary Chinese big cities, the 'sub-region life circle' proves to be an effective method for the rational organization of urban functions and spatial structure. Using questionnaires, network big data, route inversion on internet maps, GIS spatial analysis, and logistic regression, this article studies the characteristics and formation mechanism of the 'sub-region life circle' based on high-probability human daily activity in Chinese big cities. Firstly, it shows that the 'sub-region life circle' has become a new general spatial sphere of residents' high-probability daily activity and mobility in China. Unlike earlier analyses of the whole metropolis or the micro community, the 'sub-region life circle' has its own characteristics in geographical sphere, functional elements, spatial morphology, and land distribution. Secondly, according to the results of a binary logistic regression model, the research shows that seven factors, including the degree of mixed land use and bus station density, impact the formation of the 'sub-region life circle' most, and it then analyzes the critical value of each factor's index. Finally, to establish a smarter 'sub-region life circle', this paper indicates that several strategies, including jobs-housing fit, service cohesion, and space reconstruction, are the keys to optimizing its spatial organization. This study expands the understanding of cities' inner sub-region spatial structure based on human daily activity and contributes to the theory of the 'life circle' at the urban meso-scale.

Keywords: sub-region life circle, characteristic, formation mechanism, human activity, spatial structure

Procedia PDF Downloads 266
5728 Social Media as a Distribution Channel for Thailand’s Rice Berry Product

Authors: Phutthiwat Waiyawuththanapoom, Wannapong Waiyawuththanapoom, Pimploi Tirastittam

Abstract:

In today's globalized era, social media plays an important role in everyday life as an information source and as a tool for connecting people. The objective of this research is to determine the significance of social media as a distribution channel for Thai agricultural products. In this research, the agricultural product is the Rice Berry, a cross-bred unmilled rice producing dark violet grain, which is a combination of Hom Nin Rice and Thai Jasmine/Fragrant Rice 105. Rice Berry has very high nutritional value and a nice aroma, so the product is in the growth stage of the product life cycle. The problems for the Rice Berry product in Thailand are production and the distribution channel. This study confirms that social media is another option as a distribution channel for a product that is not mass-produced, and it can serve as a role model for other niche-market products when selecting a distribution channel.

Keywords: distribution, social media, Rice Berry, distribution channel

Procedia PDF Downloads 401
5727 Performance of Nakagami Fading Channel over Energy Detection Based Spectrum Sensing

Authors: M. Ranjeeth, S. Anuradha

Abstract:

Spectrum sensing is the main feature of cognitive radio technology; it detects the presence of primary users in a licensed spectrum. In this paper, we compare the theoretical detection probability in different fading environments (Rayleigh, Rician, and Nakagami-m fading channels) with simulation results using energy detection based spectrum sensing. The numerical results are plotted as P_f vs. P_d for different SNR values and fading parameters. It is observed that the Nakagami fading channel performs better than the other fading channels when energy detection is used in spectrum sensing. A MATLAB simulation test bench has been implemented to evaluate the performance of energy detection in different fading channel environments.
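
A Monte Carlo sketch of the experiment described above: fix the detection threshold from a target false-alarm probability using noise-only energy samples, then pass a signal through a Nakagami-m amplitude gain and estimate the detection probability. N, m, the SNR, and the false-alarm target are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(21)
N, m, snr_db, pf_target = 50, 2.0, 0.0, 0.1
snr = 10 ** (snr_db / 10)
trials = 100_000

# H0: noise only; set the threshold empirically from the target P_f (CFAR)
noise = rng.normal(size=(trials, N))
energy_h0 = np.sum(noise**2, axis=1)
threshold = np.quantile(energy_h0, 1 - pf_target)

# H1: signal through a Nakagami-m amplitude gain with unit average power
h = stats.nakagami.rvs(m, scale=1.0, size=trials, random_state=rng)
signal = rng.normal(size=(trials, N)) * np.sqrt(snr)
received = h[:, None] * signal + rng.normal(size=(trials, N))
energy_h1 = np.sum(received**2, axis=1)

print(f"P_f = {np.mean(energy_h0 > threshold):.3f}, "
      f"P_d = {np.mean(energy_h1 > threshold):.3f}")
```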

Keywords: spectrum sensing, energy detection, fading channels, probability of detection, probability of false alarm

Procedia PDF Downloads 495
5726 Young’s Modulus Variability: Influence on Masonry Vault Behavior

Authors: Abdelmounaim Zanaz, Sylvie Yotte, Fazia Fouchal, Alaa Chateauneuf

Abstract:

This paper presents a methodology for the probabilistic assessment of the bearing capacity and the prediction of the failure mechanism of masonry vaults at the ultimate state, with consideration of the natural variability of the Young's modulus of the stones. First, the computation model is explained. The failure mode is the most commonly reported one, i.e., the four-hinge mechanism. Based on this assumption, the study of a vault composed of 16 segments is presented. The Young's modulus of the segments is considered as a random variable defined by a mean value and a coefficient of variation CV. A relationship linking the vault bearing capacity to the modulus variation of the voussoirs is proposed. The failure mechanisms, in addition to the one observed in the deterministic case, are identified for each CV value, as well as their probabilities of occurrence. The results show that the mechanism observed in the deterministic case has a decreasing probability of occurrence with increasing CV, while the number of other mechanisms and their probabilities of occurrence increase with the coefficient of variation of the Young's modulus. This means that if a significant variation of the Young's modulus of the segments is proven, taking it into account in computations becomes mandatory, both for determining the vault bearing capacity and for predicting its failure mechanism.

Keywords: masonry, mechanism, probability, variability, vault

Procedia PDF Downloads 413
5725 Effect of Correlation of Random Variables on Structural Reliability Index

Authors: Agnieszka Dudzik

Abstract:

The problem of correlation between random variables in structural reliability analysis has been extensively discussed in the literature. The cases taken under consideration were usually related to correlation between random variables from one side of the ultimate limit state: correlation between particular loads applied to the structure, or correlation between the resistances of particular members of a structure treated as a system. It has been proved that positive correlation between these random variables reduces the reliability of the structure and increases the probability of failure. In this paper, the problem of correlation between random variables from both sides of the limit state equation is considered. The simplest case, where these random variables have normal distributions, is examined, with the degree of correlation described by the covariance or the coefficient of correlation. Special attention is paid to the questions of how much that correlation changes the reliability level and whether it can be ignored. Well-known methods for assessing the failure probability are used in the reliability analysis: the Hasofer-Lind reliability index and the Monte Carlo method, adapted to the problem of correlation. The main purpose of this work is to present how the correlation of random variables influences the reliability index of steel bar structures. Structural design parameters are defined as deterministic values and as random variables, the latter being correlated. The criterion of structural failure is expressed by limit functions related to the ultimate and serviceability limit states. Only the normal distribution is used in the description of the random variables. The sensitivity of the reliability index to the random variables is defined. If the sensitivity of the reliability index to a random variable X is low compared with other variables, it can be stated that the impact of this variable on the failure probability is small; therefore, in subsequent computations, it can be treated as a deterministic parameter. Sensitivity analysis thus simplifies the description of the mathematical model and determines the new limit functions and values of the Hasofer-Lind reliability index. In the examples, the NUMPRESS software is used in the reliability analysis.
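
For the simplest case the abstract describes, the effect of cross-limit-state correlation has a closed form: with jointly normal resistance R and load effect S, g = R - S is normal with variance sR^2 + sS^2 - 2*rho*sR*sS, so the Hasofer-Lind index and the failure probability follow directly. The sketch below uses illustrative moments, not values from the paper's examples.

```python
import numpy as np
from scipy.stats import norm

muR, sR = 280.0, 25.0     # e.g. member resistance (kN), illustrative
muS, sS = 200.0, 30.0     # e.g. load effect (kN), illustrative

for rho in [-0.5, 0.0, 0.5]:
    s_g = np.sqrt(sR**2 + sS**2 - 2 * rho * sR * sS)   # std dev of g = R - S
    beta = (muR - muS) / s_g                           # Hasofer-Lind index
    print(f"rho = {rho:+.1f}  beta = {beta:.3f}  Pf = {norm.cdf(-beta):.3e}")
# Positive correlation across the limit state narrows the spread of g and
# raises beta, whereas ignoring a negative correlation overstates reliability.
```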

Keywords: correlation of random variables, reliability index, sensitivity of reliability index, steel structure

Procedia PDF Downloads 201