Search results for: probability weighting functions

3669 Quantum Mechanism Approach for Non-Ruin Probability and Comparison of Path Integral Method and Stochastic Simulations

Authors: Ahmet Kaya

Abstract:

The quantum mechanism is one of the most important approaches to calculating the non-ruin probability. We apply standard Dirac notation to model the given Hamiltonians. Using the traditional method and an eigenvector basis, the non-ruin probability is found for several examples. The non-ruin probability is also calculated for two different Hamiltonians by using the tensor product. Finally, the path integral method is applied to the examples, and the results of stochastic simulation and path integral calculation are compared.
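
A minimal sketch of the stochastic-simulation side of such a comparison: a Monte Carlo estimate of the non-ruin probability for a classical Cramer-Lundberg surplus process with exponential claims. All parameter values here are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def non_ruin_probability(u0=10.0, premium=1.5, lam=1.0, mean_claim=1.0,
                         horizon=50.0, n_paths=20_000):
    """Monte Carlo estimate of P(no ruin by `horizon`) for a surplus
    process u(t) = u0 + premium*t - sum of exponential claims."""
    survived = 0
    for _ in range(n_paths):
        t, u, ruined = 0.0, u0, False
        while True:
            dt = rng.exponential(1.0 / lam)   # time to next claim
            if t + dt > horizon:
                break                          # horizon reached without ruin
            t += dt
            u += premium * dt                  # premiums accrue continuously
            u -= rng.exponential(mean_claim)   # pay the claim
            if u < 0.0:
                ruined = True                  # ruin can only occur at a claim
                break
        survived += not ruined
    return survived / n_paths

print(non_ruin_probability())
```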

Keywords: quantum physics, Hamiltonian system, path integral, tensor product, ruin probability

Procedia PDF Downloads 304
3668 The Behavior of the Zeros of Bargmann Analytic Functions for Multiple-Mode Systems

Authors: Muna Tabuni

Abstract:

The paper contains an investigation of the behavior of the zeros of Bargmann functions for one- and two-mode systems. A brief introduction to the harmonic oscillator formalism for one and two modes is given. The Bargmann analytic representation for one and two modes has been studied. The zeros of the Bargmann analytic function for one mode are considered. The Husimi Q functions are introduced. The Bargmann functions and the Husimi functions have the same zeros. The Bargmann function f(z) has exactly q zeros. The time evolution of the zeros is discussed. The zeros of Bargmann analytic functions for two modes are introduced. Various examples are given.
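
A minimal numerical sketch of the one-mode case: for a finite superposition Σₙ cₙ|n⟩ the Bargmann function is the polynomial f(z) = Σₙ cₙ zⁿ/√(n!), so its zeros can be found directly; the amplitudes below are an arbitrary illustration, not taken from the paper.

```python
import numpy as np
from math import factorial

# Bargmann function of a finite superposition sum_n c_n |n>:
#   f(z) = sum_n c_n z**n / sqrt(n!)  -- a polynomial in z.
c = np.array([1.0, 0.5, 0.25, 0.125])   # illustrative amplitudes c_0..c_3

# numpy's polynomial module stores coefficients in increasing degree order.
poly_coeffs = [c_n / np.sqrt(factorial(n)) for n, c_n in enumerate(c)]
zeros = np.polynomial.polynomial.polyroots(poly_coeffs)

# Truncation at number state q gives a degree-q polynomial, hence
# exactly q zeros (counted with multiplicity), as stated in the abstract.
print(zeros)
```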

Keywords: Bargmann functions, two-mode, zeros, harmonic oscillator

Procedia PDF Downloads 545
3667 A Semi-Markov Chain-Based Model for the Prediction of Deterioration of Concrete Bridges in Quebec

Authors: Eslam Mohammed Abdelkader, Mohamed Marzouk, Tarek Zayed

Abstract:

Infrastructure systems are crucial to every aspect of life on Earth. Existing infrastructure is subjected to degradation, while the demands are growing for a better infrastructure system in response to high standards of safety, health, population growth, and environmental protection. Bridges play a crucial role in urban transportation networks. Moreover, they are subjected to a high level of deterioration because of variable traffic loading, extreme weather conditions, cycles of freeze and thaw, etc. The development of Bridge Management Systems (BMSs) has become a fundamental imperative nowadays, especially in large transportation networks, due to the huge variance between the need for maintenance actions and the available funds to perform such actions. Deterioration models represent a very important aspect of the effective use of BMSs. This paper presents a probabilistic time-based model that is capable of predicting the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled using a semi-Markov process. One of the main challenges of the Markov chain approach is the construction of the transition probability matrix. The proposed model overcomes this issue by modeling the sojourn times with probability density functions. The sojourn times of each condition state are fitted to probability density functions based on goodness-of-fit tests such as the Kolmogorov-Smirnov test, Anderson-Darling test, and chi-squared test. The parameters of the probability density functions are obtained using maximum likelihood estimation (MLE). The condition ratings obtained from the Ministry of Transportation in Quebec (MTQ) are utilized as a database to construct the deterioration model. Finally, a comparison is conducted between the Markov chain and semi-Markov chain to select the most feasible prediction model.
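
A minimal sketch of the sojourn-time fitting step using scipy in place of the paper's full pipeline: a Weibull density is fitted by maximum likelihood and screened with the Kolmogorov-Smirnov test. The sojourn data here are synthetic placeholders, and the Weibull family is just one candidate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Illustrative sojourn times (years spent in one condition state) -- synthetic.
sojourn_times = rng.weibull(1.8, size=200) * 6.0

# scipy's fit() performs maximum likelihood estimation of the parameters.
shape, loc, scale = stats.weibull_min.fit(sojourn_times, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted density.
ks_stat, p_value = stats.kstest(sojourn_times, "weibull_min",
                                args=(shape, 0, scale))
print(f"shape={shape:.2f}, scale={scale:.2f}, KS p-value={p_value:.3f}")
```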

Keywords: bridge management system, bridge decks, deterioration model, Semi-Markov chain, sojourn times, maximum likelihood estimation

Procedia PDF Downloads 181
3666 A Decision Support System to Detect the Lumbar Disc Disease on the Basis of Clinical MRI

Authors: Yavuz Unal, Kemal Polat, H. Erdinc Kocer

Abstract:

In this study, a decision support system comprising three stages is proposed to detect disc abnormalities of the lumbar region. In the first stage, feature extraction, T2-weighted sagittal and axial Magnetic Resonance Images (MRI) were taken from 55 people, and 27 appearance and shape features were acquired from both the sagittal and transverse images. In the second stage, the feature weighting process, the k-means clustering based feature weighting (KMCBFW) proposed by Gunes et al. was used. Finally, in the third stage, the classification process, classifier algorithms including the multi-layer perceptron (MLP) neural network, support vector machine (SVM), Naïve Bayes, and decision tree were used to classify whether the subject has lumbar disc disease or not. In order to test the performance of the proposed method, the classification accuracy (%), sensitivity, specificity, precision, recall, f-measure, kappa value, and computation times were used. The best hybrid model is the combination of k-means clustering based feature weighting and the decision tree in detecting lumbar disc disease based on both sagittal and axial MR images.
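
The abstract does not give the weighting formula, so the following is only a hedged sketch of one common k-means-based feature weighting idea (scaling each feature by the ratio of its overall mean to the mean of its cluster centres), followed by the decision-tree classifier; the exact KMCBFW of Gunes et al. may differ, and the data are random stand-ins for the 27 MRI features.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def kmeans_feature_weighting(X, n_clusters=2):
    """One simple k-means-based weighting variant; the original
    KMCBFW formulation may differ in detail."""
    Xw = X.astype(float).copy()
    for j in range(X.shape[1]):
        col = X[:, j].reshape(-1, 1)
        centers = KMeans(n_clusters=n_clusters, n_init=10,
                         random_state=0).fit(col).cluster_centers_
        Xw[:, j] = col.ravel() * (col.mean() / centers.mean())
    return Xw

# Illustrative stand-ins for the 55 subjects and 27 appearance/shape features.
rng = np.random.default_rng(0)
X = rng.normal(size=(55, 27))
y = rng.integers(0, 2, size=55)
scores = cross_val_score(DecisionTreeClassifier(random_state=0),
                         kmeans_feature_weighting(X), y, cv=5)
print("accuracy: %.2f" % scores.mean())
```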

Keywords: lumbar disc abnormality, lumbar MRI, lumbar spine, hybrid models, hybrid features, k-means clustering based feature weighting

Procedia PDF Downloads 499
3665 Starlink Satellite Collision Probability Simulation Based on Simplified Geometry Model

Authors: Toby Li, Julian Zhu

Abstract:

In this paper, a model based on a simplified geometry is introduced to give a very conservative collision probability prediction for a Starlink satellite in its most densely clustered region. Under the model in this paper, the probability of collision for a Starlink satellite in the most densely clustered region is found to be 8.484 × 10⁻⁴. The predicted collision probability is found to increase nonlinearly with the safety distance set. This simple model provides evidence that the continuous development of maneuver avoidance systems is necessary for the future orbital safety of satellites in the increasingly harsh low Earth orbit environment.
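
The paper's geometry model is not spelled out in the abstract, so the following is only a generic hedged sketch of the flavour of calculation involved: treating neighbouring satellites as uniformly spread in a thin shell and asking how often one lies within a safety distance of a reference satellite. Every number and the shell geometry are assumptions, and the result is a snapshot probability, not the paper's figure.

```python
import numpy as np

def collision_probability(n_sats=1584, shell_radius=6921e3,
                          shell_thickness=10e3, safety_distance=1e3):
    """Chance that at least one of the other satellites, placed uniformly
    in a thin spherical shell, intrudes within safety_distance of a
    reference satellite. Purely illustrative geometry."""
    shell_volume = 4 * np.pi * shell_radius**2 * shell_thickness
    danger_volume = (4 / 3) * np.pi * safety_distance**3
    p_single = danger_volume / shell_volume        # one satellite intruding
    return 1 - (1 - p_single) ** (n_sats - 1)      # any of the others

for d in (0.5e3, 1e3, 2e3):   # probability grows nonlinearly with distance
    print(f"safety distance {d/1e3:.1f} km: {collision_probability(safety_distance=d):.3e}")
```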

Keywords: Starlink, collision probability, debris, geometry model

Procedia PDF Downloads 52
3664 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied. One such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking the shift and the probability of that shift simultaneously (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that are to be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to the critical values. The table of critical values was constructed by simulation. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are further strengths of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. So far, the extension to the two-dimensional case has been completed, allowing up to five parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
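
A hedged sketch of one natural reading of the core measure described (the integrated absolute difference between two normal densities); the paper's transformation to cumulative distribution functions and its simulated critical values are not reproduced, and the parameters below are placeholders.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def density_difference(mu1, sd1, mu2, sd2):
    """Integral of |f1(x) - f2(x)| dx over the (effective) domain:
    0 for identical densities, 2 for fully disjoint ones."""
    f = lambda x: abs(stats.norm.pdf(x, mu1, sd1) - stats.norm.pdf(x, mu2, sd2))
    lo = min(mu1 - 8 * sd1, mu2 - 8 * sd2)
    hi = max(mu1 + 8 * sd1, mu2 + 8 * sd2)
    return quad(f, lo, hi)[0]

# Compare two parameter sets; the statistic would then be checked against
# critical values obtained by simulation, as the paper describes.
print(density_difference(0.0, 1.0, 0.3, 1.2))
```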

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 148
3663 Some Inequalities Related to Starlike Log-Harmonic Mappings

Authors: Melike Aydoğan, Dürdane Öztürk

Abstract:

Let H(D) be the linear space of all analytic functions defined on the open unit disc. A log-harmonic mapping is a solution of the nonlinear elliptic partial differential equation (a standard form is given below), where w(z) ∈ H(D) is the second dilatation such that |w(z)| < 1 for all z ∈ D. The aim of this paper is to establish some inequalities for starlike log-harmonic functions of order α (0 ≤ α ≤ 1).
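
The equation itself is omitted in the abstract; a minimal sketch of the standard defining relation from the log-harmonic literature (the paper's exact form may differ):

```latex
\[
  \frac{\overline{f_{\bar{z}}}}{\overline{f}}
  \;=\; w(z)\,\frac{f_{z}}{f},
  \qquad w \in H(D),\quad |w(z)| < 1 .
\]
```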

Keywords: starlike log-harmonic functions, univalent functions, distortion theorem

Procedia PDF Downloads 499
3662 RAFU Functions in Robotics and Automation

Authors: Alicia C. Sanchez

Abstract:

This paper investigates the implementation of RAFU functions (radical functions) in robotics and automation. Specifically, the main goal is to show how these functions may be useful in lane-keeping control and the lateral control of autonomous machines, vehicles, robots, or the like. From the knowledge of several points of a certain route, the RAFU functions are used to achieve the lateral control purpose and keep the lane-keeping errors within fixed limits. The stability that these functions provide, their ease of approximating any continuous trajectory, and the control of the approximation error may be useful in practice.

Keywords: automatic navigation control, lateral control, lane-keeping control, RAFU approximation

Procedia PDF Downloads 259
3661 An Approaching Index to Evaluate a Forward Collision Probability

Authors: Yuan-Lin Chen

Abstract:

This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. The time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. The Mamdani fuzzy inference algorithm is presented, combining TFCPI and HFCPI to calculate the approaching collision probability index of the vehicle. The AFCPI is easy to understand even for a driver who has no professional knowledge of the vehicle field. At the same time, the driver's behavior is taken into account so that the index suits each individual driver. For the approaching index, the value 0 indicates a 0% probability of forward collision, and the values 0.5 and 1 indicate 50% and 100% probabilities of forward collision, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents in highway driving.
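
A hedged sketch of the two ingredients and a simple blend: the membership functions and Mamdani rule base are not given in the abstract, so a crisp linear surrogate stands in for the fuzzy inference, and all thresholds are assumptions.

```python
def ttc(gap_m, closing_speed_mps):
    """Time to collision: gap / closing speed (inf if the gap is opening)."""
    return gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")

def th(gap_m, own_speed_mps):
    """Time headway: gap / own speed."""
    return gap_m / own_speed_mps

def risk_index(t, critical, safe):
    """Map a time measure to [0, 1]: 1 at/below critical, 0 at/above safe."""
    return min(1.0, max(0.0, (safe - t) / (safe - critical)))

# Illustrative thresholds; the paper derives the blend from Mamdani fuzzy rules.
tfcpi = risk_index(ttc(30.0, 8.0), critical=1.5, safe=6.0)
hfcpi = risk_index(th(30.0, 25.0), critical=0.5, safe=2.0)
afcpi = 0.5 * tfcpi + 0.5 * hfcpi   # crisp surrogate for the fuzzy aggregation
print(f"TFCPI={tfcpi:.2f}, HFCPI={hfcpi:.2f}, AFCPI={afcpi:.2f}")
```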

Keywords: approaching index, forward collision probability, time to collision, time headway

Procedia PDF Downloads 263
3660 VaR or TCE: Explaining the Preferences of Regulators

Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski

Abstract:

While a lot of research concentrates on the merits of VaR and TCE, which are the two most classic risk indicators used by financial institutions, little has been written on explaining why regulators favor the choice of VaR or TCE in their set of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. Then, we introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
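
A minimal sketch computing VaR and TCE for a generalized Pareto tail with scipy, since that is the distribution the paper uses for illustration; the shape and scale values are placeholders, and the paper's equivalence rules and new weighted indicator are not reproduced.

```python
from scipy import stats

xi, scale = 0.2, 1.0                    # illustrative GPD shape and scale
gpd = stats.genpareto(c=xi, scale=scale)

def var(alpha):
    """Value at Risk: the alpha-quantile of the loss distribution."""
    return gpd.ppf(alpha)

def tce(alpha, n=200_000):
    """Tail conditional expectation E[X | X > VaR_alpha], by simulation."""
    x = gpd.rvs(size=n, random_state=0)
    q = var(alpha)
    return x[x > q].mean()

print(f"VaR_99% = {var(0.99):.3f},  TCE_99% = {tce(0.99):.3f}")
```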

Keywords: generalized Pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure

Procedia PDF Downloads 138
3659 Subclasses of Bi-Univalent Functions Associated with Hohlov Operator

Authors: Rashidah Omar, Suzeini Abdul Halim, Aini Janteng

Abstract:

The coefficient estimation problem for the Taylor-Maclaurin series is still an open problem, especially for functions in the subclasses of bi-univalent functions. A function f ∈ A is said to be bi-univalent in the open unit disk D if both f and f⁻¹ are univalent in D. The symbol A denotes the class of all analytic functions f in D, normalized by the conditions f(0) = f′(0) − 1 = 0. The class of bi-univalent functions is denoted by Σ. The subordination concept is used in determining the second and third Taylor-Maclaurin coefficients. The upper bounds for the second and third coefficients are estimated for functions in the subclasses of bi-univalent functions which are subordinate to the function φ. An analytic function f is subordinate to an analytic function g if there is an analytic function w defined on D with w(0) = 0 and |w(z)| < 1 satisfying f(z) = g[w(z)]. In this paper, two subclasses of bi-univalent functions associated with the Hohlov operator are introduced. The bounds for the second and third coefficients of functions in these subclasses are determined using subordination. The findings generalize previous related works of several earlier authors.

Keywords: analytic functions, bi-univalent functions, Hohlov operator, subordination

Procedia PDF Downloads 269
3658 Analysis of Rural Roads in Developing Countries Using Principal Component Analysis and Simple Average Technique in the Development of a Road Safety Performance Index

Authors: Muhammad Tufail, Jawad Hussain, Hammad Hussain, Imran Hafeez, Naveed Ahmad

Abstract:

A road safety performance index is a composite index which combines various indicators of road safety into a single number. Development of a road safety performance index using appropriate safety performance indicators is essential to enhance road safety. However, road safety performance indices in developing countries have not been given as much priority as needed. The primary objective of this research is to develop a general Road Safety Performance Index (RSPI) for developing countries based on road facilities as well as the behavior of road users. The secondary objectives include finding the critical inputs to the RSPI and finding the better method of constructing the index. In this study, the RSPI is developed by selecting four main safety performance indicators: protective systems (seat belt, helmet, etc.), road features (road width, signalized intersections, number of lanes, speed limit), number of pedestrians, and number of vehicles. Data on these four safety performance indicators were collected using an observational survey on a 20 km section of the National Highway N-125 near Taxila, Pakistan. For the development of this composite index, two methods are used: a) Principal Component Analysis (PCA) and b) the Equal Weighting (EW) method. PCA is used for extraction, weighting, and linear aggregation of indicators to obtain a single value. An individual index score was calculated for each road section by multiplying the weights and standardized values of each safety performance indicator. The Simple Average technique, in turn, was used for equal weighting and linear aggregation of the indicators. The road sections are ranked according to their RSPI scores using both methods. The two weighting methods are compared, and the PCA method is found to be much more reliable than the Simple Average technique.
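
A hedged sketch of the two aggregation routes compared in the paper: PCA-derived weights versus equal weights over standardized indicators. The indicator data below are synthetic placeholders, and weighting by the first component's loadings is one common convention rather than necessarily the paper's exact scheme.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows = road sections, columns = the four indicators (protective system,
# road features, pedestrians, vehicles) -- synthetic stand-ins.
X = rng.normal(size=(40, 4))
Z = StandardScaler().fit_transform(X)

pca = PCA().fit(Z)
# One common choice: weight indicators by the first component's loadings.
w_pca = np.abs(pca.components_[0])
w_pca /= w_pca.sum()

rspi_pca = Z @ w_pca            # PCA-weighted index per section
rspi_equal = Z.mean(axis=1)     # equal-weight (simple average) index

print("ranking (PCA):  ", np.argsort(-rspi_pca)[:5])
print("ranking (equal):", np.argsort(-rspi_equal)[:5])
```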

Keywords: indicators, aggregation, principal component analysis, weighting, index score

Procedia PDF Downloads 125
3657 Determination of the Best Fit Probability Distribution for Annual Rainfall in the Karkheh River Basin in Iran

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best-fit probability distribution of annual rainfall based on a 50-year sample (1966-2015) in the Karkheh river basin in Iran using six probability distributions: Normal, 2-Parameter Log Normal, 3-Parameter Log Normal, Pearson Type 3, Log Pearson Type 3, and Gumbel. The best-fit probability distribution was selected using the Stormwater Management and Design Aid (SMADA) software, based on the Residual Sum of Squares (R.S.S) between observed and estimated values. Based on the R.S.S values of the fit tests, the Log Pearson Type 3 and then the Pearson Type 3 distributions were found to be the best-fit probability distributions at the Jelogir Majin and Pole Zal rainfall gauging stations. The annual values of expected rainfall were calculated using the best-fit probability distributions and can be used by hydrologists and design engineers in future research in the studied region and other regions of the world.
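
A hedged sketch of the RSS-based selection step using scipy in place of SMADA: each candidate family is fitted by maximum likelihood and scored by the residual sum of squares between the sorted sample and the fitted quantiles. The rainfall series is a synthetic placeholder, not the Karkheh data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rainfall = rng.gamma(4.0, 90.0, size=50)      # synthetic 50-year annual series

candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
              "pearson3": stats.pearson3, "gumbel": stats.gumbel_r}

x = np.sort(rainfall)
p = (np.arange(1, 51) - 0.5) / 50             # plotting positions

for name, dist in candidates.items():
    params = dist.fit(rainfall)               # maximum likelihood fit
    rss = np.sum((x - dist.ppf(p, *params)) ** 2)
    print(f"{name:10s} RSS = {rss:,.0f}")     # smallest RSS = best fit
```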

Keywords: Log Pearson Type 3, SMADA, rainfall, Karkheh River

Procedia PDF Downloads 170
3656 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis

Authors: Komeil Valipourian

Abstract:

Urban advances and the growing need for developing infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of Bangkok's Metro as a case study. For this, a numerical probability model has been developed based on the Finite Difference Method and a Monte Carlo sampling approach. The results indicate that disregarding the issue of probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one of the applications of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, which remains in the allowable range and helps improve economic conditions while maintaining mechanical efficiency. With regard to the lack of efficient design in most deep excavations, an attempt was made, by considering geometrical and geotechnical variability, to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out the full probability analysis.

Keywords: numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method (FDM)

Procedia PDF Downloads 100
3655 Sufficient Conditions for Exponential Stability of Stochastic Differential Equations with Nontrivial Solutions

Authors: Fakhreddin Abedi, Wah June Leong

Abstract:

Exponential stability of stochastic differential equations with nontrivial solutions is provided in terms of Lyapunov functions. The main result of this paper establishes that, under certain hypotheses on the dynamics f(.) and g(.), practical exponential stability in probability in a small neighborhood of the origin is equivalent to the existence of an appropriate Lyapunov function. Indeed, we establish exponential stability of a stochastic differential equation when almost all the state trajectories are bounded and approach a sufficiently small neighborhood of the origin. We derive sufficient conditions for exponential stability of stochastic differential equations. Finally, we give a numerical example illustrating our results.
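
The abstract does not state the hypotheses; a minimal sketch of one standard Lyapunov-type condition of this kind (an illustrative form, since the paper's exact assumptions may differ):

```latex
% For the SDE dx = f(x)\,dt + g(x)\,dW_t with infinitesimal generator
%   \mathcal{L}V = V_x f + \tfrac{1}{2}\operatorname{tr}\!\big(g^{\top} V_{xx}\, g\big),
% suppose there exist constants c_1, c_2, c_3 > 0, r \ge 0, p > 0 such that
\[
  c_1\,|x|^{p} \;\le\; V(x) \;\le\; c_2\,|x|^{p},
  \qquad
  \mathcal{L}V(x) \;\le\; -c_3\,V(x) + r .
\]
% Then the trajectories converge exponentially in probability to a
% neighborhood of the origin whose size is controlled by r/c_3.
```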

Keywords: exponential stability in probability, stochastic differential equations, Lyapunov technique, Ito's formula

Procedia PDF Downloads 28
3654 Geometric Properties of Some q-Bessel Functions

Authors: İbrahim Aktaş, Árpád Baricz

Abstract:

In this paper, the radii of starlikeness of the Jackson and Hahn-Exton q-Bessel functions are considered, and for each of them three different normalizations are applied. By applying Euler-Rayleigh inequalities for the first positive zeros of these functions, tight lower and upper bounds for the radii of starlikeness of these functions are obtained. The Laguerre-Pólya class of real entire functions plays an important role in this study. In particular, we obtain some new bounds for the first positive zero of the derivative of the classical Bessel function of the first kind.

Keywords: Bessel function, Lommel function, radius of starlikeness and convexity, Struve function

Procedia PDF Downloads 251
3653 Robust Noisy Speech Identification Using Frame Classifier Derived Features

Authors: Punnoose A. K.

Abstract:

This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probability of the clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent score is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to distinguish the noisy speech recordings from the clean ones. The approach is benchmarked on standard databases, with a focus on precision.
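
A hedged sketch of the scoring idea: fit densities to the MLP softmax maxima for clean and noisy frames, score new frames by the log density ratio, weight per phoneme, and threshold. The beta family, the per-phoneme weights, and the synthetic frame values are all placeholder assumptions; the paper leaves the density family and weights unspecified here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder softmax-maximum values per frame (clean tends toward 1.0).
clean = rng.beta(8, 2, size=5000)
noisy = rng.beta(3, 3, size=5000)

# Fit beta densities to each condition (an assumed choice of family).
pc = stats.beta(*stats.beta.fit(clean, floc=0, fscale=1)[:2])
pn = stats.beta(*stats.beta.fit(noisy, floc=0, fscale=1)[:2])

def frame_scores(softmax_max, phoneme_ids, phoneme_weights):
    """Log ratio of noisy to clean density, weighted per phoneme."""
    ratio = (np.log(pn.pdf(softmax_max) + 1e-12)
             - np.log(pc.pdf(softmax_max) + 1e-12))
    return ratio * phoneme_weights[phoneme_ids]

weights = np.ones(40)                      # illustrative per-phoneme weights
frames = rng.beta(3, 3, size=100)          # a "noisy" recording
score = frame_scores(frames, rng.integers(0, 40, 100), weights).mean()
print("flag as noisy:", score > 0.0)       # simple threshold at 0
```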

Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering

Procedia PDF Downloads 101
3652 Effect of Correlation of Random Variables on Structural Reliability Index

Authors: Agnieszka Dudzik

Abstract:

The problem of correlation between random variables in structural reliability analysis has been extensively discussed in the literature. The cases taken under consideration were usually related to correlation between random variables from one side of the ultimate limit state: correlation between particular loads applied to a structure, or correlation between the resistances of particular members of a structure as a system. It has been proved that positive correlation between these random variables reduces the reliability of a structure and increases the probability of failure. In this paper, the problem of correlation between random variables from both sides of the limit state equation is taken under consideration. The simplest case, in which these random variables are normally distributed, is concerned. The degree of correlation is described by the covariance or the correlation coefficient. Special attention is paid to the questions of how much the correlation changes the reliability level and whether it can be ignored. The reliability analysis uses well-known methods for the assessment of the failure probability: the Hasofer-Lind reliability index and the Monte Carlo method adapted to the problem of correlation. The main purpose of this work is to present how the correlation of random variables influences the reliability index of steel bar structures. Structural design parameters are defined as deterministic values and random variables; the latter are correlated. The criterion of structural failure is expressed by limit functions related to the ultimate and serviceability limit states. Only the normal distribution is used in the description of the random variables. The sensitivity of the reliability index to the random variables is defined. If the sensitivity of the reliability index to a random variable X is low compared with the other variables, the impact of this variable on the failure probability is small; therefore, in successive computations, it can be treated as a deterministic parameter. Sensitivity analysis leads to a simplified description of the mathematical model and determines the new limit functions and values of the Hasofer-Lind reliability index. In the examples, the NUMPRESS software is used in the reliability analysis.
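
A hedged sketch of the Monte Carlo side of such a study: the failure probability of a linear limit state g(R, S) = R − S with correlated normal resistance R and load S, and the corresponding reliability index recovered from P_f. All means, standard deviations, and correlation values are illustrative.

```python
import numpy as np
from scipy import stats

def failure_probability(mu_r=350.0, sd_r=25.0, mu_s=250.0, sd_s=30.0,
                        rho=0.0, n=1_000_000, seed=0):
    """Monte Carlo P(g < 0) for g = R - S with correlated normal R, S."""
    rng = np.random.default_rng(seed)
    cov = [[sd_r**2, rho * sd_r * sd_s],
           [rho * sd_r * sd_s, sd_s**2]]
    r, s = rng.multivariate_normal([mu_r, mu_s], cov, size=n).T
    return np.mean(r - s < 0.0)

for rho in (0.0, 0.3, 0.6):
    pf = failure_probability(rho=rho)
    beta = -stats.norm.ppf(pf)   # reliability index implied by P_f
    print(f"rho={rho:+.1f}: P_f={pf:.2e}, beta={beta:.2f}")
```

For this linear, fully normal case the Hasofer-Lind index has the closed form β = (μ_R − μ_S) / √(σ_R² + σ_S² − 2ρσ_Rσ_S), which the simulation should reproduce.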

Keywords: correlation of random variables, reliability index, sensitivity of reliability index, steel structure

Procedia PDF Downloads 209
3651 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. The ruin probability is obtained by simulation based on the 2019 Social Security period life table, IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning algorithms, namely the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network, are built. Model validation and selection are based on test errors using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
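
A hedged sketch of the model-selection loop on a synthetic stand-in for the simulated ruin-probability table: fit several regressors, compare test errors, and keep the smallest. The feature set, the synthetic target, and the three model choices are placeholders for the five algorithms the paper compares.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Placeholder features: withdrawal age, gender, equity share, inflation, rate.
X = rng.uniform(size=(2000, 5))
y = 1 / (1 + np.exp(-(3 * X[:, 4] - X[:, 2] - 1)))   # synthetic ruin probability

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
models = {"random forest": RandomForestRegressor(random_state=0),
          "SVM": SVR(),
          "neural net": MLPRegressor(max_iter=2000, random_state=0)}
for name, m in models.items():
    err = mean_squared_error(yte, m.fit(Xtr, ytr).predict(Xte))
    print(f"{name:13s} test MSE = {err:.5f}")   # recommend the smallest
```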

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 44
3650 Seismic Loss Assessment for Peruvian University Buildings with Simulated Fragility Functions

Authors: Jose Ruiz, Jose Velasquez, Holger Lovon

Abstract:

Peruvian university buildings are critical structures about whose seismic vulnerability very little research is available. This paper develops a probabilistic methodology that predicts seismic loss for university buildings with simulated fragility functions. Two university buildings located in the city of Cusco were analyzed. Fragility functions were developed considering uncertainty in both seismic and structural parameters. The fragility functions were generated with the Latin hypercube technique, an improved Monte Carlo-based method, which optimizes the sampling of structural parameters and provides at least 100 reliable samples for every level of seismic demand. Concrete compressive strength, maximum concrete strain, and yield stress of the reinforcing steel were considered the key structural parameters. The seismic demand is defined by synthetic records compatible with the elastic Peruvian design spectrum. Acceleration records are scaled based on the peak ground acceleration on rigid soil (PGA), which goes from 0.05g to 1.00g. A total of 2000 structural models were considered to account for both structural and seismic variability. These functions represent the overall building behavior because they give rational information regarding damage ratios for defined levels of seismic demand. The university buildings show an expected Mean Damage Factor of 8.80% and 19.05%, respectively, for the 0.22g-PGA scenario, which was amplified by the soil type coefficient and resulted in 0.26g-PGA. These ratios were computed considering a seismic demand related to 10% probability of exceedance in 50 years, which is a requirement of the Peruvian seismic code. These results show an acceptable seismic performance for both buildings.
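
A hedged sketch of the sampling step with scipy's Latin hypercube engine: 100 stratified samples of the three key structural parameters per demand level. The parameter ranges are illustrative assumptions, and the nonlinear structural analysis itself is abstracted into a stub function.

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=100)                 # 100 stratified samples in [0,1)^3

# Illustrative ranges: f'c (MPa), max concrete strain, fy (MPa).
lower = np.array([17.0, 0.003, 390.0])
upper = np.array([28.0, 0.006, 500.0])
params = qmc.scale(u, lower, upper)

def damage_ratio(fc, eps_u, fy, pga):
    """Stub standing in for the nonlinear analysis; returns a ratio in [0,1]."""
    capacity = 0.04 * fc + 800.0 * eps_u + 0.004 * fy
    return float(np.clip(pga / capacity, 0.0, 1.0))

for pga in np.arange(0.05, 1.05, 0.25):   # scaled PGA levels (g)
    mdf = np.mean([damage_ratio(*p, pga) for p in params])
    print(f"PGA={pga:.2f}g  mean damage factor={mdf:.2%}")
```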

Keywords: fragility functions, university buildings, loss assessment, Monte Carlo simulation, Latin hypercube

Procedia PDF Downloads 115
3649 Approximation of Analytic Functions of Several Variables by Linear K-Positive Operators in the Closed Domain

Authors: Tulin Coskun

Abstract:

We investigate the approximation of analytic functions of several variables on the polydisc by sequences of linear k-positive operators in the Gadjiev sense. The approximation of analytic functions of a complex variable by linear k-positive operators was tackled in earlier work, where k-positive operators were introduced and Korovkin-type theorems were formulated for these operators in the space of analytic functions on the unit disc. Recently, very general results were proved on the convergence of sequences of linear k-positive operators on a simply connected bounded domain within the space of analytic functions. In this presentation, we extend some of these results to the approximation of analytic functions of several complex variables by sequences of linear k-positive operators.

Keywords: analytic functions, approximation of analytic functions, Linear k-positive operators, Korovkin type theorems

Procedia PDF Downloads 318
3648 The Probability Foundation of Fundamental Theoretical Physics

Authors: Quznetsov Gunn

Abstract:

In the study of the logical foundations of probability theory, it was found that the terms and equations of fundamental theoretical physics represent terms and theorems of classical probability theory; more precisely, of that part of this theory which considers the probability of dot events in 3+1 space-time. In particular, the masses, moments, energies, spins, etc. turn out to be parameters of the probability distributions of such events. The terms and equations of the electroweak and quark-gluon theories turn out to be theoretical-probabilistic terms and theorems. Here the relation of a neutrino to its lepton becomes clear, the W and Z boson masses turn out to be dynamic, and the cause of the asymmetry between particles and antiparticles is the impossibility of the birth of single antiparticles. In addition, phenomena such as confinement and asymptotic freedom receive a probabilistic explanation. And here we have the logical foundations of the gravity theory with the phenomena of dark energy and dark matter.

Keywords: classical theory of probability, logical foundation of fundamental theoretical physics, masses, moments, energies, spins

Procedia PDF Downloads 275
3647 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. Author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger's 'Particle in the Box' theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived following from the theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that they did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question (1926-27), he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe' but are actually using logic opposed to the views of Newton and Hooke and of many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 173
3646 A Practical and Efficient Evaluation Function for 3D Model Based Vehicle Matching

Authors: Yuan Zheng

Abstract:

3D model-based vehicle matching provides a new way for vehicle recognition, localization, and tracking. Its key is to construct an evaluation function, also called a fitness function, to measure the degree of vehicle matching. Existing fitness functions often perform poorly when clutter and occlusion exist in traffic scenarios. In this paper, we present a practical and efficient fitness function. Unlike the existing evaluation functions, the proposed fitness function studies the vehicle matching problem from both local and global perspectives, exploiting the pixel gradient information as well as the silhouette information. In view of the discrepancy between the 3D vehicle model and the real vehicle, a weighting strategy is introduced to treat the fitting of the model's wireframes differently. Additionally, a normalization operation for the model's projection is performed to improve the accuracy of the matching. Experimental results on real traffic videos reveal that the proposed fitness function is efficient and robust to cluttered background and partial occlusion.
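
A hedged sketch of the general structure of such a function: a local term sampling gradient magnitude along projected wireframe points (with per-wireframe weights) plus a global silhouette-overlap term. The 50/50 blend, the IoU choice, and all arrays below are placeholder assumptions standing in for real image data and the paper's exact formulation.

```python
import numpy as np

def fitness(grad_mag, model_mask, image_mask, wire_pts, wire_weights):
    """Local gradient evidence along projected wireframe pixels plus a
    global silhouette intersection-over-union (placeholder formulation)."""
    g = grad_mag[wire_pts[:, 1], wire_pts[:, 0]]          # sample at (x, y)
    local = np.sum(wire_weights * g) / (np.sum(wire_weights) * grad_mag.max())
    inter = np.logical_and(model_mask, image_mask).sum()
    union = np.logical_or(model_mask, image_mask).sum()
    return 0.5 * local + 0.5 * (inter / union)

# Toy data standing in for a real frame and a projected 3D model.
rng = np.random.default_rng(0)
grad = rng.random((240, 320))
mask_model = np.zeros((240, 320), bool); mask_model[80:160, 100:220] = True
mask_image = np.zeros((240, 320), bool); mask_image[85:165, 110:230] = True
pts = np.column_stack([rng.integers(100, 220, 50), rng.integers(80, 160, 50)])
print(f"fitness = {fitness(grad, mask_model, mask_image, pts, np.ones(50)):.3f}")
```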

Keywords: 3D-2D matching, fitness function, 3D vehicle model, local image gradient, silhouette information

Procedia PDF Downloads 367
3645 A Probability Analysis of Construction Project Schedule Using Risk Management Tool

Authors: A. L. Agarwal, D. A. Mahajan

Abstract:

The construction industry tumbled along with other sectors during the recent economic crash. The construction business could not recover thereafter and is still passing through a slowdown phase; as a result, many real estate as well as infrastructure projects have not been completed on schedule and within budget. There are many theories, tools, and techniques, along with software packages, available in the market to analyze construction schedules. This study focuses on the construction project schedule and the uncertainties associated with construction activities. An infrastructure construction project has been considered for the analysis of the uncertainty in project activities affecting project duration, and the analysis is done using the @RISK software. Different simulation results arising from three probability distribution functions are compiled to help construction project managers plan a more realistic schedule of construction activities and of project completion, document it in the contract, and avoid compensations or claims arising from missing the planned schedule.
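
A hedged sketch of the style of experiment described, using numpy in place of @RISK: the same activity chain simulated under three candidate duration distributions. The tiny three-activity network and its (optimistic, most likely, pessimistic) parameters are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20_000
# Three sequential activities with (optimistic, most likely, pessimistic) days.
acts = [(10, 14, 25), (20, 30, 55), (15, 18, 30)]

def simulate(sampler):
    """Total project duration under one candidate distribution family."""
    return sum(sampler(a, m, b, N) for a, m, b in acts)

samplers = {
    "triangular": lambda a, m, b, n: rng.triangular(a, m, b, n),
    "uniform":    lambda a, m, b, n: rng.uniform(a, b, n),
    "normal":     lambda a, m, b, n: rng.normal(m, (b - a) / 6, n),
}
for name, s in samplers.items():
    total = simulate(s)
    print(f"{name:10s} P50={np.percentile(total, 50):5.1f}d  "
          f"P90={np.percentile(total, 90):5.1f}d")
```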

Keywords: construction project, distributions, project schedule, uncertainty

Procedia PDF Downloads 312
3644 Unconventional Calculus Spreadsheet Functions

Authors: Chahid K. Ghaddar

Abstract:

The spreadsheet engine is exploited via a non-conventional mechanism to enable novel worksheet solver functions for computational calculus. The solver functions bypass inherent restrictions on built-in math and user-defined functions by taking variable formulas as a new type of argument while retaining purity and recursion properties. The enabling mechanism permits the integration of numerical algorithms into worksheet functions for solving virtually any computational problem that can be modelled by formulas and variables. Several examples are presented for computing integrals, derivatives, and systems of differential-algebraic equations. Incorporating the worksheet solver functions into the ubiquitous spreadsheet extends the utility of the latter as a powerful tool for computational mathematics.
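
The spreadsheet mechanism itself cannot be shown outside a worksheet, so the following is only a rough Python analogue of the core idea: solver functions whose arguments are formulas rather than values, evaluated repeatedly while the caller remains a pure function. The function names are invented for illustration.

```python
from scipy.integrate import quad

def integ(formula, lo, hi):
    """Analogue of a worksheet INTEG(formula, lo, hi) solver function:
    the 'formula' argument is passed in as a callable, not a value."""
    return quad(formula, lo, hi)[0]

def deriv(formula, x, h=1e-6):
    """Analogue of a worksheet DERIV solver via central differences."""
    return (formula(x + h) - formula(x - h)) / (2 * h)

print(integ(lambda x: x ** 2, 0.0, 1.0))   # 1/3
print(deriv(lambda x: x ** 2, 2.0))        # ~4
```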

Keywords: calculus, differential algebraic equations, solvers, spreadsheet

Procedia PDF Downloads 321
3643 Conflation Methodology Applied to Flood Recovery

Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong

Abstract:

Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as those in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
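
A hedged sketch of the conflation operation the abstract defines (the normalized product of the individual probability density functions) for two exponential recovery-time densities; the rate values are placeholders.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

lam_severe, lam_nuisance = 1 / 40.0, 1 / 3.0   # illustrative mean recovery (days)
f1 = stats.expon(scale=1 / lam_severe).pdf
f2 = stats.expon(scale=1 / lam_nuisance).pdf

# Conflation: the normalized product of the individual densities.
norm_const = quad(lambda x: f1(x) * f2(x), 0, np.inf)[0]
conflated = lambda x: f1(x) * f2(x) / norm_const

# For exponentials the normalized product is again exponential with
# rate lam_severe + lam_nuisance, which the numerical mean should confirm.
mean = quad(lambda x: x * conflated(x), 0, np.inf)[0]
print(f"conflated mean recovery = {mean:.2f} days "
      f"(check: {1 / (lam_severe + lam_nuisance):.2f})")
```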

Keywords: community resilience, conflation, flood risk, nuisance flooding

Procedia PDF Downloads 67
3642 Presenting a Model in the Analysis of Supply Chain Management Components by Using Statistical Distribution Functions

Authors: Ramin Rostamkhani, Thurasamy Ramayah

Abstract:

One of the most important topics for today's industrial organizations is the challenging issue of supply chain management. In this field, scientists and researchers have published numerous practical articles and models, especially in the last decade. In this research, the modeling of supply chain management component data using well-known statistical distribution functions is considered. Showing the behavior of supply chain data based on the characteristics of statistical distribution functions is, to the best of our knowledge, innovative research that has not been published before. In an analytical process, describing different aspects of the functions, including the probability density, cumulative distribution, reliability, and failure functions, can identify the suitable statistical distribution function for each component of supply chain management, which can then be applied to predict the future behavior of the relevant component. Providing a model that fits the best statistical distribution function to each supply chain management component would be a major advance in describing the behavior of supply chain management elements in today's industrial organizations. The final step demonstrates the results of the proposed model by introducing the process capability indices before and after implementing it, and verifies the approach through the relevant assessment. The introduced approach can save the time and cost required to achieve organizational goals and can increase added value in the organization.
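
A hedged sketch of the four function views named in the abstract (probability density, cumulative distribution, reliability, and failure/hazard) for one fitted distribution; the Weibull choice and the synthetic data are placeholders for a real supply-chain component series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lead_times = rng.weibull(2.0, 500) * 10.0      # synthetic component data (days)

shape, loc, scale = stats.weibull_min.fit(lead_times, floc=0)
dist = stats.weibull_min(shape, loc, scale)

t = 8.0                                        # evaluate the four views at t
pdf = dist.pdf(t)                              # probability density f(t)
cdf = dist.cdf(t)                              # cumulative distribution F(t)
reliability = dist.sf(t)                       # R(t) = 1 - F(t)
hazard = pdf / reliability                     # failure (hazard) function h(t)
print(f"f={pdf:.4f} F={cdf:.4f} R={reliability:.4f} h={hazard:.4f}")
```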

Keywords: analyzing, process capability indices, statistical distribution functions, supply chain management components

Procedia PDF Downloads 66
3641 COVID-19 Teaches Probability Risk Assessment

Authors: Sean Sloan

Abstract:

Probability Risk Assessments (PRA) can be a difficult concept for students to grasp. So, in searching for different ways to describe PRA and relate it to their lives, COVID-19 came up. The parallels are amazing. Soon students began analyzing acceptable risk with the virus. This helped them to quantify just how dangerous is dangerous. The original lesson was dismissed, and for the remainder of the period the probability of risk and the lethality of risk became the topic. Spreading events, such as a COVID carrier on an airliner, became analogous to single-fault casualties such as a tsunami. Odds of spreading became odds of backup-diesel-generator failure, as with Fukushima Daiichi. Fatalities of the disease became expected fatalities due to radiation spread. Quantification took the discussion from hyperbole and emotion to one where guidelines could be rationally based. It has been one of the most effective educational devices observed.

Keywords: COVID, education, probability, risk

Procedia PDF Downloads 134
3640 A Hazard Rate Function for the Time of Ruin

Authors: Sule Sahin, Basak Bulut Karageyik

Abstract:

This paper introduces a hazard rate function for the time of ruin in order to calculate the conditional probability of ruin over very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and the conditional expected time of ruin from the exact finite-time ruin probability with exponential claim amounts. Then we introduce the FoR, which gives the conditional probability of ruin given that ruin has not occurred by time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain the FoR under an excess of loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
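
A hedged sketch of the FoR idea: with the finite-time ruin probability ψ(t) in hand (here estimated by simulation for exponential claims rather than from the exact formula the paper uses), the hazard-rate analogue is h(t) = ψ′(t) / (1 − ψ(t)). All numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def ruin_times(u0=5.0, premium=1.2, lam=1.0, mean_claim=1.0,
               horizon=100.0, n_paths=20_000):
    """Simulated ruin times (np.inf if no ruin occurs by the horizon)."""
    out = np.full(n_paths, np.inf)
    for i in range(n_paths):
        t, u = 0.0, u0
        while True:
            dt = rng.exponential(1.0 / lam)
            if t + dt > horizon:
                break
            t += dt
            u += premium * dt - rng.exponential(mean_claim)
            if u < 0.0:
                out[i] = t
                break
    return out

T = ruin_times()
grid = np.linspace(1.0, 40.0, 40)
psi = np.array([np.mean(T <= t) for t in grid])   # finite-time ruin probability
force_of_ruin = np.gradient(psi, grid) / (1.0 - psi)   # h(t) = psi'(t)/(1-psi(t))
print(np.round(force_of_ruin[:5], 4))
```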

Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance

Procedia PDF Downloads 363