Probability bounds analysis


Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions. For instance, it computes sure bounds on the distribution of a sum, product, or more complex function, given only sure bounds on the distributions of the inputs. Such bounds are called probability boxes, or p-boxes, and constrain cumulative probability distributions.
This bounding approach permits analysts to make calculations without requiring overly precise assumptions about parameter values, dependence among variables, or even distribution shape. Probability bounds analysis is essentially a combination of the methods of standard interval analysis and classical probability theory. It gives the same answer as interval analysis does when only range information is available, and the same answer as Monte Carlo simulation does when information is abundant enough to precisely specify input distributions and their dependencies. Thus, it is a generalization of both interval analysis and probability theory.
The diverse methods comprising probability bounds analysis provide algorithms to evaluate mathematical expressions when there is uncertainty about the input values, their dependencies, or even the form of the mathematical expression itself. The calculations yield results that are guaranteed to enclose all possible distributions of the output variable if the input p-boxes were also sure to enclose their respective distributions. In some cases, a calculated p-box will also be best-possible in the sense that the bounds could be no tighter without excluding some of the possible distributions.
P-boxes are usually merely bounds on possible distributions. The bounds often also enclose distributions that are not themselves possible. For instance, the set of probability distributions that could result from adding random values from two distributions without an independence assumption is generally a proper subset of all the distributions enclosed by the p-box computed for the sum. That is, there are distributions within the output p-box that could not arise under any dependence between the two input distributions. The output p-box will, however, always contain all distributions that are possible, so long as the input p-boxes were sure to enclose their respective underlying distributions. This property often suffices for use in risk analysis and other fields requiring calculations under uncertainty.

History of bounding probability

The idea of bounding probability has a very long tradition throughout the history of probability theory. Indeed, in 1854 George Boole used the notion of interval bounds on probability in his The Laws of Thought. Also dating from the latter half of the 19th century, the inequality attributed to Chebyshev described bounds on a distribution when only the mean and variance of the variable are known, and the related inequality attributed to Markov found bounds on a positive variable when only the mean is known. Kyburg reviewed the history of interval probabilities and traced the development of the critical ideas through the 20th century, including the important notion of incomparable probabilities favored by Keynes. Of particular note is Fréchet's derivation in the 1930s of bounds on calculations involving total probabilities without dependence assumptions. Bounding probabilities has continued to the present day.

The methods of probability bounds analysis that could be routinely used in risk assessments were developed in the 1980s. Hailperin described a computational scheme for bounding logical calculations extending the ideas of Boole. Yager described the elementary procedures by which bounds on convolutions can be computed under an assumption of independence. At about the same time, Makarov, and independently, Rüschendorf solved the problem, originally posed by Kolmogorov, of how to find the upper and lower bounds for the probability distribution of a sum of random variables whose marginal distributions, but not their joint distribution, are known. Frank et al. generalized the result of Makarov and expressed it in terms of copulas. Since that time, formulas and algorithms for sums have been generalized and extended to differences, products, quotients and other binary and unary functions under various dependence assumptions.

Arithmetic expressions

Arithmetic expressions involving operations such as additions, subtractions, multiplications, divisions, minima, maxima, powers, exponentials, logarithms, square roots, absolute values, etc., are commonly used in risk analyses and uncertainty modeling. Convolution is the operation of finding the probability distribution of a sum of independent random variables specified by probability distributions. The term can be extended to finding the distributions of other mathematical functions (differences, products, quotients, etc.) under other assumptions about the intervariable dependencies. There are convenient algorithms for computing these generalized convolutions under a variety of assumptions about the dependencies among the inputs.

Mathematical details

Let $\mathbb{D}$ denote the space of distribution functions on the real numbers $\mathbb{R}$, i.e.,

$$\mathbb{D} = \{ D \mid D : \mathbb{R} \to [0,1],\ D(x) \le D(y) \text{ whenever } x < y \}.$$

A p-box is a quintuple

$$\{\bar{F}, \underline{F}, m, v, \mathbf{F}\},$$

where $\bar{F}, \underline{F} \in \mathbb{D}$ with $\underline{F}(x) \le \bar{F}(x)$ for all $x$, $m$ and $v$ are real intervals, and $\mathbf{F} \subseteq \mathbb{D}$. This quintuple denotes the set of distribution functions $F \in \mathbf{F}$ such that:

$$\underline{F}(x) \le F(x) \le \bar{F}(x) \text{ for all } x \in \mathbb{R},$$
$$\int_{-\infty}^{\infty} x \,\mathrm{d}F(x) \in m, \text{ and}$$
$$\int_{-\infty}^{\infty} x^2 \,\mathrm{d}F(x) - \left( \int_{-\infty}^{\infty} x \,\mathrm{d}F(x) \right)^2 \in v,$$

that is, $F$ lies between the two bounding edges, its mean lies in the interval $m$, and its variance lies in the interval $v$.
If a function $F$ satisfies all the conditions above it is said to be inside the p-box. In some cases, there may be no information about the moments or distribution family other than what is encoded in the two distribution functions that constitute the edges of the p-box. Then the quintuple $\{\bar{F}, \underline{F}, [-\infty, \infty], [0, \infty], \mathbb{D}\}$ representing the p-box can be denoted more compactly as $[\bar{F}, \underline{F}]$. This notation harks back to that of intervals on the real line, except that the endpoints are distributions rather than points.
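As a concrete illustration, a p-box of the compact form $[\bar{F}, \underline{F}]$ can be represented in software by discretizing each edge into quantiles. The following Python sketch shows one possible representation; the class name, the discretization levels, and the field layout are illustrative assumptions, not a standard API.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class PBox:
    """A p-box [F-bar, F-underbar] stored as n quantiles per edge.

    left[i] holds a quantile of the left (upper) CDF bound and
    right[i] a quantile of the right (lower) bound, both at the
    probability level (i + 0.5) / n -- one of several possible
    discretization conventions.
    """
    left: np.ndarray
    right: np.ndarray

    def __post_init__(self):
        if len(self.left) != len(self.right):
            raise ValueError("edges must use the same discretization")
        if not np.all(self.left <= self.right):
            raise ValueError("left quantiles may not exceed right quantiles")
```

Because the left edge is the upper bound on the cumulative distribution, its quantiles are the smaller ones at every probability level, which is what the final check enforces.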
The notation $X \sim F$ denotes the fact that $X$ is a random variable governed by the distribution function $F$, that is, $F(x) = \Pr(X \le x)$.
Let us generalize the tilde notation for use with p-boxes. We will write $X \sim B$ to mean that $X$ is a random variable whose distribution function is unknown except that it is inside $B$. Thus, $X \sim F \in B$ can be contracted to $X \sim B$ without mentioning the distribution function explicitly.
If $X$ and $Y$ are independent random variables with distributions $F$ and $G$ respectively, then $X + Y = Z \sim H$ given by

$$H(z) = \int_{\mathbb{R}} F(z - y) \,\mathrm{d}G(y) = (F * G)(z).$$
This operation is called a convolution on $F$ and $G$. The analogous operation on p-boxes is straightforward for sums. Suppose

$$X \sim [\bar{F}, \underline{F}] \quad \text{and} \quad Y \sim [\bar{G}, \underline{G}].$$

If $X$ and $Y$ are stochastically independent, then the distribution of $Z = X + Y$ is inside the p-box

$$[\bar{F} * \bar{G},\ \underline{F} * \underline{G}].$$
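This independent convolution can be approximated numerically in the style of the discrete convolution procedures described by Yager and by Williamson and Downs: represent each edge by n quantiles, form all n² pairwise sums, and condense back to n quantiles with outward-directed rounding. A minimal Python sketch, assuming equal-mass quantile discretizations (the function and argument names are illustrative):

```python
import numpy as np

def independent_pbox_sum(a_left, a_right, b_left, b_right):
    """Enclose Z = X + Y for independent X and Y given as p-boxes.

    Each argument is an array of n quantiles of one p-box edge,
    each notionally carrying probability mass 1/n.  The n*n
    pairwise sums each carry mass 1/n**2; sorting them and taking
    the minimum (left edge) or maximum (right edge) of each block
    of n rounds the result outward.
    """
    n = len(a_left)
    sums_left = np.sort(np.add.outer(a_left, b_left), axis=None)
    sums_right = np.sort(np.add.outer(a_right, b_right), axis=None)
    z_left = sums_left.reshape(n, n).min(axis=1)
    z_right = sums_right.reshape(n, n).max(axis=1)
    return z_left, z_right
```

Because the minimum of each sorted block underestimates that block's quantiles and the maximum overestimates them, the discretization error only widens the result, which is the conservative direction for a bounding analysis.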
Finding bounds on the distribution of sums $Z = X + Y$ without making any assumption about the dependence between $X$ and $Y$ is actually easier than the problem assuming independence. Makarov showed that

$$Z \sim \left[\ \inf_{x + y = z} \min\big(F(x) + G(y),\ 1\big),\ \ \sup_{x + y = z} \max\big(F(x) + G(y) - 1,\ 0\big)\ \right].$$
These bounds are implied by the Fréchet–Hoeffding copula bounds. The problem can also be solved using the methods of mathematical programming.
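The Makarov bounds can also be approximated numerically by a direct grid search over the cut point $x$ (with $y = z - x$). The following Python sketch assumes the marginals are available as vectorized CDF callables; the grid and all names are assumptions of this sketch, and a finer grid tightens the approximation.

```python
import numpy as np

def makarov_sum_bounds(F, G, zs, xs):
    """Dependence-free bounds on H(z) = Pr(X + Y <= z).

    F, G -- callables returning CDF values of X and Y
    zs   -- points at which to bound H
    xs   -- grid of cut points over which inf/sup are approximated
    Returns (upper, lower) arrays: the left and right p-box edges.
    """
    upper = np.empty(len(zs))
    lower = np.empty(len(zs))
    for i, z in enumerate(zs):
        s = F(xs) + G(z - xs)               # F(x) + G(y) with x + y = z
        upper[i] = min(1.0, s.min())        # inf of min(F + G, 1)
        lower[i] = max(0.0, s.max() - 1.0)  # sup of max(F + G - 1, 0)
    return upper, lower

# Example: two standard normal marginals with unknown dependence
# from scipy.stats import norm
# up, lo = makarov_sum_bounds(norm.cdf, norm.cdf,
#                             zs=np.linspace(-6, 6, 121),
#                             xs=np.linspace(-10, 10, 2001))
```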
The convolution under the intermediate assumption that X and Y have positive dependence is likewise easy to compute, as is the convolution under the extreme assumptions of perfect positive or perfect negative dependency between X and Y.
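For comparison, the convolutions under the two extreme dependence assumptions reduce to simple quantile arithmetic. A sketch under the same hypothetical quantile-array representation used above: perfect positive dependence pairs equal probability levels, while perfect negative dependence pairs level $p$ with $1 - p$ and re-sorts.

```python
import numpy as np

def comonotone_sum(a_left, a_right, b_left, b_right):
    # Perfect positive dependence: X and Y move together, so the
    # sum's quantile at each level is the sum of the quantiles.
    return a_left + b_left, a_right + b_right

def countermonotone_sum(a_left, a_right, b_left, b_right):
    # Perfect negative dependence: level p of X pairs with level
    # 1 - p of Y; the pairwise sums must be re-sorted to form a
    # valid (nondecreasing) quantile function.
    return (np.sort(a_left + b_left[::-1]),
            np.sort(a_right + b_right[::-1]))
```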
Generalized convolutions for other operations such as subtraction, multiplication, division, etc., can be derived using transformations. For instance, p-box subtraction $A - B$ can be defined as $A + (-B)$, where the negative of a p-box $B = [\bar{B}, \underline{B}]$ is $-B = [1 - \underline{B}(-x),\ 1 - \bar{B}(-x)]$.
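The same transformation is convenient in the quantile representation: negating a p-box swaps and reverses its edges, after which subtraction is just addition under whichever dependence assumption is wanted. A sketch, again with illustrative names, building on the addition sketches above:

```python
def pbox_negate(b_left, b_right):
    # The quantile of -Y at level p is minus the quantile of Y at
    # level 1 - p, so the two edges swap roles and reverse order.
    return -b_right[::-1], -b_left[::-1]

def pbox_subtract(a_left, a_right, b_left, b_right, add):
    # A - B = A + (-B); `add` is any p-box addition, e.g. the
    # independent_pbox_sum sketch above.
    nb_left, nb_right = pbox_negate(b_left, b_right)
    return add(a_left, a_right, nb_left, nb_right)
```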

Logical expressions

Logical or Boolean expressions involving conjunctions, disjunctions, exclusive disjunctions, equivalences, conditionals, etc. arise in the analysis of fault trees and event trees common in risk assessments. If the probabilities of events are characterized by intervals, as suggested by Boole and Keynes among others, these binary operations are straightforward to evaluate. For example, if the probability of an event $A$ is in the interval $\Pr(A) = a = [\underline{a}, \bar{a}]$, and the probability of the event $B$ is in $\Pr(B) = b = [\underline{b}, \bar{b}]$, then the probability of the conjunction is surely in the interval

$$\Pr(A \,\&\, B) = a \times b = [\underline{a}\,\underline{b},\ \bar{a}\,\bar{b}]$$

so long as $A$ and $B$ can be assumed to be independent events. If they are not independent, we can still bound the conjunction using the classical Fréchet inequality. In this case, we can infer at least that the probability of the joint event $A \,\&\, B$ is surely within the interval

$$\Pr(A \,\&\, B) = \mathrm{env}\big(\max(0, a + b - 1),\ \min(a, b)\big) = [\max(0, \underline{a} + \underline{b} - 1),\ \min(\bar{a}, \bar{b})],$$

where $\mathrm{env}([x_1, x_2], [y_1, y_2]) = [\min(x_1, y_1), \max(x_2, y_2)]$. Likewise, the probability of the disjunction is surely in the interval

$$\Pr(A \vee B) = a + b - a \times b = 1 - (1 - a) \times (1 - b) = [1 - (1 - \underline{a})(1 - \underline{b}),\ 1 - (1 - \bar{a})(1 - \bar{b})]$$

if $A$ and $B$ are independent events. If they are not independent, the Fréchet inequality bounds the disjunction

$$\Pr(A \vee B) = \mathrm{env}\big(\max(a, b),\ \min(1, a + b)\big) = [\max(\underline{a}, \underline{b}),\ \min(1, \bar{a} + \bar{b})].$$
It is also possible to compute interval bounds on the conjunction or disjunction under other assumptions about the dependence between $A$ and $B$. For instance, one might assume they are positively dependent, in which case the resulting interval is not as tight as the answer assuming independence but tighter than the answer given by the Fréchet inequality. Comparable calculations are used for other logical functions such as negation, exclusive disjunction, etc. When the Boolean expression to be evaluated becomes complex, it may be necessary to evaluate it using the methods of mathematical programming to get best-possible bounds on the expression. A similar problem presents itself in the case of probabilistic logic. If the probabilities of the events are characterized by probability distributions or p-boxes rather than intervals, then analogous calculations can be done to obtain distributional or p-box results characterizing the probability of the top event.
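The four elementary cases above translate directly into interval arithmetic on probability pairs. A minimal sketch, with events encoded as (lower, upper) tuples and ad hoc function names; the example values at the end are hypothetical:

```python
def and_independent(a, b):
    # Conjunction of independent events: multiply endpoints.
    return (a[0] * b[0], a[1] * b[1])

def and_frechet(a, b):
    # Conjunction with no dependence assumption (Frechet bounds).
    return (max(0.0, a[0] + b[0] - 1.0), min(a[1], b[1]))

def or_independent(a, b):
    # Disjunction of independent events: 1 - (1 - a)(1 - b).
    return (1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1]))

def or_frechet(a, b):
    # Disjunction with no dependence assumption (Frechet bounds).
    return (max(a[0], b[0]), min(1.0, a[1] + b[1]))

a, b = (0.2, 0.25), (0.1, 0.3)   # hypothetical probability intervals
print(and_independent(a, b))      # (0.02, 0.075)
print(and_frechet(a, b))          # (0.0, 0.25)
print(or_independent(a, b))       # (0.28, 0.475)
print(or_frechet(a, b))           # (0.2, 0.55)
```

Note that each pair of endpoints is chosen by the monotonicity of the underlying formula in the event probabilities, which is what makes these interval extensions best-possible for the elementary operations.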

Magnitude comparisons

The probability that an uncertain number represented by a p-box $D$ is less than zero is the interval

$$\Pr(D < 0) = [\underline{D}(0),\ \bar{D}(0)],$$

where $\bar{D}$ is the left bound of the probability box $D$ and $\underline{D}$ is its right bound, both evaluated at zero. Two uncertain numbers represented by probability boxes may then be compared for numerical magnitude with the following encodings:

$$A < B \;:=\; \Pr(A - B < 0),$$
$$A > B \;:=\; \Pr(B - A < 0),$$
$$A \le B \;:=\; \Pr(A - B \le 0),$$
$$A \ge B \;:=\; \Pr(B - A \le 0).$$
Thus the probability that A is less than B is the same as the probability that their difference is less than zero, and this probability can be said to be the value of the expression A < B.
Like arithmetic and logical operations, these magnitude comparisons generally depend on the stochastic dependence between A and B, and the subtraction in the encoding should reflect that dependence. If their dependence is unknown, the difference can be computed without making any assumption using the Fréchet operation.
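In the quantile-array representation used in the sketches above, evaluating a p-box edge at zero amounts to counting the fraction of its quantiles that fall below zero, so $\Pr(A < B)$ can be computed from the difference p-box. A sketch (a step-function approximation; names are illustrative):

```python
import numpy as np

def pr_below_zero(d_left, d_right):
    """Interval for Pr(D < 0) given quantile arrays of a p-box D.

    The left (upper) CDF edge is the empirical CDF of the d_left
    quantiles, and likewise for the right edge, so each bound at
    zero is a counting exercise.
    """
    n = len(d_left)
    hi = np.count_nonzero(d_left < 0.0) / n   # left edge at zero
    lo = np.count_nonzero(d_right < 0.0) / n  # right edge at zero
    return lo, hi

# Pr(A < B), with the subtraction done under the appropriate
# dependence assumption, e.g. using the earlier sketches:
#   lo, hi = pr_below_zero(*pbox_subtract(a_left, a_right,
#                                         b_left, b_right,
#                                         independent_pbox_sum))
```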

Sampling-based computation

Some analysts use sampling-based approaches to computing probability bounds, including Monte Carlo simulation, Latin hypercube methods or importance sampling. These approaches cannot assure mathematical rigor in the result because such simulation methods are approximations, although their performance can generally be improved simply by increasing the number of replications in the simulation. Thus, unlike the analytical theorems or methods based on mathematical programming, sampling-based calculations usually cannot produce verified computations. However, sampling-based methods can be very useful in addressing a variety of problems which are computationally difficult to solve analytically or even to rigorously bound. One important example is the use of Cauchy-deviate sampling to avoid the curse of dimensionality in propagating interval uncertainty through high-dimensional problems.
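As a caricature of the idea, and of why it fails to be rigorous, the following sketch samples a one-parameter family of distributions lying inside a single input p-box and takes the envelope of the resulting empirical output distributions. Distributions outside the sampled family are never examined, so the envelope can be too narrow no matter how many replications are run. All names and the interpolation scheme are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_output_envelope(q_left, q_right, func,
                            n_outer=200, n_inner=2000):
    """Approximate the output p-box of func(X) for X in a p-box.

    q_left, q_right -- callables mapping probability levels in (0,1)
    to quantiles of the input p-box edges.  Each outer replication
    draws one distribution inside the box by interpolating between
    the edges; inner sampling builds an empirical output CDF.
    """
    outputs = []
    for _ in range(n_outer):
        w = rng.uniform()                        # picks one interpolated CDF
        u = rng.uniform(size=n_inner)            # probability-level samples
        x = (1 - w) * q_left(u) + w * q_right(u)
        outputs.append(np.sort(func(x)))
    outputs = np.array(outputs)
    # Pointwise envelope over the sorted samples (order statistics)
    return outputs.min(axis=0), outputs.max(axis=0)
```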

Relationship to other uncertainty propagation approaches

PBA belongs to a class of methods that use imprecise probabilities to simultaneously represent aleatory and epistemic uncertainties. PBA is a generalization of both interval analysis and probabilistic convolution as commonly implemented with Monte Carlo simulation. It is also closely related to robust Bayes analysis, which is sometimes called Bayesian sensitivity analysis, and it is an alternative to second-order Monte Carlo simulation.
