Stability (probability)

In probability theory, the stability of a random variable is the property that a linear combination of two independent copies of the variable has the same distribution, up to location and scale parameters.[1] The distributions of random variables having this property are said to be "stable distributions". Results available in probability theory show that all possible distributions having this property are members of a four-parameter family of distributions. The article on the stable distribution describes this family together with some of the properties of these distributions.

The importance in probability theory of "stability" and of the stable family of probability distributions is that they are "attractors" for properly normed sums of independent and identically distributed random variables.

Important special cases of stable distributions are the normal distribution, the Cauchy distribution and the Lévy distribution. For details see stable distribution.
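
As a brief worked check of the property in the normal case: if X has the N(\mu, \sigma^2) distribution and X_1, X_2 are independent copies of X, then

    X_1 + X_2 \sim N(2\mu,\ 2\sigma^2) \quad \text{and} \quad \sqrt{2}\,X + (2 - \sqrt{2})\mu \sim N(2\mu,\ 2\sigma^2),

so the sum of two independent copies has the same distribution as a single copy after rescaling and shifting.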

Definition

There are several basic definitions for what is meant by stability. Some are based on summations of random variables and others on properties of characteristic functions.
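
In terms of the characteristic function \varphi of X, the summation form of the definition given below corresponds to requiring that, for every n, there exist constants c_n > 0 and d_n with

    [\varphi(t)]^n = e^{i d_n t}\, \varphi(c_n t),

since the left-hand side is the characteristic function of the sum of n independent copies of X and the right-hand side is that of c_n X + d_n.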

Definition via distribution functions

Feller[2] makes the following basic definition. A random variable X is called stable (has a stable distribution) if, for n independent copies X_i of X, there exist constants c_n > 0 and d_n such that

    X_1 + X_2 + \cdots + X_n = c_n X + d_n,

where this equality refers to equality of distributions. A conclusion drawn from this starting point is that the sequence of constants c_n must be of the form

    c_n = n^{1/\alpha} \quad \text{for some} \quad 0 < \alpha \leq 2.

A further conclusion is that it is enough for the above distributional identity to hold for n = 2 and n = 3 only.[3]
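
As a concrete numerical check of the definition, the following minimal sketch (assuming NumPy and SciPy are available) verifies the identity for the standard Cauchy distribution, for which \alpha = 1 and hence c_2 = 2 and d_2 = 0.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_samples = 100_000

    # Two independent copies of a standard Cauchy random variable.
    x1 = stats.cauchy.rvs(size=n_samples, random_state=rng)
    x2 = stats.cauchy.rvs(size=n_samples, random_state=rng)

    # For the Cauchy distribution alpha = 1, so c_2 = 2 and d_2 = 0,
    # and (X1 + X2) / 2 should again be standard Cauchy.
    rescaled_sum = (x1 + x2) / 2.0

    # Two-sample Kolmogorov-Smirnov test against a fresh Cauchy sample;
    # a large p-value is consistent with equality of distributions.
    reference = stats.cauchy.rvs(size=n_samples, random_state=rng)
    print(stats.ks_2samp(rescaled_sum, reference))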

Stability in probability theory

There are a number of mathematical results that can be derived for distributions which have the stability property; that is, results that hold for every family of distributions which is closed under convolution.[4] It is convenient here to call these stable distributions, without meaning specifically the distribution described in the article stable distribution, and to say that a distribution is stable if it has the stability property. For univariate distributions which are stable in this sense it can be shown, for example, that they are always infinitely divisible.

Other types of stability

The above concept of stability is based on the idea of a class of distributions being closed under a given set of operations on random variables, where the operation is "summation" or "averaging". Other operations that have been considered include the summation of a random number of random variables, where for instance the number of terms has a geometric distribution; the corresponding property is known as geometric stability.[8] A numerical sketch of this case follows.
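
The sketch below (a minimal illustration assuming NumPy is available) uses the fact that the exponential distribution is stable under geometric random summation: if N is geometric with parameter p and the X_i are independent exponential variables, then p(X_1 + \cdots + X_N) has the same exponential distribution as each X_i.

    import numpy as np

    rng = np.random.default_rng(1)
    n_trials = 50_000
    p = 0.2        # parameter of the geometric counting variable N
    scale = 1.0    # scale of the exponential summands X_i

    # Draw N ~ Geometric(p) on {1, 2, ...}, sum N exponential variables,
    # and rescale the random sum by p.
    counts = rng.geometric(p, size=n_trials)
    sums = np.array([rng.exponential(scale, size=k).sum() for k in counts])
    rescaled = p * sums

    # The rescaled random sums should again be Exponential(scale);
    # compare a few quantiles against a direct exponential sample.
    reference = rng.exponential(scale, size=n_trials)
    print(np.quantile(rescaled, [0.25, 0.5, 0.9]))
    print(np.quantile(reference, [0.25, 0.5, 0.9]))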

Notes

  1. ^ Lukacs, E. (1970) Section 5.7
  2. ^ Feller (1971), Section VI.1
  3. ^ Feller (1971), Problem VI.13.3
  4. ^ Lukacs, E. (1970) Section 5.7
  5. ^ Lukacs, E. (1970) Theorem 5.7.1
  6. ^ Lukacs, E. (1970) Theorem 5.8.1
  7. ^ Lukacs, E. (1970) Theorem 5.10.1
  8. ^ Klebanov et al. (1984)

References

  • Lukacs, E. (1970) Characteristic Functions. Griffin, London.
  • Feller, W. (1971) An Introduction to Probability Theory and Its Applications, Volume 2. Wiley. ISBN 0-471-25709-5
  • Klebanov, L.B., Maniya, G.M., Melamed, I.A. (1984) "A problem of V. M. Zolotarev and analogues of infinitely divisible and stable distributions in a scheme for summation of a random number of random variables". Theory Probab. Appl., 29, 791–794