In statistics, an exchangeable sequence of random variables (also sometimes interchangeable)[1] is a sequence X1, X2, X3, ... (which may be finitely or infinitely long) whose joint probability distribution does not change when the positions in the sequence in which finitely many of them appear are altered. In other words, the joint distribution is invariant under finite permutations. Thus, for example, the sequences
[math]\displaystyle{ X_1, X_2, X_3, X_4, X_5, X_6 \quad \text{ and } \quad X_3, X_6, X_1, X_5, X_2, X_4 }[/math]
both have the same joint probability distribution.
It is closely related to the use of independent and identically distributed random variables in statistical models. Exchangeable sequences of random variables arise in cases of simple random sampling.
Formally, an exchangeable sequence of random variables is a finite or infinite sequence X1, X2, X3, ... of random variables such that for any finite permutation σ of the indices 1, 2, 3, ... (a permutation that acts on only finitely many indices, leaving the rest fixed), the joint probability distribution of the permuted sequence
[math]\displaystyle{ X_{\sigma(1)}, X_{\sigma(2)}, X_{\sigma(3)}, \ldots }[/math]
is the same as the joint probability distribution of the original sequence.[1][2]
(A sequence E1, E2, E3, ... of events is said to be exchangeable precisely if the sequence of its indicator functions is exchangeable.) The distribution function FX1,...,Xn(x1, ..., xn) of a finite sequence of exchangeable random variables is symmetric in its arguments x1, ..., xn. Olav Kallenberg provided an appropriate definition of exchangeability for continuous-time stochastic processes.[3][4]
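For a concrete illustration (a small Python sketch; the three-marble urn and the variable names are chosen only for this example), one can enumerate the exact joint probability mass function of draws made without replacement and check that it is symmetric under every permutation of positions:

```python
from itertools import permutations
from fractions import Fraction
from collections import defaultdict

# Assumed example: an urn holding marbles labelled 0, 0 and 1, sampled
# without replacement until empty (simple random sampling).
urn = (0, 0, 1)

# Build the exact joint pmf of (X1, X2, X3): every ordering of the marbles
# is equally likely, so accumulate probability over all orderings.
joint = defaultdict(Fraction)
orderings = list(permutations(urn))
for ordering in orderings:
    joint[ordering] += Fraction(1, len(orderings))
pmf = dict(joint)

# Exchangeability check: permuting the positions of an outcome vector
# must leave its joint probability unchanged.
for outcome, prob in pmf.items():
    for sigma in permutations(range(len(outcome))):
        permuted = tuple(outcome[i] for i in sigma)
        assert pmf.get(permuted, Fraction(0)) == prob

print("joint pmf:", {k: str(v) for k, v in pmf.items()})
```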
The concept was introduced by William Ernest Johnson in his 1924 book Logic, Part III: The Logical Foundations of Science.[5] Exchangeability is equivalent to the concept of statistical control introduced by Walter Shewhart also in 1924.[6][7]
The property of exchangeability is closely related to the use of independent and identically distributed (i.i.d.) random variables in statistical models. A sequence of random variables that are i.i.d., conditional on some underlying distributional form, is exchangeable. This follows directly from the structure of the joint probability distribution generated by the i.i.d. form.
Mixtures of exchangeable sequences (in particular, sequences of i.i.d. variables) are exchangeable. The converse can be established for infinite sequences, through an important representation theorem by Bruno de Finetti (later extended by other probability theorists such as Halmos and Savage). The extended versions of the theorem show that in any infinite sequence of exchangeable random variables, the random variables are conditionally independent and identically distributed, given the underlying distributional form. This theorem is stated briefly below. (De Finetti's original theorem only showed this to be true for random indicator variables, but this was later extended to encompass all sequences of random variables.) Another way of putting this is that de Finetti's theorem characterizes exchangeable sequences as mixtures of i.i.d. sequences: while an exchangeable sequence need not itself be unconditionally i.i.d., it can be expressed as a mixture of underlying i.i.d. sequences.[1]
This means that infinite sequences of exchangeable random variables can be regarded equivalently as sequences of conditionally i.i.d. random variables, based on some underlying distributional form. (Note that this equivalence does not quite hold for finite exchangeability. However, for finite vectors of random variables there is a close approximation to the i.i.d. model.) An infinite exchangeable sequence is strictly stationary and so a law of large numbers in the form of the Birkhoff–Khinchin theorem applies.[4] This means that the underlying distribution can be given an operational interpretation as the limiting empirical distribution of the sequence of values. The close relationship between exchangeable sequences of random variables and the i.i.d. form means that the latter can be justified on the basis of infinite exchangeability. This notion is central to Bruno de Finetti's development of predictive inference and to Bayesian statistics. It can also be shown to be a useful foundational assumption in frequentist statistics and to link the two paradigms.[8]
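The following Python sketch illustrates the mixture view under assumed parameters (a Beta(2, 2) mixing distribution, chosen only for concreteness): conditional on a latent success probability the trials are i.i.d. Bernoulli, but marginally they are exchangeable and positively correlated, so the sequence is not unconditionally i.i.d.

```python
import random

random.seed(0)

def exchangeable_bernoulli_sequence(n, alpha=2.0, beta=2.0):
    """Draw a latent success probability from a Beta(alpha, beta) mixing
    distribution, then generate n Bernoulli trials that are i.i.d.
    conditional on that draw. Marginally the trials are exchangeable."""
    theta = random.betavariate(alpha, beta)
    return [1 if random.random() < theta else 0 for _ in range(n)]

# Estimate the marginal covariance between two terms of the sequence:
# positive covariance shows the terms are not unconditionally independent,
# even though they are conditionally i.i.d. given theta.
num_sequences = 100_000
pairs = [exchangeable_bernoulli_sequence(2) for _ in range(num_sequences)]
mean_x1 = sum(x[0] for x in pairs) / num_sequences
mean_x2 = sum(x[1] for x in pairs) / num_sequences
cov = sum((x[0] - mean_x1) * (x[1] - mean_x2) for x in pairs) / num_sequences
print(f"estimated cov(X1, X2) = {cov:.4f}")  # close to var(theta) = 0.05 for Beta(2, 2)
```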
The representation theorem: This statement is based on the presentation in O'Neill (2009) in references below. Given an infinite sequence of random variables [math]\displaystyle{ \mathbf{X}=(X_1,X_2,X_3,\ldots) }[/math] we define the limiting empirical distribution function [math]\displaystyle{ F_\mathbf{X} }[/math] by
[math]\displaystyle{ F_\mathbf{X}(x) = \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^n I(X_i \le x). }[/math]
(This is the Cesàro limit of the indicator functions. In cases where the Cesàro limit does not exist this function can actually be defined as the Banach limit of the indicator functions, which is an extension of this limit. This latter limit always exists for sums of indicator functions, so that the empirical distribution is always well-defined.) This means that for any vector of random variables in the sequence we have joint distribution function given by
[math]\displaystyle{ F_{X_1,X_2,\ldots,X_n}(x_1,x_2,\ldots,x_n) = \operatorname{E} \left( \prod_{i=1}^n F_\mathbf{X}(x_i) \right). }[/math]
If the distribution function [math]\displaystyle{ F_\mathbf{X} }[/math] is indexed by another parameter [math]\displaystyle{ \theta }[/math] then (with densities appropriately defined) we have
[math]\displaystyle{ f_{X_1,X_2,\ldots,X_n}(x_1,x_2,\ldots,x_n) = \operatorname{E} \left( \prod_{i=1}^n f(x_i \mid \theta) \right). }[/math]
These equations show the joint distribution or density characterised as a mixture distribution based on the underlying limiting empirical distribution (or a parameter indexing this distribution).
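As an illustration of the operational interpretation above (a simulation sketch; the Beta(2, 5) mixing distribution and sample size are assumptions of this example), the empirical frequency of a conditionally i.i.d. Bernoulli sequence settles near the realized value of the latent parameter rather than near its prior mean:

```python
import random

random.seed(1)

# Assumed setup for illustration: a Beta(2, 5) mixing distribution over the
# Bernoulli success probability theta.
theta = random.betavariate(2.0, 5.0)
n = 200_000
draws = [1 if random.random() < theta else 0 for _ in range(n)]

# For a 0/1 sequence the limiting empirical distribution is determined by the
# limiting frequency of 1s; it approaches the realized theta, which is what
# the representation theorem identifies as the underlying distribution.
empirical_frequency = sum(draws) / n
print(f"realized theta      = {theta:.4f}")
print(f"empirical frequency = {empirical_frequency:.4f}")
print(f"prior mean of theta = {2.0 / (2.0 + 5.0):.4f}")
```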
Note that not all finite exchangeable sequences are mixtures of i.i.d. To see this, consider sampling without replacement from a finite set until no elements are left. The resulting sequence is exchangeable, but not a mixture of i.i.d. Indeed, conditioned on all other elements in the sequence, the remaining element is known.
Exchangeable sequences have some basic covariance and correlation properties, which mean that they are generally positively correlated. For infinite sequences of exchangeable random variables, the covariance between any two of the random variables equals the variance of the mean of the underlying distribution function.[8] For finite exchangeable sequences the covariance is also a fixed value that does not depend on which pair of random variables is chosen, but it satisfies only a weaker lower bound than in the infinite case, so negative correlation is possible.
Covariance for exchangeable sequences (infinite): If the sequence [math]\displaystyle{ X_1,X_2,X_3,\ldots }[/math] is exchangeable, then
[math]\displaystyle{ \operatorname{cov}(X_i,X_j) = \operatorname{var}(\operatorname{E}(X_i \mid F_\mathbf{X})) \ge 0 \quad \text{for } i \ne j. }[/math]
Covariance for exchangeable sequences (finite): If [math]\displaystyle{ X_1,X_2,\ldots,X_n }[/math] is exchangeable with [math]\displaystyle{ \sigma^2 = \operatorname{var} (X_i) }[/math], then
[math]\displaystyle{ \operatorname{cov}(X_i,X_j) \ge -\frac{\sigma^2}{n-1} \quad \text{for } i \ne j. }[/math]
The finite sequence result may be proved as follows. Using the fact that the values are exchangeable, we have
[math]\displaystyle{ 0 \le \operatorname{var}(X_1+\cdots+X_n) = n \operatorname{var}(X_1) + n(n-1)\operatorname{cov}(X_1,X_2) = n\sigma^2 + n(n-1)\operatorname{cov}(X_1,X_2). }[/math]
We can then solve the inequality for the covariance yielding the stated lower bound. The non-negativity of the covariance for the infinite sequence can then be obtained as a limiting result from this finite sequence result.
Equality of the lower bound for finite sequences is achieved in a simple urn model: An urn contains 1 red marble and n − 1 green marbles, and these are sampled without replacement until the urn is empty. Let Xi = 1 if the red marble is drawn on the i-th trial and 0 otherwise. A finite sequence that achieves the lower covariance bound cannot be extended to a longer exchangeable sequence.[9]
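As a check on this example (an illustrative enumeration, not part of the cited source), the exact covariance for this urn model can be computed and compared with the bound −σ²/(n − 1):

```python
from fractions import Fraction
from itertools import permutations

def urn_covariance(n):
    """Exact cov(X1, X2) when drawing without replacement from an urn with
    1 red and n-1 green marbles; Xi = 1 if the red marble appears on draw i."""
    marbles = [1] + [0] * (n - 1)
    orderings = list(permutations(marbles))
    total = len(orderings)
    e_x1 = sum(Fraction(o[0]) for o in orderings) / total
    e_x2 = sum(Fraction(o[1]) for o in orderings) / total
    e_x1x2 = sum(Fraction(o[0] * o[1]) for o in orderings) / total
    return e_x1x2 - e_x1 * e_x2

n = 5
sigma2 = Fraction(1, n) * (1 - Fraction(1, n))  # var(Xi) for an indicator with mean 1/n
print(urn_covariance(n))     # -1/25
print(-sigma2 / (n - 1))     # -1/25: the lower bound is attained exactly
```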
The von Neumann extractor is a randomness extractor that depends on exchangeability: it gives a method to take an exchangeable sequence of 0s and 1s (Bernoulli trials), with some probability p of 0 and [math]\displaystyle{ q=1-p }[/math] of 1, and produce a (shorter) exchangeable sequence of 0s and 1s with probability 1/2.
Partition the sequence into non-overlapping pairs: if the two elements of the pair are equal (00 or 11), discard it; if the two elements of the pair are unequal (01 or 10), keep the first. This yields a sequence of Bernoulli trials with [math]\displaystyle{ p=1/2, }[/math] as, by exchangeability, the odds of a given pair being 01 or 10 are equal.
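A minimal Python sketch of this pairing procedure (the function name and list-based input are choices of this illustration):

```python
def von_neumann_extract(bits):
    """Map an exchangeable 0/1 sequence to a shorter sequence of fair bits.

    Scan non-overlapping pairs: discard equal pairs (00 or 11) and emit the
    first bit of each unequal pair. By exchangeability, 01 and 10 are equally
    likely, so each emitted bit is 0 or 1 with probability 1/2.
    """
    output = []
    for i in range(0, len(bits) - 1, 2):
        first, second = bits[i], bits[i + 1]
        if first != second:
            output.append(first)
    return output

# Example: a biased input sequence still yields unbiased (shorter) output.
print(von_neumann_extract([0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1]))  # [0, 1, 1, 0]
```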
Exchangeable random variables arise in the study of U-statistics, particularly in the Hoeffding decomposition.[11]
Exchangeability is a key assumption of the distribution-free inference method of conformal prediction.
Original source: https://en.wikipedia.org/wiki/Exchangeable random variables.