2020 Mathematics Subject Classification: Primary: 60J10 Secondary: 60J27
A probability distribution for a homogeneous Markov chain that is independent of time. Let $ \xi ( t) $ be a homogeneous Markov chain with set of states $ S $ and transition probabilities $ p _ {ij} ( t) = {\mathsf P} \{ \xi ( t) = j \mid \xi ( 0) = i \} $. A stationary distribution is a set of numbers $ \{ {\pi _ {j} } : {j \in S } \} $ such that
$$ \tag{1 } \pi _ {j} \geq 0 ,\ \sum _ {j \in S } \pi _ {j} = 1, $$
$$ \tag{2 } \sum _ {i \in S } \pi _ {i} p _ {ij} ( t) = \pi _ {j} ,\ j \in S ,\ t > 0. $$
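For example, for a chain with two states $ S = \{ 1, 2 \} $ and one-step transition probabilities $ p _ {12} ( 1) = a $, $ p _ {21} ( 1) = b $, $ 0 < a, b < 1 $, conditions (1) and (2) with $ t = 1 $ give $ \pi _ {1} a = \pi _ {2} b $ and $ \pi _ {1} + \pi _ {2} = 1 $, so that
$$ \pi _ {1} = \frac{b}{a+b} ,\ \ \pi _ {2} = \frac{a}{a+b} . $$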
The equalities (2) signify that a stationary distribution is invariant in time: If $ {\mathsf P} \{ \xi ( 0) = i \} = \pi _ {i} $, $ i \in S $, then $ {\mathsf P} \{ \xi ( t) = i \} = \pi _ {i} $ for any $ i \in S $, $ t > 0 $; moreover, for any $ t, t _ {1} \dots t _ {k} > 0 $, $ i _ {1} \dots i _ {k} \in S $,
$$ {\mathsf P} \{ \xi ( t _ {1} + t) = i _ {1} \dots \xi ( t _ {k} + t) = i _ {k} \} = {\mathsf P} \{ \xi ( t _ {1} ) = i _ {1} \dots \xi ( t _ {k} ) = i _ {k} \} . $$
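Indeed, since the finite-dimensional distributions of a homogeneous Markov chain are determined by the one-dimensional distribution and the transition probabilities, it suffices to verify the first of these statements, which follows directly from (2):
$$ {\mathsf P} \{ \xi ( t) = j \} = \sum _ {i \in S } {\mathsf P} \{ \xi ( 0) = i \} p _ {ij} ( t) = \sum _ {i \in S } \pi _ {i} p _ {ij} ( t) = \pi _ {j} . $$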
If $ i \in S $ is a state of the Markov chain $ \xi ( t) $ for which the limits
$$ \lim\limits _ {t \rightarrow \infty } p _ {ij} ( t) = \pi _ {j} ( i) \geq 0,\ \ j \in S ,\ \ \sum _ {j \in S } \pi _ {j} ( i) = 1 , $$
exist, then the set of numbers $ \{ {\pi _ {j} ( i) } : {j \in S } \} $ satisfies (2) and is a stationary distribution of the chain $ \xi ( t) $ (see also Transition probabilities).
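For instance, in the two-state example above the transition probabilities can be computed explicitly:
$$ p _ {11} ( t) = \frac{b}{a+b} + \frac{a}{a+b} ( 1 - a - b ) ^ {t} ,\ \ p _ {21} ( t) = \frac{b}{a+b} - \frac{b}{a+b} ( 1 - a - b ) ^ {t} , $$
so that, since $ | 1 - a - b | < 1 $, $ p _ {i1} ( t) \rightarrow \pi _ {1} = b/( a+ b) $ and, similarly, $ p _ {i2} ( t) \rightarrow \pi _ {2} = a/( a+ b) $ for both initial states $ i $.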
The system of linear equations (2) in the unknowns $ \{ \pi _ {j} \} $, under the supplementary conditions (1), has a unique solution if the number of classes of positive states of the Markov chain $ \xi ( t) $ is equal to 1; if the chain has $ k $ classes of positive states, then the set of its stationary distributions is the convex hull of $ k $ stationary distributions, each of which is concentrated on one class (see Markov chain, class of positive states of a).
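For a finite chain this system can also be solved numerically. The following sketch assumes NumPy and a hypothetical $ 3 \times 3 $ transition matrix (not taken from the text); it solves $ \pi P = \pi $, $ \sum _ {j} \pi _ {j} = 1 $ as an overdetermined linear system.

```python
import numpy as np

# Minimal sketch: a hypothetical 3-state transition matrix
# with a single class of positive states (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

n = P.shape[0]
# Conditions (1)-(2) with t = 1: pi P = pi and sum(pi) = 1.
# Stack (P^T - I) pi = 0 with the normalization row of ones.
A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
b = np.zeros(n + 1)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

print(pi)        # stationary distribution
print(pi @ P)    # reproduces pi, verifying (2)
```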
Any non-negative solution of the system (2) is called a stationary measure; a stationary measure can exist even when (1) and (2) are incompatible. For example, the random walk on $ \{ 0, \pm 1 ,\dots \} $:
$$ \xi ( 0) = 0,\ \ \xi ( t) = \xi ( t- 1) + \eta ( t),\ \ t = 1, 2 \dots $$
where $ \eta ( 1) , \eta ( 2) \dots $ are independent random variables such that
$$ {\mathsf P} \{ \eta ( i) = 1 \} = p,\ \ {\mathsf P} \{ \eta ( i) = - 1 \} = 1- p,\ \ 0 < p < 1,\ \ i = 1, 2 \dots $$
does not have a stationary distribution, but has a stationary measure:
$$ \pi _ {j} = \left ( \frac{p}{1-p} \right ) ^ {j} ,\ \ j = 0, \pm 1 ,\dots . $$
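Indeed, the one-step transition probabilities of this walk are $ p _ {j, j+1} ( 1) = p $, $ p _ {j, j-1} ( 1) = 1- p $, and
$$ \sum _ {i} \pi _ {i} p _ {ij} ( 1) = \left ( \frac{p}{1-p} \right ) ^ {j-1} p + \left ( \frac{p}{1-p} \right ) ^ {j+1} ( 1- p) = \left ( \frac{p}{1-p} \right ) ^ {j} [ ( 1- p) + p ] = \pi _ {j} , $$
while $ \sum _ {j} \pi _ {j} = \infty $ for every $ 0 < p < 1 $, so this measure cannot be normalized to satisfy (1).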
One of the possible probabilistic interpretations of a stationary measure $ \{ \pi _ {j} \} $ of a Markov chain $ \xi ( t) $ with set of states $ S $ is as follows. Let there be a countable set of independent realizations of $ \xi ( t) $, and let $ \eta _ {t} ( i) $ be the number of realizations for which $ \xi ( t) = i $. If the random variables $ \eta _ {0} ( i) $, $ i \in S $, are independent and have Poisson distributions with respective means $ \pi _ {i} $, $ i \in S $, then for any $ t > 0 $ the random variables $ \eta _ {t} ( i) $, $ i \in S $, are independent and have the same distributions as $ \eta _ {0} ( i) $, $ i \in S $.
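This can be deduced from the superposition and thinning properties of the Poisson distribution: the $ \eta _ {0} ( i) $ realizations occupying state $ i $ at time $ 0 $ move independently, each being found in state $ j $ at time $ t $ with probability $ p _ {ij} ( t) $, so they contribute to $ \eta _ {t} ( j) $ an independent Poisson number of realizations with mean $ \pi _ {i} p _ {ij} ( t) $; summing over $ i $ and using (2) gives
$$ {\mathsf E} \eta _ {t} ( j) = \sum _ {i \in S } \pi _ {i} p _ {ij} ( t) = \pi _ {j} . $$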
[C] | K.L. Chung, "Markov chains with stationary transition probabilities" , Springer (1960) MR0116388 Zbl 0092.34304 |
[K] | S. Karlin, "A first course in stochastic processes" , Acad. Press (1966) MR0208657 Zbl 0315.60016 Zbl 0226.60052 Zbl 0177.21102 |
Stationary distributions are also defined for more general Markov processes; see, e.g., [B].
[B] | L.P. Breiman, "Probability" , Addison-Wesley (1968) MR0229267 Zbl 0174.48801 |