Since the transition matrix is independent of the sequence index (i.e. the parameter vectors are constant), the
one-operator Markov chain is time-homogeneous (Definition A5).
The states that represent uniform populations (i.e. the states m_A ∈ S_A' ⊂ S'
in which one component is M and all others are zero) are absorbing states of the Markov
chain, because for any such state, P₁(m_A | m_A) = 1 and Definition A6 applies. Since it fol-
lows from Eq. 4.2-3 that ∀n ∈ S' \ S_A' : P₁(n | n) < 1, there are exactly N = 2^L absorbing
states.
states. The corresponding rows of P are given by
∀n_A ∈ S_A' : P₁(m | n_A) = { 1,  m = n_A
                             { 0,  m ∈ S' \ {n_A}                    Eq. 4.7
Thus, for each state n_A ∈ S_A', the associated row of the state transition matrix (Eq. 4.6)
contains 1 in the principal diagonal location and 0 elsewhere. It follows that the N' × 1
probability vector q_{n_A} (Definition A2) whose n_A-th component is 1 is a stationary dis-
tribution (Definition A10) of the one-operator Markov chain. It is not unique, because any
of the N = 2^L such vectors satisfies the requirement, as does any vector of the form

q = ∑_{n_A ∈ S_A'} w_{n_A} q_{n_A},  where w_{n_A} ≥ 0 and ∑_{n_A ∈ S_A'} w_{n_A} = 1.
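The non-uniqueness argument can be checked numerically. The sketch below uses a hypothetical 4-state chain (invented transition probabilities for illustration, not the GA operator itself) whose absorbing rows have the unit-row form of Eq. 4.7, and verifies that each unit vector concentrated on an absorbing state, as well as any convex combination of such vectors, satisfies the stationarity condition qP = q:

```python
import numpy as np

# Hypothetical 4-state chain (illustrative numbers only): states 0 and 3
# play the role of the absorbing states n_A in S_A'; their rows are unit
# rows, as in Eq. 4.7. States 1 and 2 are transient, with P(n | n) < 1.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # absorbing: P(0 | 0) = 1
    [0.2, 0.5, 0.2, 0.1],   # transient
    [0.1, 0.3, 0.4, 0.2],   # transient
    [0.0, 0.0, 0.0, 1.0],   # absorbing: P(3 | 3) = 1
])

# Each unit (row) vector concentrated on an absorbing state is stationary.
for a in (0, 3):
    q = np.zeros(4)
    q[a] = 1.0
    assert np.allclose(q @ P, q)

# So is any convex combination q = sum_a w_a q_a with w_a >= 0, sum w_a = 1.
w = {0: 0.25, 3: 0.75}
q = np.zeros(4)
for a, wa in w.items():
    q[a] = wa
print(np.allclose(q @ P, q))  # -> True
```

Because q ↦ qP is linear, any convex combination of stationary vectors is itself a stationary probability vector, which is exactly why the stationary distribution cannot be unique here.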
The absorbing states preclude irreducibility (Theorem A1), so the Markov chain
does not satisfy the requirements of Theorem A3. The chain is aperiodic (Definition A9),
however, because ∀m ∈ S' : P₁(m | m) > 0, so the period of every state is 1. Thus, all of
the conditions of Theorem A3 except irreducibility are met by the one-operator Markov
chain.
The expected number of transitions required to arrive in an absorbing state, E{k_A},
is finite. An upper bound on E{k_A} is given by

E{k_A} ≤ 1 / ( min_{n ∈ S'} ∑_{n_A ∈ S_A'} P₁(n_A | n) ) < ∞            Eq. 4.8
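The finiteness of the expected absorption time can also be checked numerically through the standard fundamental-matrix identity for absorbing chains: if Q is the transient-to-transient block of P, then E{k_A | X_0 = n} is the n-th row sum of (I − Q)⁻¹. The sketch below uses a hypothetical 4-state chain (illustrative numbers, not the GA operator):

```python
import numpy as np

# Hypothetical 4-state absorbing chain (illustrative numbers only):
# states 0 and 3 are absorbing, states 1 and 2 are transient.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.3, 0.4, 0.2],
    [0.0, 0.0, 0.0, 1.0],
])

transient = [1, 2]
Q = P[np.ix_(transient, transient)]      # transient-to-transient block

# Fundamental matrix: entry (i, j) is the expected number of visits to
# transient state j when starting from transient state i.
F = np.linalg.inv(np.eye(len(transient)) - Q)

# Expected number of steps to absorption from each transient start
# is the corresponding row sum of F.
expected_steps = F.sum(axis=1)
print(expected_steps)
```

Finite row sums for every transient starting state mirror the claim above that E{k_A} is finite.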