Definition A2: Any row vector q = [q(i)], i ∈ E, satisfying the conditions
(1) ∀i ∈ E: q(i) ≥ 0
(2) q1 = Σ_{i∈E} q(i) = 1
is called a probability vector.
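As a small illustrative sketch (pure Python; the helper name is_probability_vector is hypothetical, with states indexed by list position), the two conditions can be checked directly:

```python
# Sketch of Definition A2's two conditions; is_probability_vector is a
# hypothetical helper name, states of E indexed by position in the list.
def is_probability_vector(q, tol=1e-12):
    nonnegative = all(qi >= 0 for qi in q)      # (1): q(i) >= 0 for all i
    sums_to_one = abs(sum(q) - 1.0) <= tol      # (2): sum over E of q(i) = 1
    return nonnegative and sums_to_one

print(is_probability_vector([0.2, 0.5, 0.3]))   # True
print(is_probability_vector([0.6, 0.6, -0.2]))  # False
```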
Definition A3: Any square matrix P whose rows are all composed of probability vectors
is called a stochastic matrix, or sometimes more explicitly a row stochastic matrix. The
row sum constraint on a row stochastic matrix can be written as P1 = 1, where 1 is the
all-ones column vector; hence 1 is an eigenvalue of every stochastic matrix, with 1 as a
corresponding eigenvector.
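A quick sketch of the row-sum constraint, assuming a hypothetical two-state matrix and the hypothetical helper name is_row_stochastic:

```python
# Sketch of the row-sum constraint P1 = 1; is_row_stochastic and the
# two-state matrix P below are hypothetical illustrations.
def is_row_stochastic(P, tol=1e-12):
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

P = [[0.9, 0.1],
     [0.4, 0.6]]
print(is_row_stochastic(P))  # True

# Applying P to the all-ones column vector returns the all-ones vector,
# so 1 is an eigenvalue of P with the all-ones vector as eigenvector.
P_times_ones = [sum(row) for row in P]
print(P_times_ones)  # [1.0, 1.0]
```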
Definition A4: The stochastic matrix
P_k = [p_k(i,j)] = [Pr{X_{k+1} = j | X_k = i}]
is the one-step transition probability matrix or state transition matrix of the Markov chain
X. If the probability vectors q_k and q_{k+1} are respectively the probability distributions of
X_k and X_{k+1}, then
q_{k+1} = q_k P_k.
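The one-step update q_{k+1} = q_k P_k is a row-vector/matrix product, sketched below with a hypothetical two-state chain (the helper name step is illustrative):

```python
# Sketch of the one-step update q_{k+1} = q_k P_k as a row-vector/matrix
# product; the two-state chain and the helper name `step` are hypothetical.
def step(q, P):
    n = len(q)
    return [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]

P_k = [[0.9, 0.1],
       [0.4, 0.6]]
q_k = [0.5, 0.5]
q_next = step(q_k, P_k)                # q_{k+1} = q_k P_k
print(q_next)                          # close to [0.65, 0.35]
print(abs(sum(q_next) - 1.0) < 1e-12)  # stays a probability vector: True
```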
Similarly, the stochastic matrix
P_m^k = [p_m^k(i,j)] = [Pr{X_k = j | X_m = i}] = P_m P_{m+1} ⋯ P_{k-1} = ∏_{l=m}^{k-1} P_l,
where k = m + n, m, n ∈ K, n > 0, is the n-step transition probability matrix of X, and
q_k = q_m P_m^k = q_m ∏_{l=m}^{k-1} P_l.
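The product formula can be checked numerically: multiplying the one-step matrices P_m, ..., P_{k-1} left to right and applying q_m once gives the same q_k as stepping the distribution one transition at a time. A sketch with three hypothetical one-step matrices:

```python
# Sketch: the n-step matrix P_m^k as the left-to-right product of the
# one-step matrices P_m, ..., P_{k-1}. The three time-varying matrices
# and the helper names are hypothetical illustrations.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

def vecmat(q, P):
    n = len(q)
    return [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]

Ps = [[[0.9, 0.1], [0.4, 0.6]],   # P_m
      [[0.7, 0.3], [0.2, 0.8]],   # P_{m+1}
      [[0.5, 0.5], [0.1, 0.9]]]   # P_{m+2}, so n = 3

Pmk = Ps[0]
for P in Ps[1:]:
    Pmk = matmul(Pmk, P)          # P_m^k = P_m P_{m+1} P_{m+2}

q_m = [1.0, 0.0]
q_k_direct = vecmat(q_m, Pmk)     # q_k = q_m P_m^k

q_k_steps = q_m
for P in Ps:
    q_k_steps = vecmat(q_k_steps, P)   # q_{l+1} = q_l P_l, l = m..k-1

print(q_k_direct)
print(q_k_steps)  # agrees with q_k_direct up to rounding
```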
Definition A5: Let P_k be the state transition matrix of the Markov chain X at time (se-
quence index) k. Then X is time-homogeneous if and only if P_k = P for all k ∈ K, where
P is a constant state transition matrix.
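For a time-homogeneous chain, every factor in the product ∏_{l=m}^{k-1} P_l equals P, so the n-step matrix collapses to the matrix power P^n. A minimal sketch, assuming a hypothetical two-state P:

```python
# Sketch: for a time-homogeneous chain (P_k = P for every k), the n-step
# transition matrix is the matrix power P^n. The two-state P is hypothetical.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, n):
    result = [[1.0, 0.0], [0.0, 1.0]]   # 2x2 identity
    for _ in range(n):
        result = matmul(result, P)
    return result

P = [[0.9, 0.1],
     [0.4, 0.6]]
P3 = matpow(P, 3)                 # 3-step transition matrix P^3
print([sum(row) for row in P3])   # each row still sums to 1
```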