Let [TeX]P=\left(\begin{array}{cc}0.5&0.3\\ 0.5&0.7 \end{array} \right) [/TeX] be the transition matrix for a Markov chain with two states. Let [TeX]x_{0}=\left(\begin{array}{c}0.5\\ 0.5 \end{array} \right) [/TeX] be the initial state vector for the population. Find the steady state vector x. (Give the steady state vector as a probability vector.)

Answer:

Steady-state vector: [TeX]x=\left(\begin{array}{c}0.375\\ 0.625 \end{array} \right) [/TeX]

Step-by-step explanation:

[TeX]P=\left(\begin{array}{cc}0.5&0.3\\ 0.5&0.7 \end{array} \right) [/TeX] is the transition matrix for a Markov chain with two states, and

[TeX]x_{0}=\left(\begin{array}{c}0.5\\ 0.5 \end{array} \right) [/TeX] is the initial state vector for the population.

[TeX]X_{1}=P x_{0}=\left(\begin{array}{cc}0.5&0.3\\ 0.5&0.7 \end{array} \right) \left(\begin{array}{c}0.5\\ 0.5 \end{array} \right) =\left(\begin{array}{c}0.4\\ 0.6 \end{array} \right) [/TeX]  

[TeX] X_{2}=P^{2} x_{0}=\left(\begin{array}{c}0.38\\ 0.62 \end{array} \right) [/TeX]  

[TeX] X_{3}=P^{3} x_{0}=\left(\begin{array}{c}0.376\\ 0.624 \end{array} \right) [/TeX]  

[TeX] X_{30}=P^{30} x_{0}\approx\left(\begin{array}{c}0.375\\ 0.625 \end{array} \right) [/TeX]  
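
As a quick numerical check, the same iteration can be carried out with a short script (a minimal sketch assuming NumPy is available; P and x are the matrix and initial vector from the problem):

import numpy as np

# Transition matrix (columns sum to 1) and initial state vector from the problem.
P = np.array([[0.5, 0.3],
              [0.5, 0.7]])
x = np.array([0.5, 0.5])

# Repeatedly apply P: x_{k+1} = P x_k, printing a few of the iterates.
for k in range(1, 31):
    x = P @ x
    if k in (1, 2, 3, 30):
        print(f"x_{k} = {x}")

# Prints approximately:
# x_1 = [0.4 0.6]
# x_2 = [0.38 0.62]
# x_3 = [0.376 0.624]
# x_30 = [0.375 0.625]

The printed vectors match the iterates X_1, X_2, X_3, and X_30 shown above.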

In the long run, the probability distribution vector [TeX]X_{m}=P^{m}x_{0}[/TeX] approaches the vector [TeX]\left(\begin{array}{c}0.375\\ 0.625 \end{array} \right) [/TeX].

This is called the steady-state (or limiting) distribution vector.
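
The exact value can also be found without iterating: a steady-state vector satisfies [TeX]Px=x[/TeX] with entries summing to 1. Writing [TeX]x=\left(\begin{array}{c}x_{1}\\ x_{2} \end{array} \right) [/TeX], the first row of [TeX]Px=x[/TeX] gives [TeX]0.5x_{1}+0.3x_{2}=x_{1}[/TeX], so [TeX]x_{2}=\tfrac{5}{3}x_{1}[/TeX]. Combining this with [TeX]x_{1}+x_{2}=1[/TeX] gives [TeX]x_{1}=\tfrac{3}{8}=0.375[/TeX] and [TeX]x_{2}=\tfrac{5}{8}=0.625[/TeX], in agreement with the limit of the iteration.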