Consider a Markov chain with the given transition probability matrix, starting at state X₀ = 1. Determine each of the following probabilities, including the probability that the process never visits state 2.
a) P(X₁ = 2 | X₀ = 1)
b) P(X₂ = 2 | X₀ = 1)
c) P(Never visit 2 | X₀ = 1)
d) P(Eventually visit 2 | X₀ = 1)
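Since the transition matrix itself is not shown here, the computations can be sketched with a hypothetical 3-state matrix (an assumption chosen so that state 3 is absorbing, which makes "never visit state 2" possible with positive probability). Parts (a) and (b) follow from the Chapman-Kolmogorov equations, and parts (c) and (d) from first-step analysis:

```python
# Hypothetical transition matrix on states {1, 2, 3} (row/column
# indices 0, 1, 2). This matrix is an illustrative assumption, NOT
# the one from the original problem.
P = [
    [0.50, 0.25, 0.25],  # from state 1
    [0.30, 0.40, 0.30],  # from state 2
    [0.00, 0.00, 1.00],  # from state 3 (absorbing)
]

def mat_mul(A, B):
    """Multiply two square matrices stored as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# (a) One-step probability: P(X1 = 2 | X0 = 1) is the (1, 2) entry of P.
p_one_step = P[0][1]

# (b) Two-step probability: the (1, 2) entry of P^2 (Chapman-Kolmogorov).
P2 = mat_mul(P, P)
p_two_step = P2[0][1]

# (c) First-step analysis for u_i = P(never visit 2 | X0 = i), i != 2:
#     u_1 = P(1,1) u_1 + P(1,3) u_3, with u_3 = 1 since state 3 is
#     absorbing and therefore can never reach state 2.
u3 = 1.0
u1 = P[0][2] * u3 / (1.0 - P[0][0])

# (d) Complement of (c): P(eventually visit 2 | X0 = 1) = 1 - u_1.
p_eventually = 1.0 - u1

print(p_one_step)    # 0.25
print(p_two_step)    # 0.225
print(u1)            # 0.5
print(p_eventually)  # 0.5
```

With the actual matrix from the problem, the same recipe applies: read part (a) off P directly, part (b) off P², and solve the linear first-step equations (treating state 2 as absorbing) for parts (c) and (d).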