Transient Analysis of Markov Chains: two ways to compute the expected time to reach a state
===========================================================================================

* (I-Q)^(-1), the fundamental matrix
  - warm-up: a geometric r.v. X (e.g. the number of steps spent in a state
    with self-loop probability p) has E[X] = 1/(1-p) -- show 2 ways to prove it!
    -- first way) directly from the expression of P(X=k) = p^(k-1)(1-p)
    -- second way) recursively: E[X] = 1 + p*E[X]
  - present (I-Q)^(-1) as the matrix generalization of 1/(1-p) over the
    transient states: (I-Q)^(-1) = I + Q + Q^2 + Q^3 + ...

* modified Markov chain
  - consider a sample path and the number of visits to each state
  - show (using an example) that by normalizing the visit counts in the
    modified chain we obtain the expected number of visits to each state
    before absorption

Discuss (I-Q)^(-1) * R
======================
- gives, for each transient starting state, the probability of being
  absorbed at each absorbing state

Problems to be handled
======================
- cycles (add self-loops)
- sinks (add transitions to every other state)

State Classification
====================
- Recurrent (null & positive)
- Aperiodic
- Ergodic (positive recurrent + aperiodic)
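The E[X] = 1/(1-p) fact can be shown both ways listed above. A sketch in LaTeX, reading X as geometric with continuation probability p (an interpretation assumed here, since 1/(1-p) is the geometric mean):

```latex
% X geometric: P(X = k) = p^{k-1}(1-p), \quad k = 1, 2, \dots

% First way: directly from the pmf, using \sum_{k \ge 1} k p^{k-1} = (1-p)^{-2}:
\mathbb{E}[X] = \sum_{k=1}^{\infty} k\, p^{k-1}(1-p)
             = (1-p)\cdot \frac{1}{(1-p)^{2}}
             = \frac{1}{1-p}

% Second way: recursively -- one step is always taken, and with
% probability p the process restarts from scratch:
\mathbb{E}[X] = 1 + p\,\mathbb{E}[X]
\quad\Longrightarrow\quad
\mathbb{E}[X] = \frac{1}{1-p}
```

The recursive argument is the one that generalizes: replacing the scalar p by the substochastic matrix Q over the transient states turns 1/(1-p) into (I-Q)^(-1).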
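The identity (I-Q)^(-1) = I + Q + Q^2 + ... can be checked numerically. A minimal sketch on a hypothetical chain with two transient states (A, B); the matrix Q below is an illustrative assumption, not from the notes:

```python
# Transient-to-transient block Q of a hypothetical absorbing chain
# (remaining mass in each row goes to absorbing states).
Q = [[0.2, 0.6],
     [0.3, 0.3]]

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

# Direct route: invert I - Q (closed form for a 2x2 matrix).
IQ = [[1 - Q[0][0], -Q[0][1]], [-Q[1][0], 1 - Q[1][1]]]
det = IQ[0][0] * IQ[1][1] - IQ[0][1] * IQ[1][0]
N = [[ IQ[1][1] / det, -IQ[0][1] / det],
     [-IQ[1][0] / det,  IQ[0][0] / det]]

# Series route: truncate I + Q + Q^2 + ...; it converges because Q is
# substochastic on the transient states (spectral radius < 1).
S = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at I
P = [[1.0, 0.0], [0.0, 1.0]]   # running power Q^k
for _ in range(200):
    P = mat_mul(P, Q)
    S = mat_add(S, P)

# N[i][j] = expected number of visits to transient state j starting
# from i; row sums give the expected time to absorption.
time_to_absorb = [sum(row) for row in N]
```

Here `time_to_absorb[0]` is the expected number of steps before absorption when starting in state A, tying the fundamental matrix back to the "time to reach a state" question.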
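The absorption-probability computation (I-Q)^(-1) * R can also be sketched concretely. A hypothetical chain with two transient and two absorbing states; Q and R below are illustrative assumptions (each full row Q_i plus R_i sums to 1):

```python
# Transient -> transient and transient -> absorbing blocks of a
# hypothetical absorbing chain (illustrative numbers only).
Q = [[0.2, 0.6], [0.3, 0.3]]
R = [[0.2, 0.0], [0.0, 0.4]]

# N = (I - Q)^(-1), closed form for a 2x2 matrix.
IQ = [[1 - Q[0][0], -Q[0][1]], [-Q[1][0], 1 - Q[1][1]]]
det = IQ[0][0] * IQ[1][1] - IQ[0][1] * IQ[1][0]
N = [[ IQ[1][1] / det, -IQ[0][1] / det],
     [-IQ[1][0] / det,  IQ[0][0] / det]]

# B = N * R: entry B[i][j] is the probability of being absorbed in
# absorbing state j when starting from transient state i.
B = [[sum(N[i][k] * R[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

# Sanity check: each row of B is a probability distribution.
row_sums = [sum(row) for row in B]
```

The row sums coming out to 1 is the key sanity check: starting from any transient state, absorption somewhere is certain.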