Markov Chains

due by Nov 12, 2018

Fix $r\in (0, 1)$. Consider the Markov chain $\{X_n\}$ with state space $S = \{n \in \mathbb{Z} : n \geq 0 \}$ and with transition probabilities $p_{ij} = P(X_{n+1} = j\vert X_n = i)$ given by
$$p_{i,i+1} = r \ \ (i \geq 0), \qquad p_{i,i-1} = 1-r \ \ (i \geq 1), \qquad p_{00} = 1-r,$$
and $p_{ij} = 0$ otherwise.
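Before working through the problems, it can help to simulate the chain. The sketch below assumes the natural reading of the setup: from any state the chain steps up with probability $r$, and otherwise steps down, staying put at $0$ (the self-loop at $0$ is what will make the chain aperiodic in part 1).

```python
import random

def step(i, r, rng=random):
    # One transition: up with probability r; otherwise down,
    # except that state 0 stays at 0 (reflecting boundary with a self-loop).
    if rng.random() < r:
        return i + 1
    return max(i - 1, 0)

def simulate(n, r, x0=0, seed=None):
    # Run n steps from x0 and return the whole path [X_0, ..., X_n].
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], r, rng))
    return path
```

Plotting a few paths for $r < \tfrac12$, $r = \tfrac12$, and $r > \tfrac12$ gives a good picture of the three regimes studied below.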
  1. Show that the Markov chain $\{X_n\}$ is irreducible and aperiodic.
  2. Consider the case when $r < \frac{1}{2}$.
    1. Find a stationary distribution for the Markov chain.
    2. Conclude that every state is positive recurrent.
  3. For $i \geq 1$, let $P_i$ be the event that $X_i - X_{i-1} = 1$. Show that the events $P_i$ are mutually independent and $\mathrm{Prob}(P_i) = r$ for all $i \geq 1$.
  4. Let $N_k = \sum\limits_{i=1}^k \mathbb{1}_{P_i}$. Show that if $X_0 = 0$ and $X_n = 0$ then $N_n \leq n/2$.
  5. Consider the case when $r > \frac{1}{2}$.
    1. Show that $N_k - \frac{k}{2} \to \infty$ almost surely.
    2. Conclude that every state of the Markov chain is transient.
  6. Consider the case when $r = \frac{1}{2}$.
    1. Show that if $\pi_i$ is the probability mass function of a stationary distribution, then we must have $\pi_k = \pi_0 + k (\pi_1 - \pi_0)$ for $k\geq 1$.
    2. Conclude that a stationary distribution does not exist.
    3. Show that the Markov chain is recurrent (it is straightforward to adapt the proofs of the corresponding fact for the symmetric random walk on $\mathbb{Z}$).
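As a numerical companion to part 2(a): for $r < \tfrac12$, detailed balance $\pi_i\, r = \pi_{i+1}(1-r)$ suggests the geometric guess $\pi_k = (1-\rho)\rho^k$ with $\rho = r/(1-r) < 1$. This is a sketch of a sanity check, not a proof; it assumes the transition rule stated above (up with probability $r$, down or stay at $0$ with probability $1-r$) and truncates the state space at $N$ states, which introduces only a tiny edge error since $\rho^N$ is negligible.

```python
def stationary_check(r, N=200):
    # Candidate stationary distribution from detailed balance:
    # pi_k = (1 - rho) * rho**k, rho = r / (1 - r), for r < 1/2.
    rho = r / (1 - r)
    pi = [(1 - rho) * rho**k for k in range(N)]
    # Verify (pi P)_j = pi_j for j = 0, ..., N-2.
    err = 0.0
    for j in range(N - 1):
        if j == 0:
            # Mass entering 0: the self-loop at 0 plus a down-step from 1.
            mass = pi[0] * (1 - r) + pi[1] * (1 - r)
        else:
            # Mass entering j: up-step from j-1 plus down-step from j+1.
            mass = pi[j - 1] * r + pi[j + 1] * (1 - r)
        err = max(err, abs(mass - pi[j]))
    return err
```

For any $r < \tfrac12$, `stationary_check(r)` returns an error at the level of floating-point rounding, since the identity $\pi_{j-1} r + \pi_{j+1}(1-r) = \pi_j$ holds exactly in algebra for the geometric guess.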