MathIsimple

Stochastic Processes – Problem 16: Find the stationary distribution of this Markov chain

Question

Let $\mathcal{M}=\{(a_{1},\ldots,a_{n})\in\{0,1\}^{n}:a_{i}a_{i+1}=0\ \forall\, i=1,\ldots,n-1\}$. Consider the following Markov chain with state space $\mathcal{M}$: at each step, a coordinate $i$ is chosen uniformly at random; if all of its neighbors have value $0$ (if $i=1$ or $i=n$, there is only one neighbor), then $a_i$ is updated to $0$ with probability $1/2$ and to $1$ with probability $1/2$; otherwise the state remains unchanged.

(a) Find the stationary distribution of this Markov chain.

(b) As $t\to\infty$, given that the state contains exactly $\ell$ ones (assume $\ell\leq n/3$), find the limiting conditional probability that the state satisfies $a_{2i}=1$ for all $i=1,\ldots,\ell$ (and these are the only positions equal to $1$).
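Before solving, it can help to see the state space concretely. The following sketch (not part of the official solution) enumerates $\mathcal{M}$ for small $n$; the sizes follow the Fibonacci pattern $|\mathcal{M}| = F_{n+2}$, a classical fact about binary strings with no two adjacent ones.

```python
from itertools import product

def state_space(n):
    """All binary strings of length n with no two adjacent ones."""
    return [a for a in product((0, 1), repeat=n)
            if all(a[i] * a[i + 1] == 0 for i in range(n - 1))]

# |M| follows the Fibonacci recurrence: 2, 3, 5, 8, 13, ...
sizes = [len(state_space(n)) for n in range(1, 6)]
print(sizes)  # prints [2, 3, 5, 8, 13]
```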

Step-by-step solution

(a) Finding the stationary distribution.

We check whether the chain is reversible. For any two states $x, y \in \mathcal{M}$: if $x$ and $y$ differ in more than one coordinate, then $P(x,y) = P(y,x) = 0$, so detailed balance holds trivially.

If $x$ and $y$ differ only in coordinate $i$, say $x$ has $a_i=0$ and $y$ has $a_i=1$, then for both $x$ and $y$ to lie in $\mathcal{M}$, the neighbors of $i$ must all be $0$. In state $x$, an update at coordinate $i$ is therefore possible, and $P(x,y) = \frac{1}{n}\cdot\frac{1}{2} = \frac{1}{2n}$: coordinate $i$ is selected with probability $1/n$ and set to $1$ with probability $1/2$. In state $y$ the neighbors of $i$ are still $0$, so by the same computation $P(y,x) = 1/(2n)$.

Since $P(x,y) = P(y,x)$ for all $x,y$, the transition matrix is symmetric. Hence the uniform candidate $\pi(a) = 1/|\mathcal{M}|$ satisfies detailed balance: $\pi(x)P(x,y) = \pi(y)P(y,x)$.

The chain is irreducible: any state can reach any other by flipping coordinates one at a time through the all-zeros state. Since the state space is finite and the chain is irreducible, the stationary distribution is unique: $\pi(a) = \frac{1}{|\mathcal{M}|}$ for all $a \in \mathcal{M}$.
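The symmetry and stationarity claims above can be verified exactly for small $n$ by building the transition matrix. This is a sanity-check sketch, not part of the official solution:

```python
import numpy as np
from itertools import product

def transition_matrix(n):
    """Exact transition matrix of the chain on non-adjacent-ones strings."""
    states = [a for a in product((0, 1), repeat=n)
              if all(a[i] * a[i + 1] == 0 for i in range(n - 1))]
    idx = {s: k for k, s in enumerate(states)}
    P = np.zeros((len(states), len(states)))
    for s in states:
        for i in range(n):
            neighbors_zero = ((i == 0 or s[i - 1] == 0)
                              and (i == n - 1 or s[i + 1] == 0))
            if neighbors_zero:
                for v in (0, 1):  # resample a_i uniformly from {0, 1}
                    t = s[:i] + (v,) + s[i + 1:]
                    P[idx[s], idx[t]] += 1 / (2 * n)
            else:
                P[idx[s], idx[s]] += 1 / n  # update blocked: state unchanged
    return P, states

P, states = transition_matrix(4)
pi = np.full(len(states), 1 / len(states))
print(np.allclose(P, P.T))      # prints True: symmetric transition matrix
print(np.allclose(pi @ P, pi))  # prints True: uniform is stationary
```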

(b) Finding the limiting conditional probability.

The chain is irreducible and aperiodic (since $P(a,a)>0$ for every state $a$), so as $t\to\infty$ the distribution converges to $\pi$.

Let $A$ be the specific state with $a_{2i}=1$ for $i=1,\ldots,\ell$ and all other entries $0$, and let $C_\ell$ be the set of all states in $\mathcal{M}$ with exactly $\ell$ ones. Since $\pi$ is uniform: $P(A \mid C_\ell) = \frac{1}{|C_\ell|}$.

To count $|C_\ell|$, we must place $\ell$ ones among $n$ positions with no two adjacent. Of the $n-\ell$ zeros, $\ell-1$ are forced as separators between consecutive ones; the remaining $n-2\ell+1$ zeros can be distributed freely into the $\ell+1$ gaps around the ones. By stars and bars: $|C_\ell| = \binom{(n-2\ell+1)+\ell}{\ell} = \binom{n-\ell+1}{\ell}$.

Therefore the limiting conditional probability is $\frac{1}{\binom{n-\ell+1}{\ell}}$.
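The counting formula can be cross-checked by brute-force enumeration for small parameters; a quick sketch (not part of the official solution):

```python
from itertools import product
from math import comb

def count_states(n, l):
    """Brute-force count of length-n non-adjacent-ones strings with l ones."""
    return sum(1 for a in product((0, 1), repeat=n)
               if all(a[i] * a[i + 1] == 0 for i in range(n - 1))
               and sum(a) == l)

# Compare with the closed form C(n - l + 1, l) over a grid of (n, l)
checks = [(count_states(n, l), comb(n - l + 1, l))
          for n in range(2, 9) for l in range(0, n // 2 + 1)]
print(all(a == b for a, b in checks))  # prints True
```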

Final answer

(a) $\pi(a) = \frac{1}{|\mathcal{M}|}$ for all $a \in \mathcal{M}$. (b) $\frac{1}{\binom{n-\ell+1}{\ell}}$.

Marking scheme

The following is the rubric based on the official solution.


1. Checkpoints (max 7 pts total)

(a) Finding the stationary distribution (3 pts)

  • Transition probability symmetry: explicitly compute the transition probability between adjacent states as $P(x,y) = \frac{1}{2n}$ and note $P(x,y)=P(y,x)$ (or state that the transition matrix is symmetric). [1 pt]
  • Uniform distribution conclusion: use detailed balance (or double stochasticity) to conclude that $\pi$ is uniform on $\mathcal{M}$. [1 pt]
  • Uniqueness/ergodicity: Mention irreducibility to guarantee uniqueness of the stationary distribution. [1 pt]

(b) Finding the limiting conditional probability (4 pts)

  • Connecting the limit to the stationary distribution: state that as $t\to\infty$ the distribution converges to $\pi$, and since $\pi$ is uniform, the conditional probability equals $1/|C_\ell|$. [1 pt]
  • Combinatorial counting setup: establish the correct counting model for $|C_\ell|$ (e.g., stars and bars for non-adjacent ones, or the classical non-adjacent selection formula). [1 pt]
  • Counting result: correctly obtain $|C_\ell| = \binom{n-\ell+1}{\ell}$. [1 pt]
  • Final result: conclude that the probability is $\frac{1}{\binom{n-\ell+1}{\ell}}$ (or an equivalent form). [1 pt]

Total (max 7)


2. Zero-credit items

  • Merely copying the problem conditions or defining $\mathcal{M}$ without any probabilistic or combinatorial derivation.
  • In (b), incorrectly assuming the coordinates $a_i$ are mutually independent.
  • Giving only a specific numerical answer without a general formula (this is a parametric problem).
  • In (a), merely guessing uniform distribution without providing symmetry or balance equation justification.

3. Deductions

  • (a) Missing logic: Proving uniform is stationary but not mentioning irreducibility or uniqueness: -1 pt.
  • Combinatorial error: correct stars-and-bars logic but a wrong binomial coefficient (e.g., $\binom{n-\ell}{\ell}$): -1 pt.
  • Notation confusion: Confusing conditional probability with joint probability (e.g., failing to take the reciprocal): -1 pt.
  • Deductions cannot reduce a sub-part below 0.