MathIsimple

Stochastic Processes – Problem 51: prove that $\frac{Z_n}{\mu^n}\to W$ in the $L^2$ convergence sense as $n\to\infty$, and that $E[W]=1$

Question

Let $\{\xi_{n,i} : n\geqslant 0,\ i\geqslant 1\}$ be i.i.d. integer-valued random variables. Define the Galton--Watson branching process $\{Z_{n} : n\geqslant 0\}$ satisfying:

$$Z_{n}=\sum_{i=1}^{Z_{n-1}}\xi_{n-1,i},\qquad Z_{0}=1.$$

If $\mu = E[\xi_{n,i}] > 1$ and $\operatorname{var}(\xi_{n,i}) = \sigma^{2} > 0$, prove that $\frac{Z_{n}}{\mu^{n}}\rightarrow W$ in the $L^{2}$ convergence sense as $n\rightarrow\infty$, and that $E[W]=1$.

(Hint: consider the $L^{p}$ convergence theorem for $p=2$; one needs to show $\sup_{n} E\left[\left(\frac{Z_{n}}{\mu^{n}}\right)^{2}\right]<\infty$.)
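Before the formal proof, a quick numerical sketch may help build intuition for the claimed limit. The snippet below simulates the process with a Poisson(2) offspring law — an arbitrary illustrative choice satisfying $\mu>1$ and $\sigma^2>0$, not something the problem prescribes — and checks that the normalized population $X_n = Z_n/\mu^n$ has empirical mean near $1$.

```python
import math
import random

random.seed(0)
MU = 2.0  # Poisson(2) offspring: mean mu = 2 > 1, variance sigma^2 = 2 > 0 (illustrative choice)

def poisson(lam):
    # Knuth's inversion sampler for Poisson(lam), stdlib-only
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def gw_path(generations):
    """One trajectory Z_0, ..., Z_generations with Z_0 = 1."""
    z, path = 1, [1]
    for _ in range(generations):
        z = sum(poisson(MU) for _ in range(z))  # each individual reproduces independently
        path.append(z)
    return path

# Averaging X_N = Z_N / mu^N over many independent paths stays near E[X_N] = 1.
N = 8
paths = [gw_path(N) for _ in range(1000)]
mean_x = sum(p[N] for p in paths) / (1000 * MU**N)
print("empirical E[X_8]:", round(mean_x, 3))
```

Individual paths either die out ($X_n\to 0$) or grow, but the average over paths stays pinned at $1$, which is exactly the martingale property proved below.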

Step-by-step solution

Step 1. Define the natural filtration
$$\mathcal F_n:=\sigma\bigl(\xi_{k,i}:0\le k\le n-1,\ i\ge 1\bigr),$$
and set
$$X_n:=\frac{Z_n}{\mu^n},\qquad \mu=E[\xi_{n,i}]>1.$$
Since $Z_{n-1}$ is $\mathcal F_{n-1}$-measurable and the summands $\xi_{n-1,i}$ are independent of $\mathcal F_{n-1}$ with mean $\mu$, we have $E[Z_n\mid\mathcal F_{n-1}]=\mu Z_{n-1}$, and therefore
$$E[X_n\mid\mathcal F_{n-1}]=\frac{1}{\mu^n}E[Z_n\mid\mathcal F_{n-1}]=\frac{\mu Z_{n-1}}{\mu^n}=X_{n-1}.$$
Hence $(X_n)$ is a martingale with respect to $(\mathcal F_n)$.
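The key identity $E[Z_n\mid\mathcal F_{n-1}]=\mu Z_{n-1}$ can be checked numerically: conditioning on $Z_{n-1}=z$, the next generation is a sum of $z$ i.i.d. offspring counts, whose mean is $\mu z$. The sketch below uses a Poisson(3) offspring law as an illustrative assumption (any law with $\mu>1$, $\sigma^2>0$ would do).

```python
import math
import random

random.seed(2)
MU = 3.0  # Poisson(3) offspring law: an illustrative assumption, not forced by the problem

def poisson(lam):
    # Knuth's inversion sampler, stdlib-only
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mean_next_gen(z_prev, trials=20000):
    """Empirical E[Z_n | Z_{n-1} = z_prev]: average of z_prev-fold offspring sums."""
    total = sum(sum(poisson(MU) for _ in range(z_prev)) for _ in range(trials))
    return total / trials

for z_prev in (1, 4, 10):
    print(z_prev, mean_next_gen(z_prev))  # each should be close to MU * z_prev
```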

Step 2. Compute the second-moment recursion. Given $\mathcal F_{n-1}$, write $Z_n=\sum_{i=1}^{Z_{n-1}}\xi_{n-1,i}$ with i.i.d. summands of mean $\mu$ and variance $\sigma^2$. Hence
$$E[Z_n^2\mid\mathcal F_{n-1}]=\operatorname{Var}(Z_n\mid\mathcal F_{n-1})+\bigl(E[Z_n\mid\mathcal F_{n-1}]\bigr)^2=\sigma^2 Z_{n-1}+\mu^2 Z_{n-1}^2.$$
Dividing by $\mu^{2n}$, taking expectations, and using $E[Z_{n-1}]=\mu^{n-1}$:
$$E[X_n^2]=\frac{\sigma^2}{\mu^{2n}}E[Z_{n-1}]+\frac{\mu^2}{\mu^{2n}}E[Z_{n-1}^2]=\frac{\sigma^2}{\mu^{n+1}}+E[X_{n-1}^2].$$

Step 3. Sum the recursion from $1$ to $n$:
$$E[X_n^2]=E[X_0^2]+\sum_{k=1}^{n}\frac{\sigma^2}{\mu^{k+1}}\le 1+\frac{\sigma^2}{\mu(\mu-1)}<\infty,$$
where the geometric series converges precisely because $\mu>1$. So $\sup_n E[X_n^2]<\infty$, i.e. $(X_n)$ is bounded in $L^2$.
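As a deterministic sanity check on Steps 2–3, one can iterate the recursion $E[X_n^2]=E[X_{n-1}^2]+\sigma^2/\mu^{n+1}$ numerically and compare it with the closed-form geometric sum $1+\frac{\sigma^2(1-\mu^{-n})}{\mu(\mu-1)}$; the parameter values below are arbitrary.

```python
# Iterate E[X_n^2] = E[X_{n-1}^2] + sigma^2 / mu^(n+1) and compare with
# the closed form 1 + sigma^2 (1 - mu^-n) / (mu (mu - 1)).
MU, SIGMA2 = 2.0, 2.0  # any mu > 1, sigma^2 > 0 works

def second_moment_recursive(n):
    m = 1.0  # E[X_0^2] = 1 since Z_0 = 1
    for k in range(1, n + 1):
        m += SIGMA2 / MU ** (k + 1)
    return m

def second_moment_closed(n):
    return 1.0 + SIGMA2 * (1.0 - MU ** (-n)) / (MU * (MU - 1.0))

for n in (1, 5, 20, 60):
    print(n, second_moment_recursive(n), second_moment_closed(n))

# The uniform bound sup_n E[X_n^2] <= 1 + sigma^2 / (mu (mu - 1)) from Step 3:
bound = 1.0 + SIGMA2 / (MU * (MU - 1.0))
print("bound:", bound)
```

The two expressions agree for every $n$, and all values sit below the uniform bound, which is what makes the $L^2$ martingale convergence theorem applicable.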

Step 4. By the $L^2$ martingale convergence theorem, there exists a random variable $W\in L^2$ such that
$$X_n\to W\quad\text{a.s. and in }L^2.$$
Equivalently, $\frac{Z_n}{\mu^n}\to W$ in $L^2$.
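One way to see the $L^2$ convergence concretely: martingale increments are orthogonal in $L^2$, so for $m<n$, $E[(X_n-X_m)^2]=E[X_n^2]-E[X_m^2]=\sum_{k=m+1}^{n}\sigma^2/\mu^{k+1}$, a tail of a convergent series. The short computation below (same illustrative $\mu,\sigma^2$ as before) shows these tails shrink, i.e. $(X_n)$ is Cauchy in $L^2$.

```python
# For a martingale with orthogonal increments, E[(X_n - X_m)^2], m < n,
# equals the tail sum of sigma^2 / mu^(k+1) over k = m+1, ..., n.
MU, SIGMA2 = 2.0, 2.0  # illustrative parameters with mu > 1

def increment_l2(m, n):
    """E[(X_n - X_m)^2] from the Step 2 recursion, for m < n."""
    return sum(SIGMA2 / MU ** (k + 1) for k in range(m + 1, n + 1))

for m in (5, 10, 20):
    print(m, increment_l2(m, 2 * m))  # tails shrink rapidly as m grows
```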

Step 5. Since $L^2$-convergence implies $L^1$-convergence, $X_n\to W$ in $L^1$ as well, so we may pass expectations to the limit: $E[W]=\lim_{n\to\infty}E[X_n]$. But $(X_n)$ is a martingale with $X_0=1$, so $E[X_n]=E[X_0]=1$ for all $n$. Hence $E[W]=1$.

Therefore, for the supercritical Galton--Watson process with finite offspring variance,
$$\frac{Z_n}{\mu^n}\xrightarrow[n\to\infty]{L^2}W,\qquad E[W]=1.$$
QED.

Final answer

QED.

Marking scheme

The following is the rubric for undergraduate mathematics graders.


1. Checkpoints (max 7 pts)

  • Prove $\{X_n\}$ is a martingale (2 pts)
    ◦ Correctly derive $E[Z_n \mid \mathcal F_{n-1}] = \mu Z_{n-1}$ or $E[X_n \mid \mathcal F_{n-1}] = X_{n-1}$. [1 pt]
    ◦ State that $\{X_n\}$ is a martingale and note $E[X_n] = 1$. [1 pt]
  • Prove $L^2$ boundedness (key difficulty) (3 pts)
    ◦ Establish the second-moment/variance recurrence (e.g., $\operatorname{Var}(Z_n) = \sigma^2 \mu^{n-1} + \mu^2 \operatorname{Var}(Z_{n-1})$). [1 pt]
    ◦ Solve the recurrence to get an explicit expression for $E[X_n^2]$ (involving a geometric series). [1 pt]
    ◦ Use $\mu > 1$ to show the geometric series converges, hence $\sup_n E[X_n^2] < \infty$. [1 pt]
  • $L^2$ convergence conclusion (1 pt)
    ◦ Cite the $L^2$ martingale convergence theorem to conclude $X_n \to W$. [1 pt]
  • Limit of expectations (1 pt)
    ◦ State that $L^2$ convergence implies $L^1$ convergence (or uniform integrability), hence $E[W] = \lim E[X_n] = 1$. [1 pt]

Total (max 7)


2. Zero-credit items

  • Merely copying definitions without derivation.
  • Listing theorem names without verifying hypotheses.
  • Directly asserting convergence without proving $L^2$ boundedness.

3. Deductions

  • -1: Not mentioning $\mu > 1$ when showing series convergence.
  • -1: Not justifying $E[\lim X_n] = \lim E[X_n]$.
  • -1: Coefficient errors in the variance recurrence.
  • -1: Confusing $Z_n$ (population size) with $X_n$ (normalized variable).