MathIsimple

Probability Theory – Problem 72

Question

Given $(X_i)_{i=1}^{\infty}$ i.i.d. with $X_i \sim N(0,1)$. Let $M_n = \max\{X_1,\ldots,X_n\}$. Prove that $\frac{M_n}{\sqrt{2\log n}} \xrightarrow{a.s.} 1$.

Step-by-step solution

Step 1. To prove that $\frac{M_n}{\sqrt{2\log n}}$ converges almost surely to 1, it suffices to show that $\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1$ a.s. and $\liminf_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \ge 1$ a.s. Equivalently, for every $\epsilon > 0$, the following events have probability zero: $P\left(\frac{M_n}{\sqrt{2\log n}} > 1+\epsilon \ \text{i.o.}\right) = 0$ and $P\left(\frac{M_n}{\sqrt{2\log n}} < 1-\epsilon \ \text{i.o.}\right) = 0$, where "i.o." stands for "infinitely often."

Step 2. Prove $\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1$ a.s. By the first Borel--Cantelli lemma, if a sequence of events $A_n$ satisfies $\sum_{n=1}^\infty P(A_n) < \infty$, then $P(A_n \text{ i.o.}) = 0$. Fix $\epsilon > 0$ and write $x_n = (1+\epsilon)\sqrt{2\log n}$. Since $M_n = \max\{X_1,\dots,X_n\}$, the event $\{M_n > x\}$ equals $\bigcup_{i=1}^n \{X_i > x\}$, so the union bound and the i.i.d. assumption give $P(M_n > x_n) \le n\,P(X_1 > x_n)$. Using the standard normal tail bound $P(X_1 > x) < \frac{1}{x\sqrt{2\pi}}e^{-x^2/2}$ for $x > 0$, together with $x_n^2 = (1+\epsilon)^2(2\log n)$: $P(X_1 > x_n) < \frac{1}{(1+\epsilon)\sqrt{4\pi\log n}}\, n^{-(1+\epsilon)^2}$, and hence $P(M_n > x_n) \le \frac{1}{(1+\epsilon)\sqrt{4\pi\log n}}\, n^{1-(1+\epsilon)^2} = \frac{1}{(1+\epsilon)\sqrt{4\pi\log n}}\, n^{-2\epsilon-\epsilon^2}$. The exponent $-2\epsilon-\epsilon^2$ is negative, so this probability tends to 0; however, for small $\epsilon$ the exponent lies in $(-1,0)$ and the series over all $n$ need not converge. We therefore apply Borel--Cantelli along the subsequence $n_k = 2^k$. Let $A_k = \{M_{n_k} > (1+\epsilon)\sqrt{2\log n_k}\}$. The bound above gives $P(A_k) \le \frac{1}{(1+\epsilon)\sqrt{4\pi\log n_k}}\, 2^{-k(2\epsilon+\epsilon^2)}$, which is dominated by a convergent geometric series, so $\sum_k P(A_k) < \infty$ and $P(A_k \text{ i.o.}) = 0$. Thus almost surely $M_{n_k} \le (1+\epsilon)\sqrt{2\log n_k}$ for all sufficiently large $k$. For $n_k \le n < n_{k+1}$, monotonicity of $M_n$ and of $\sqrt{2\log n}$ give $\frac{M_n}{\sqrt{2\log n}} \le \frac{M_{n_{k+1}}}{\sqrt{2\log n_k}} \le (1+\epsilon)\sqrt{\frac{\log n_{k+1}}{\log n_k}} = (1+\epsilon)\sqrt{\frac{k+1}{k}}$, and $\sqrt{(k+1)/k} \to 1$. Hence $\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1+\epsilon$ a.s. Intersecting over a countable sequence $\epsilon \downarrow 0$ yields $\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1$ a.s.
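The Gaussian tail bound used above can be checked numerically against the exact tail, which is expressible via the complementary error function as $P(X > x) = \frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$. A minimal sketch (the helper names `normal_tail` and `mills_bound` are illustrative, not from the source):

```python
import math

def normal_tail(x):
    # Exact P(X > x) for X ~ N(0,1): erfc(x / sqrt(2)) / 2.
    return math.erfc(x / math.sqrt(2)) / 2

def mills_bound(x):
    # Upper bound P(X > x) < exp(-x^2/2) / (x * sqrt(2*pi)), valid for x > 0.
    return math.exp(-x * x / 2) / (x * math.sqrt(2 * math.pi))

# The bound should dominate the exact tail at every positive x.
for x in [0.5, 1.0, 2.0, 4.0, 8.0]:
    assert normal_tail(x) < mills_bound(x)
```

The bound is tight for large $x$ (the two sides agree to first order as $x \to \infty$), which is exactly the regime used in the proof.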

Step 3. Prove the lower bound along a subsequence. Fix $0 < \epsilon < 1$ and consider the exponentially growing subsequence $n_k = \lfloor \alpha^k \rfloor$ with $\alpha > 1$. Let $d_n = (1-\epsilon)\sqrt{2\log n}$. We first show $P(M_{n_k} < d_{n_k} \text{ i.o.}) = 0$. By independence, $P(M_{n_k} < d_{n_k}) = \left(P(X_1 < d_{n_k})\right)^{n_k} = \left(1 - P(X_1 > d_{n_k})\right)^{n_k}$. Using $1-x \le e^{-x}$: $P(M_{n_k} < d_{n_k}) \le \exp\left(-n_k\, P(X_1 > d_{n_k})\right)$. The standard normal tail also satisfies the lower bound $P(X_1 > x) \ge \left(\frac{1}{x} - \frac{1}{x^3}\right)\frac{e^{-x^2/2}}{\sqrt{2\pi}}$ for $x > 0$; applying it with $d_{n_k} = (1-\epsilon)\sqrt{2\log n_k}$ gives $n_k\, P(X_1 > d_{n_k}) \sim \frac{n_k^{2\epsilon-\epsilon^2}}{(1-\epsilon)\sqrt{4\pi\log n_k}}$. Since $2\epsilon-\epsilon^2 = \epsilon(2-\epsilon) > 0$, the right-hand side grows like $\alpha^{k\epsilon(2-\epsilon)}$ up to logarithmic factors, so $n_k\, P(X_1 > d_{n_k}) \to \infty$. Consequently $P(M_{n_k} < d_{n_k})$ decays faster than any geometric sequence in $k$, and $\sum_{k=1}^\infty P(M_{n_k} < d_{n_k}) < \infty$. By the Borel--Cantelli lemma, $P(M_{n_k} < d_{n_k} \text{ i.o.}) = 0$; that is, almost surely $M_{n_k} \ge d_{n_k}$ for all sufficiently large $k$.
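The growth of the key quantity $n_k\, P(X_1 > d_{n_k})$ can be inspected numerically. A sketch assuming the illustrative choices $\alpha = 2$ and $\epsilon = 0.1$ (the exact tail is again computed via `math.erfc`):

```python
import math

def normal_tail(x):
    # Exact P(X > x) for X ~ N(0,1).
    return math.erfc(x / math.sqrt(2)) / 2

alpha, eps = 2.0, 0.1  # illustrative parameters, not fixed by the proof
vals = []
for k in range(10, 31, 5):
    n_k = math.floor(alpha ** k)
    d = (1 - eps) * math.sqrt(2 * math.log(n_k))
    vals.append(n_k * normal_tail(d))

# n_k * P(X_1 > d_{n_k}) should grow without bound (like n_k^{eps(2-eps)}
# up to log factors), which makes exp(-n_k * P(X_1 > d_{n_k})) summable in k.
assert all(b > a for a, b in zip(vals, vals[1:]))
```

The growth is polynomial in $n_k$ with a small exponent $\epsilon(2-\epsilon)$, so it is slow for small $\epsilon$, but any positive exponent suffices for the summability argument.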

Step 4. Extend from the subsequence to the full sequence. For each integer $n \ge n_1$, choose $k$ with $n_k \le n < n_{k+1}$. Since $M_n$ is nondecreasing, $M_n \ge M_{n_k}$. By Step 3 there is an a.s. finite $K(\omega)$ such that $M_{n_k} \ge d_{n_k}$ for all $k > K(\omega)$; hence for all sufficiently large $n$: $\frac{M_n}{\sqrt{2\log n}} \ge \frac{M_{n_k}}{\sqrt{2\log n}} \ge \frac{d_{n_k}}{\sqrt{2\log n}} = (1-\epsilon)\sqrt{\frac{\log n_k}{\log n}}$. Since $\log n_k \le \log n < \log n_{k+1}$ and $\frac{\log n_k}{\log n_{k+1}} \to 1$ as $k \to \infty$, the squeeze theorem gives $\frac{\log n_k}{\log n} \to 1$. Therefore $\liminf_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \ge 1-\epsilon$ a.s. Since this holds for every $0 < \epsilon < 1$, intersecting over a countable sequence $\epsilon \downarrow 0$ yields $\liminf_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \ge 1$ a.s.
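The ratio limit $\frac{\log n_k}{\log n_{k+1}} \to 1$ is easy to verify numerically. A small sketch with the illustrative choice $\alpha = 1.5$:

```python
import math

alpha = 1.5  # any alpha > 1 works; 1.5 is an illustrative choice

# log(floor(alpha^k)) / log(floor(alpha^(k+1))) is roughly k/(k+1) -> 1.
ratios = [
    math.log(math.floor(alpha ** k)) / math.log(math.floor(alpha ** (k + 1)))
    for k in range(20, 200, 20)
]

assert all(0 < r < 1 for r in ratios)  # each ratio is strictly below 1
assert ratios[-1] > 0.99               # and it approaches 1 as k grows
```

This is the step that lets the gaps between consecutive subsequence points be filled: the normalizer $\sqrt{2\log n}$ changes negligibly between $n_k$ and $n_{k+1}$.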

Step 5. Combining Steps 2 and 4: $\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1$ a.s. and $\liminf_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \ge 1$ a.s. Both inequalities hold simultaneously on a set of probability one, so the limit exists and equals 1: $\lim_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} = 1$ a.s.
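The conclusion can be illustrated by simulation. A minimal sketch using only the standard library (the sample size and tolerance are illustrative; convergence is slow, since $M_n$ differs from $\sqrt{2\log n}$ by a correction of order $\log\log n/\sqrt{\log n}$):

```python
import math
import random

random.seed(0)  # fixed seed for reproducibility

# Draw n i.i.d. standard normals and compare M_n with sqrt(2 log n).
n = 200_000
m_n = max(random.gauss(0.0, 1.0) for _ in range(n))
ratio = m_n / math.sqrt(2 * math.log(n))

# The ratio tends to 1 a.s., but slowly: at feasible n it typically sits
# somewhat below 1 because of the negative log-log correction term.
assert 0.75 < ratio < 1.2
```

Increasing $n$ moves the ratio toward 1, but only logarithmically, which is consistent with the proof: the deviation probabilities decay like powers of $n$ only after the $\sqrt{2\log n}$ scaling.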

Final answer

QED.

Marking scheme

The following is the marking rubric based on the official solution. Please follow this rubric strictly.

1. Checkpoints (Total 7 pts)

Divide the proof into the following three logical modules. Points within each module are additive, and points across modules are also additive.

Part 1: Upper bound estimate ($\limsup_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \le 1$ a.s.) [Max 3]

  • Constructing the probability upper bound: Use the union bound $P(M_n > x) \le n\,P(X_1 > x)$ combined with the normal tail estimate ($P(X>x) \le e^{-x^2/2}/x$ or a similar form) to establish an inequality for $P(M_n > (1+\epsilon)\sqrt{2\log n})$. [1 pt]
  • Series convergence analysis: Substitute $x_n = (1+\epsilon)\sqrt{2\log n}$, and through algebraic manipulation show that the probability term decays at rate $n^{-p}$ (or a similar form), then argue that the relevant series converges, e.g. along a suitable subsequence. [1 pt]
  • Borel--Cantelli conclusion: Invoke the first Borel--Cantelli lemma to conclude that for every $\epsilon>0$, $\limsup \le 1+\epsilon$ holds almost surely. [1 pt]
  • *Note: If the student uses a subsequence method to prove the upper bound and the logic is correct, full credit is also awarded.*

Part 2: Lower bound subsequence estimate ($\liminf_{n\to\infty} \frac{M_n}{\sqrt{2\log n}} \ge 1$ a.s.) [Max 2]

  • Subsequence and probability bounding: Choose an exponentially growing subsequence $n_k$ (e.g., $\lfloor\alpha^k\rfloor$), and use independence together with $(1-p)^n \le e^{-np}$ to bound the probability of the event $\{M_{n_k} < (1-\epsilon)\sqrt{2\log n_k}\}$. [1 pt]
  • Subsequence convergence conclusion: Prove that the corresponding probability series converges, and by the Borel--Cantelli lemma conclude that $M_{n_k} \ge (1-\epsilon)\sqrt{2\log n_k}$ eventually holds almost surely. [1 pt]

Part 3: Density extension and combined conclusion (Extension to full sequence) [Max 2]

  • Using monotonicity to fill gaps: Use the nondecreasing property of $M_n$ to state that for $n_k \le n < n_{k+1}$, we have $M_n \ge M_{n_k}$. [1 pt]
  • Squeeze limit argument: By arguing that $\frac{\sqrt{\log n_k}}{\sqrt{\log n_{k+1}}} \to 1$ (or a similar ratio limit), extend the subsequence lower bound to the full sequence, and combine with Part 1 to state the final conclusion. [1 pt]

Total (max 7)


2. Zero-credit items

  • Merely copying the problem statement or listing the given conditions ($X_i \sim N(0,1)$).
  • Only computing the limit of $E[M_n]$ or $\mathrm{Var}(M_n)$ without addressing the probability series criterion for almost sure convergence.
  • Only citing the name "extreme value theory" or "Gumbel distribution" without any concrete series convergence proof.
  • Incorrectly asserting $X_n \to \infty$ or $X_n \to 0$ almost surely.

3. Deductions

For clear logical gaps or writing errors, apply the following single maximum deduction:

  • Missing logical quantifiers: The proof does not reflect "for every $\epsilon > 0$" or fails to let $\epsilon \to 0$ at the end, making the conclusion imprecise. (-1 pt)
  • Confusing convergence types: Proves only convergence in probability ($P(|\dots|>\epsilon) \to 0$) without verifying series convergence, i.e., does not actually prove almost sure convergence. (cap at 4/7)
  • Skipping the subsequence step: In the lower bound proof, applies the Borel--Cantelli lemma directly to all $n$ without using a subsequence (for which the series typically diverges, a logical error). (-2 pts)
  • Missing monotonicity: When extending from the subsequence to the full sequence, fails to mention the monotonicity of $M_n$ and directly assumes the limit exists. (-1 pt)