MathIsimple

Probability Theory – Problem 19

Question

Let $X_1, \ldots, X_n$ be i.i.d. random variables satisfying $P(X_i > x) = e^{-x}$, and let $M_n = \max_{1 \le m \le n} X_m$. Prove that:

(1) $\limsup_{n \to \infty} \frac{X_n}{\log n} = 1$ a.s.

(2) $\lim_{n \to \infty} \frac{M_n}{\log n} = 1$ a.s.

Step-by-step solution

(1) Proof that $\limsup_{n \to \infty} \frac{X_n}{\log n} = 1$ a.s.

Step 1. Prove that $\limsup_{n \to \infty} \frac{X_n}{\log n} \le 1$ a.s. By the definition of the limit superior, it suffices to show that for every $\epsilon > 0$, the event $\left\{ \frac{X_n}{\log n} > 1 + \epsilon \right\}$ occurs only finitely many times almost surely. By the first Borel–Cantelli lemma, it suffices to show that the series $\sum_{n=1}^\infty P\left(\frac{X_n}{\log n} > 1 + \epsilon\right)$ converges. Using the given condition $P(X_i > x) = e^{-x}$ with $x = (1+\epsilon)\log n$:

$$P\left(\frac{X_n}{\log n} > 1 + \epsilon\right) = P\big(X_n > (1+\epsilon)\log n\big) = e^{-(1+\epsilon)\log n} = n^{-(1+\epsilon)}.$$

Since $\epsilon > 0$, the series $\sum_{n=1}^\infty n^{-(1+\epsilon)}$ is a $p$-series with $p = 1+\epsilon > 1$, and therefore it converges. By the first Borel–Cantelli lemma, the event $\left\{ \frac{X_n}{\log n} > 1 + \epsilon \right\}$ occurs infinitely often with probability zero. That is, almost surely, $\limsup_{n \to \infty} \frac{X_n}{\log n} \le 1 + \epsilon$. Since $\epsilon$ can be taken arbitrarily small, we conclude that $\limsup_{n \to \infty} \frac{X_n}{\log n} \le 1$ a.s.
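As a numerical sanity check (a sketch, not part of the proof; the value $\epsilon = 0.5$ is an illustrative choice), the tail identity and the convergence of the resulting $p$-series can be verified directly:

```python
import math

eps = 0.5  # illustrative fixed epsilon > 0

# Tail identity: e^{-(1+eps) log n} equals n^{-(1+eps)}
for n in [2, 10, 1000]:
    lhs = math.exp(-(1 + eps) * math.log(n))
    rhs = n ** (-(1 + eps))
    assert abs(lhs - rhs) < 1e-12

# Partial sums of sum_n n^{-(1+eps)} stay below the integral bound
# 1 + integral_1^inf x^{-(1+eps)} dx = 1 + 1/eps, so the series converges.
partial = sum(n ** (-(1 + eps)) for n in range(1, 100_001))
print(partial)  # about 2.6 for eps = 0.5 (close to zeta(1.5))
assert partial < 1 + 1 / eps
```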

Step 2. Prove that $\limsup_{n \to \infty} \frac{X_n}{\log n} \ge 1$ a.s. It suffices to show that for every $0 < \epsilon < 1$, the event $A_n = \left\{ \frac{X_n}{\log n} > 1 - \epsilon \right\}$ occurs infinitely often almost surely. Since the $X_n$ are i.i.d., the events $A_n$ are mutually independent, so by the second Borel–Cantelli lemma it suffices to show that $\sum_{n=1}^\infty P(A_n)$ diverges. We compute:

$$P(A_n) = P\big(X_n > (1-\epsilon)\log n\big) = e^{-(1-\epsilon)\log n} = n^{-(1-\epsilon)}.$$

Since $0 < \epsilon < 1$, the series $\sum_{n=1}^\infty n^{-(1-\epsilon)}$ is a $p$-series with $p = 1-\epsilon < 1$, and therefore it diverges. By the second Borel–Cantelli lemma, $P(A_n \text{ i.o.}) = 1$. That is, almost surely, $\limsup_{n \to \infty} \frac{X_n}{\log n} \ge 1 - \epsilon$. Since $\epsilon$ can be taken arbitrarily small, we conclude that $\limsup_{n \to \infty} \frac{X_n}{\log n} \ge 1$ a.s.
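The divergence of $\sum n^{-(1-\epsilon)}$ can likewise be observed numerically: the integral comparison $\sum_{n=1}^N n^{-(1-\epsilon)} \ge (N^\epsilon - 1)/\epsilon$ shows the partial sums are unbounded (again a sketch with an illustrative $\epsilon$):

```python
import math

eps = 0.5  # illustrative fixed 0 < eps < 1

# Partial sums grow without bound: S_N >= (N^eps - 1)/eps -> infinity,
# by comparison with integral_1^{N+1} x^{-(1-eps)} dx.
for N in [100, 10_000, 1_000_000]:
    s = sum(n ** (-(1 - eps)) for n in range(1, N + 1))
    print(N, s)
    assert s >= (N ** eps - 1) / eps
```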

Step 3. Combining the conclusions of Step 1 and Step 2, namely $\limsup_{n \to \infty} \frac{X_n}{\log n} \le 1$ a.s. and $\limsup_{n \to \infty} \frac{X_n}{\log n} \ge 1$ a.s., we conclude that $\limsup_{n \to \infty} \frac{X_n}{\log n} = 1$ a.s.

(2) Proof that $\lim_{n \to \infty} \frac{M_n}{\log n} = 1$ a.s. To show that the limit exists and equals $1$, we prove separately that the limit superior is at most $1$ and the limit inferior is at least $1$.

Step 4. Prove that $\limsup_{n \to \infty} \frac{M_n}{\log n} \le 1$ a.s. Note that $M_n = \max_{1 \le m \le n} X_m$ is a non-decreasing sequence. From part (1), $\limsup_{n \to \infty} \frac{X_n}{\log n} = 1$ a.s., so for almost every sample point and any given $\delta > 0$ there exists a positive integer $N$ such that $X_n < (1+\delta)\log n$ for all $n > N$. For any $n > N$, decompose

$$M_n = \max\big(\max(X_1, \ldots, X_N),\ \max(X_{N+1}, \ldots, X_n)\big).$$

Let $C = \max(X_1, \ldots, X_N)$, which is a finite random variable. For $N < m \le n$ we have $X_m < (1+\delta)\log m \le (1+\delta)\log n$, hence $\max(X_{N+1}, \ldots, X_n) < (1+\delta)\log n$ and therefore $M_n \le \max(C, (1+\delta)\log n)$. Dividing by $\log n$:

$$\frac{M_n}{\log n} \le \frac{\max(C, (1+\delta)\log n)}{\log n} = \max\left(\frac{C}{\log n},\ 1+\delta\right).$$

As $n \to \infty$, $\frac{C}{\log n} \to 0$. Therefore, for all sufficiently large $n$, $\frac{M_n}{\log n} < 1+\delta$, which implies $\limsup_{n \to \infty} \frac{M_n}{\log n} \le 1+\delta$. Since $\delta$ is an arbitrary positive number, we conclude that $\limsup_{n \to \infty} \frac{M_n}{\log n} \le 1$ a.s.
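The prefix decomposition of Step 4 can be checked mechanically on a simulated path. In this sketch, $N$ is located for one concrete sample (part (1) guarantees such an $N$ exists almost surely), and $\delta = 0.5$ is an illustrative choice:

```python
import math
import random

random.seed(1)
delta = 0.5
xs = [random.expovariate(1.0) for _ in range(50_000)]  # X_1, ..., X_n

# Last index at which X_n >= (1+delta) log n; beyond this N the path
# satisfies X_n < (1+delta) log n, as in Step 4.
N = max(
    (i for i, x in enumerate(xs[1:], start=2) if x >= (1 + delta) * math.log(i)),
    default=1,
)

C = max(xs[:N])  # maximum over the finite prefix X_1, ..., X_N
M = 0.0
for n, x in enumerate(xs, start=1):
    M = max(M, x)  # running maximum M_n
    if n > N:
        # the Step-4 bound: M_n <= max(C, (1+delta) log n)
        assert M <= max(C, (1 + delta) * math.log(n))
print("bound holds up to n =", len(xs), "with N =", N)
```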

Step 5. Prove that $\liminf_{n \to \infty} \frac{M_n}{\log n} \ge 1$ a.s. It suffices to show that for every $0 < \epsilon < 1$, the event $\left\{ \frac{M_n}{\log n} < 1 - \epsilon \right\}$ occurs only finitely many times almost surely. By the first Borel–Cantelli lemma, it suffices to show that the series $\sum_{n=1}^\infty P\left(\frac{M_n}{\log n} < 1 - \epsilon\right)$ converges. Since the $X_i$ are i.i.d. with $P(X_1 < x) = 1 - P(X_1 > x) = 1 - e^{-x}$ for $x > 0$:

$$P\big(M_n < (1-\epsilon)\log n\big) = \prod_{i=1}^n P\big(X_i < (1-\epsilon)\log n\big) = \big[1 - e^{-(1-\epsilon)\log n}\big]^n = \big[1 - n^{-(1-\epsilon)}\big]^n.$$

Using the elementary inequality $1 - x \le e^{-x}$, we obtain the exact bound

$$\big[1 - n^{-(1-\epsilon)}\big]^n \le e^{-n \cdot n^{-(1-\epsilon)}} = e^{-n^\epsilon}.$$

Consider the series $\sum_{n=1}^\infty e^{-n^\epsilon}$. Since $\epsilon > 0$, $e^{-n^\epsilon}$ tends to zero faster than any polynomial $n^{-p}$: for example, comparing with the convergent series $\sum \frac{1}{n^2}$, we have $\lim_{n\to\infty} \frac{e^{-n^\epsilon}}{1/n^2} = \lim_{n\to\infty} \frac{n^2}{e^{n^\epsilon}} = 0$. Therefore $\sum_{n=1}^\infty e^{-n^\epsilon}$ converges, and hence so does $\sum_{n=1}^\infty P\big(M_n < (1-\epsilon)\log n\big)$.
By the first Borel–Cantelli lemma, the event $\left\{ \frac{M_n}{\log n} < 1 - \epsilon \right\}$ occurs infinitely often with probability zero. That is, almost surely, $\liminf_{n \to \infty} \frac{M_n}{\log n} \ge 1 - \epsilon$. Since $\epsilon$ can be taken arbitrarily small, we conclude that $\liminf_{n \to \infty} \frac{M_n}{\log n} \ge 1$ a.s.
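The approximation in Step 5 is in fact one-sided: since $1 - x \le e^{-x}$, the exact probability $(1 - n^{-(1-\epsilon)})^n$ never exceeds $e^{-n^\epsilon}$, and the partial sums of $\sum e^{-n^\epsilon}$ stay bounded. A numerical sketch with an illustrative $\epsilon$:

```python
import math

eps = 0.5  # illustrative fixed 0 < eps < 1

# Since 1 - x <= e^{-x}, we get (1 - n^{-(1-eps)})^n <= exp(-n^eps) exactly.
for n in [10, 100, 10_000]:
    exact = (1 - n ** (-(1 - eps))) ** n
    bound = math.exp(-(n ** eps))
    print(n, exact, bound)
    assert exact <= bound

# The dominating series sum_n exp(-n^eps) has bounded partial sums
# (for eps = 0.5 it lies below integral_0^inf e^{-sqrt(x)} dx = 2).
partial = sum(math.exp(-(n ** eps)) for n in range(1, 10_001))
assert partial < 2.0
```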

Step 6. Combining the conclusions of Step 4 and Step 5, namely $\limsup_{n \to \infty} \frac{M_n}{\log n} \le 1$ a.s. and $\liminf_{n \to \infty} \frac{M_n}{\log n} \ge 1$ a.s., we obtain $\lim_{n \to \infty} \frac{M_n}{\log n} = 1$ a.s.
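The theorem can also be illustrated (not proved) by simulation: drawing standard exponentials, for which $P(X > x) = e^{-x}$, the ratio $M_n/\log n$ drifts toward $1$, though slowly, since $M_n - \log n$ converges in distribution to a Gumbel law. A sketch:

```python
import math
import random

random.seed(0)
n_max = 200_000
checkpoints = {1_000, 10_000, 100_000, 200_000}
M = 0.0
for n in range(1, n_max + 1):
    M = max(M, random.expovariate(1.0))  # running maximum M_n
    if n in checkpoints:
        print(n, M / math.log(n))  # should hover near 1

# Loose sanity bounds: M_n / log n - 1 is Gumbel noise of size O(1/log n).
assert 0.5 < M / math.log(n_max) < 1.6
```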

Final answer

QED.

Marking scheme

The following is the grading rubric for this problem, with a total of 7 points.

1. Checkpoints (max 7 pts total)

Part (1): $\limsup X_n / \log n$ (3 pts)

  • Tail probability computation: correctly computes the general form of the tail probability $P(X_n > \lambda \log n) = n^{-\lambda}$, or the specific expression for $\lambda = 1 \pm \epsilon$. [1 pt]
  • Upper bound for the lim sup ($\le 1$): states that $\sum n^{-(1+\epsilon)}$ converges and invokes the first Borel–Cantelli lemma to conclude $\limsup \frac{X_n}{\log n} \le 1$ a.s. [1 pt]
  • Lower bound for the lim sup ($\ge 1$): states that $\sum n^{-(1-\epsilon)}$ diverges, explicitly mentions independence (i.i.d.), and invokes the second Borel–Cantelli lemma to conclude $\limsup \frac{X_n}{\log n} \ge 1$ a.s. [1 pt]

Part (2): $\lim M_n / \log n$ (4 pts)

  • Upper bound argument for the maximum ($\le 1$): uses the conclusion of part (1) (namely that $X_n$ is eventually almost surely less than $(1+\delta)\log n$) to deduce the asymptotic behavior of $M_n$. Key logical requirement: must handle the maximum over the finite prefix, i.e., show $M_n \le \max(C, (1+\delta)\log n)$ and that the constant term satisfies $C/\log n \to 0$. *If the conclusion is merely asserted to follow directly from (1) without addressing the effect of the first $N$ terms, award 1 point.* [2 pts]
  • Probability setup for the lower bound of the maximum ($\ge 1$): formulates the event $\{M_n < (1-\epsilon)\log n\}$ and uses independence to write its probability as $P(M_n < (1-\epsilon)\log n) = (1 - n^{-(1-\epsilon)})^n$. [1 pt]
  • Convergence analysis for the lower bound of the maximum: correctly analyzes the asymptotic behavior of this probability (e.g., bounds it by $e^{-n^\epsilon}$ or by another summable function) and establishes that the series converges; invokes the first Borel–Cantelli lemma to conclude $\liminf \frac{M_n}{\log n} \ge 1$ a.s. [1 pt]

Total (max 7)


2. Zero-credit items

  • Proving only convergence in probability: only computing $\lim_{n\to\infty} P(\dots) = 0$ without establishing almost sure convergence via series convergence (the Borel–Cantelli lemmas are the core of this problem).
  • Incorrect series convergence test: claiming that $\sum n^{-\epsilon}$ ($\epsilon \le 1$) converges.
  • Direct substitution: in Part (2), directly assuming $M_n \approx X_n$ without any probabilistic estimate or logical argument.
  • Citing the wrong lemma: in the upper bound proof of Part (2), attempting to apply BC1 directly to $M_n$, obtaining the divergent series $\sum n^{-\epsilon}$, and incorrectly claiming it converges.

3. Deductions

  • Missing independence statement (−1): when applying the second Borel–Cantelli lemma (in the lower bound proof of Part 1), failing to mention the "independence" or "i.i.d." condition.
  • Logical gap / incorrect conclusion (−1): in the lower bound proof of Part 2, directly asserting that the series converges without any asymptotic analysis of $(1 - n^{-(1-\epsilon)})^n$ (such as taking logarithms or using an exponential bound).
  • Confusing constants and variables (−1): treating $\epsilon$ as a quantity depending on $n$, or failing to state the arbitrariness of $\epsilon$, thereby compromising the rigor of the proof (deduct only when this seriously affects the proof structure).