MathIsimple

Probability Theory – Problem 71: Let $\{X_n\}$ be independent of the $\sigma$-field filtration $\{\mathcal{F}_n\}$, with $EX_n = 0$, $\sup_n E|X_n| < \infty$

Question

Let $\{X_n\}$ be independent of the $\sigma$-field filtration $\{\mathcal{F}_n\}$, with $EX_n = 0$ and $\sup_n E|X_n| < \infty$. Let $T$ be a stopping time with respect to the filtration $\{\mathcal{F}_n\}$, with $ET < \infty$. Then $E\sum_{j=1}^{T} X_j = 0$.

Step-by-step solution

Step 1. Let $S_T = \sum_{j=1}^{T} X_j$. Express this random-length sum as an infinite series using indicator functions:
$$S_T = \sum_{j=1}^{\infty} X_j I_{\{T \ge j\}},$$
where $I_{\{T \ge j\}}$ is the indicator function of the event $\{T \ge j\}$. The goal is to compute $E[S_T] = E\big[\sum_{j=1}^{\infty} X_j I_{\{T \ge j\}}\big]$.
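Note that this decomposition holds pathwise, before any expectations are taken. As a quick sanity check (a minimal sketch with hypothetical toy distributions, not part of the proof), the identity can be verified numerically on a single simulated path:

```python
import random

random.seed(0)

# Pathwise check of S_T = sum_{j>=1} X_j * 1{T >= j} on one simulated
# path; the distributions below are illustrative choices only.
HORIZON = 100
X = [random.uniform(-1.0, 1.0) for _ in range(HORIZON)]  # X_1, ..., X_HORIZON
T = random.randint(1, HORIZON)                           # a random time

lhs = sum(X[j - 1] for j in range(1, T + 1))             # sum_{j=1}^{T} X_j
rhs = sum(X[j - 1] for j in range(1, HORIZON + 1) if T >= j)
assert lhs == rhs  # same terms in the same order, so exactly equal
```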

Step 2. Verify the hypothesis of the Fubini--Tonelli theorem, i.e., show that $E\big[\sum_{j=1}^{\infty} |X_j| I_{\{T \ge j\}}\big] < \infty$. Since each term of the series is nonnegative, Tonelli's theorem allows the expectation and summation to be interchanged:
$$E\Big[\sum_{j=1}^{\infty} |X_j| I_{\{T \ge j\}}\Big] = \sum_{j=1}^{\infty} E\big[|X_j| I_{\{T \ge j\}}\big].$$

First clarify the measurability of the indicator $I_{\{T \ge j\}}$. Since $T$ is a stopping time with respect to $\{\mathcal{F}_n\}$, the event $\{T = k\}$ belongs to $\mathcal{F}_k$ for every $k \ge 1$. Therefore the event $\{T \le j-1\} = \bigcup_{k=1}^{j-1} \{T = k\}$ belongs to $\mathcal{F}_{j-1}$ (since $\mathcal{F}_k \subseteq \mathcal{F}_{j-1}$ for all $k \le j-1$). Hence its complement $\{T \ge j\}$ also belongs to $\mathcal{F}_{j-1}$, meaning $I_{\{T \ge j\}}$ is $\mathcal{F}_{j-1}$-measurable.

Now handle each term $E\big[|X_j| I_{\{T \ge j\}}\big]$ by the tower property of conditional expectation, conditioning on $\mathcal{F}_{j-1}$:
$$E\big[|X_j| I_{\{T \ge j\}}\big] = E\big[E[\,|X_j| I_{\{T \ge j\}} \mid \mathcal{F}_{j-1}]\big].$$
Since $I_{\{T \ge j\}}$ is $\mathcal{F}_{j-1}$-measurable, it can be factored out of the conditional expectation:
$$E\big[|X_j| I_{\{T \ge j\}}\big] = E\big[I_{\{T \ge j\}}\, E[\,|X_j| \mid \mathcal{F}_{j-1}]\big].$$
By hypothesis, $\{X_n\}$ is independent of $\{\mathcal{F}_n\}$, which implies that for every $j$, $X_j$ is independent of $\mathcal{F}_{j-1}$. Therefore the conditional expectation equals the unconditional expectation, $E[\,|X_j| \mid \mathcal{F}_{j-1}] = E|X_j|$. Substituting back:
$$E\big[|X_j| I_{\{T \ge j\}}\big] = E\big[I_{\{T \ge j\}}\big]\, E|X_j| = E|X_j|\, P(T \ge j).$$

Returning to the series: by hypothesis $\sup_n E|X_n| < \infty$, so there exists a constant $C < \infty$ such that $E|X_n| \le C$ for all $n$. Thus
$$\sum_{j=1}^{\infty} E|X_j|\, P(T \ge j) \le \sum_{j=1}^{\infty} C \cdot P(T \ge j) = C \sum_{j=1}^{\infty} P(T \ge j).$$
For a nonnegative integer-valued random variable $T$, the expectation can be expressed as $ET = \sum_{j=1}^{\infty} P(T \ge j)$. By hypothesis $ET < \infty$, so the series converges. In summary, $E\big[\sum_{j=1}^{\infty} |X_j| I_{\{T \ge j\}}\big] \le C \cdot ET < \infty$, and the interchange of expectation and summation is justified.
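The tail-sum formula $ET = \sum_{j \ge 1} P(T \ge j)$ used above can be checked in closed form for a concrete example. The sketch below (assuming $T \sim \text{Geometric}(p)$ on $\{1, 2, \dots\}$, so that $P(T \ge j) = (1-p)^{j-1}$ and $ET = 1/p$) compares a truncated tail sum against $1/p$:

```python
# Check ET = sum_{j>=1} P(T >= j) for T ~ Geometric(p) on {1, 2, ...},
# where P(T >= j) = (1 - p)**(j - 1) and ET = 1/p (toy example).
p = 0.25
tail_sum = sum((1 - p) ** (j - 1) for j in range(1, 10_000))
assert abs(tail_sum - 1 / p) < 1e-9  # truncation error is negligible
```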

Step 3. Compute the expectation. By the conclusion of Step 2:
$$E\Big[\sum_{j=1}^{T} X_j\Big] = E\Big[\sum_{j=1}^{\infty} X_j I_{\{T \ge j\}}\Big] = \sum_{j=1}^{\infty} E\big[X_j I_{\{T \ge j\}}\big].$$
For each term $E\big[X_j I_{\{T \ge j\}}\big]$, apply the same argument as in Step 2. By the tower property,
$$E\big[X_j I_{\{T \ge j\}}\big] = E\big[E[X_j I_{\{T \ge j\}} \mid \mathcal{F}_{j-1}]\big],$$
and factoring out the $\mathcal{F}_{j-1}$-measurable $I_{\{T \ge j\}}$,
$$E\big[X_j I_{\{T \ge j\}}\big] = E\big[I_{\{T \ge j\}}\, E[X_j \mid \mathcal{F}_{j-1}]\big].$$
Since $X_j$ is independent of $\mathcal{F}_{j-1}$ and $EX_j = 0$, we have $E[X_j \mid \mathcal{F}_{j-1}] = EX_j = 0$. Therefore each term is zero:
$$E\big[X_j I_{\{T \ge j\}}\big] = E\big[I_{\{T \ge j\}} \cdot 0\big] = 0.$$
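The factorization $E[X_j I_{\{T \ge j\}}] = E[X_j]\, P(T \ge j)$ that independence delivers can be illustrated numerically. The sketch below uses assumed toy distributions with a nonzero mean (so the factorization is visible, unlike in the problem where $EX_j = 0$): $X \sim \text{Uniform}(0,1)$ and $\{T \ge 2\}$ an independent fair-coin event, so $E[X]\,P(T \ge 2) = \tfrac12 \cdot \tfrac12$:

```python
import random

random.seed(2)

# Illustrate E[X * 1{T >= j}] = E[X] * P(T >= j) for X independent of
# the event {T >= j}.  Toy choices: X ~ Uniform(0, 1) (mean 1/2),
# T ~ Geometric(1/2) on {1, 2, ...}, j = 2, so P(T >= 2) = 1/2.
N = 400_000
acc = 0.0
for _ in range(N):
    x = random.uniform(0.0, 1.0)
    t_ge_2 = random.random() >= 0.5   # {T >= 2} = {first coin is tails}
    acc += x if t_ge_2 else 0.0
lhs = acc / N          # Monte Carlo estimate of E[X * 1{T >= 2}]
rhs = 0.5 * 0.5        # E[X] * P(T >= 2) = 1/4
assert abs(lhs - rhs) < 0.01
```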

Step 4. Combining the results for all terms, we obtain the conclusion. Since every term of the series is $0$, the total sum is also $0$:
$$E\sum_{j=1}^{T} X_j = \sum_{j=1}^{\infty} 0 = 0.$$
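The conclusion can also be sanity-checked by simulation (an illustrative sketch under an assumed concrete setup, not part of the proof): take $\mathcal{F}_n$ generated by a fair-coin sequence $\{Y_n\}$ independent of $\{X_n\}$, and let $T = \min\{n : Y_n = 1\}$, a stopping time with $ET = 2 < \infty$. The Monte Carlo average of $S_T$ should then be near $0$:

```python
import random

random.seed(1)

# Monte Carlo check of E[sum_{j=1}^T X_j] = 0.  The stopping time T is
# driven by coin flips Y_n independent of the X_j, matching the
# hypothesis that {X_n} is independent of the filtration.
N_PATHS, CAP = 200_000, 1_000
total = 0.0
for _ in range(N_PATHS):
    T = 1
    while random.random() >= 0.5 and T < CAP:  # flip Y_1, Y_2, ... until heads
        T += 1
    total += sum(random.uniform(-1.0, 1.0) for _ in range(T))  # add S_T
estimate = total / N_PATHS
assert abs(estimate) < 0.02  # standard error is roughly 0.002 here
```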

Final answer

$E\sum_{j=1}^{T} X_j = 0$. QED.

Marking scheme

The following is the marking rubric for this probability theory problem (total: 7 points).

1. Checkpoints (Total 7 pts)

  • Series Representation [1 pt]
  • Introduce indicator functions to convert the random-length sum $S_T$ into an infinite series: $\sum_{j=1}^{\infty} X_j I_{\{T \ge j\}}$ (or an equivalent form). [1 pt]
  • *(Note: If the student does not explicitly write this expression but correctly uses the indicator function logic for each term jj in subsequent derivations, credit may still be awarded.)*
  • Independence and Measurability Analysis [2 pts]
  • State that the event $\{T \ge j\}$ (or its complement $\{T < j\}$) belongs to the $\sigma$-field $\mathcal{F}_{j-1}$, or state that $I_{\{T \ge j\}}$ and $X_j$ are independent. [1 pt]
  • Use independence to factor the expectation: $E[X_j I_{\{T \ge j\}}] = E[X_j]\, P(T \ge j)$ (or perform a similar factorization under absolute values / conditional expectation). [1 pt]
  • Integrability Verification (Fubini Justification) [2 pts]
  • Key step (heavy lifting): Prove absolute convergence of the series to justify interchanging expectation EE and summation \sum.
  • Use the condition $\sup_n E|X_n| \le C$ to establish the inequality $E[\sum |X_j| I_{\{T \ge j\}}] \le C \sum P(T \ge j)$. [1 pt]
  • Invoke $ET = \sum P(T \ge j) < \infty$ to conclude finiteness of the sum, thereby establishing the applicability of the Fubini theorem or the dominated convergence theorem. [1 pt]
  • Computation and Conclusion [2 pts]
  • Based on the above verification, legitimately interchange summation and expectation: $E[\sum_{j=1}^{T} X_j] = \sum_{j=1}^{\infty} E[X_j]\, P(T \ge j)$. [1 pt]
  • Substitute $EX_j = 0$ to obtain the final result $0$. [1 pt]

Total (max 7)

2. Zero-credit items

  • Merely reciting "Wald's Identity" or its name without proving it under the specific conditions of this problem (non-identically distributed, independent of the $\sigma$-field filtration).
  • Merely copying the problem conditions (e.g., $EX_n = 0$, $ET < \infty$) without any derivation.
  • Simply asserting that "linearity of expectation" applies to the random-length sum $\sum_{j=1}^{T} X_j$ without any series expansion or convergence analysis.

3. Deductions

  • Logical gap:
  • Failing to mention "absolute convergence," "nonnegativity," or "Fubini's theorem" while directly interchanging the infinite series and expectation (even if the computation is correct, omitting the verification of Step 2 constitutes a logical gap), deduct 1 pt.
  • Incorrect assumption:
  • Incorrectly assuming $\{X_n\}$ are identically distributed (i.i.d.) and relying on $E|X_n| = E|X_1|$ in the proof (the problem only provides a uniform bound), deduct 1 pt.
  • Notational and conceptual errors:
  • Confusing random variables with constants (e.g., rewriting $E[\sum^{T} \dots]$ as $\sum^{E[T]} \dots$), deduct 2 pts.
  • Messy notation, e.g., keeping the random variable $T$ as the summation upper limit outside the expectation without introducing indicator functions, rendering the mathematical expression meaningless, deduct 1 pt.