Question
For random variables $X$ and $Y$ on a common probability space, define
$$ d(X, Y) = E\!\left[ \frac{|X - Y|}{1 + |X - Y|} \right]. $$
(1) Prove that $d$ defines a metric on the space of random variables (identified up to almost sure equality). That is, prove that $d$ satisfies positive definiteness, symmetry, and the triangle inequality.
(2) Prove that the random variables $X_n$ converge to $X$ in probability if and only if $d(X_n, X) \to 0$ as $n \to \infty$.
Step-by-step solution
Step 1. First, prove that $d$ satisfies the definition of a metric.
(i) Non-negativity and positive definiteness: Since $|X - Y| \ge 0$, the integrand $\frac{|X-Y|}{1+|X-Y|} \ge 0$, so $d(X, Y) \ge 0$ after taking expectations. If $d(X, Y) = 0$, then the non-negative random variable $\frac{|X-Y|}{1+|X-Y|}$ has expectation 0, which means it equals 0 almost surely, hence $|X - Y| = 0$ a.s., i.e., $X = Y$ a.s.
(ii) Symmetry: Clearly $|X - Y| = |Y - X|$, so $d(X, Y) = d(Y, X)$.
(iii) Triangle inequality: Consider the function $f(x) = \frac{x}{1+x}$. For $x \ge 0$, the derivative $f'(x) = \frac{1}{(1+x)^2} > 0$, so $f$ is monotonically increasing. For any real numbers $a, b$, by the absolute value inequality, $|a + b| \le |a| + |b|$. Let $u = |X - Z|$ and $v = |Z - Y|$, so $|X - Y| \le u + v$. Using the monotonicity of $f$ and the subadditivity inequality $f(u + v) = \frac{u}{1+u+v} + \frac{v}{1+u+v} \le \frac{u}{1+u} + \frac{v}{1+v} = f(u) + f(v)$, we obtain: $f(|X - Y|) \le f(u + v) \le f(u) + f(v)$. Taking expectations on both sides yields $d(X, Y) \le d(X, Z) + d(Z, Y)$. In summary, $d$ defines a metric on the space of random variables (in the sense of almost sure equality).
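As a numerical sanity check (not part of the proof), the metric can be estimated by Monte Carlo and the three properties verified on sampled random variables. The helper `d_hat` and the particular distributions below are illustrative choices, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # Monte Carlo sample size

# Three random variables sampled on a common probability space.
X = rng.normal(0.0, 1.0, n)
Y = rng.exponential(1.0, n)
Z = rng.uniform(-2.0, 2.0, n)

def d_hat(a, b):
    """Monte Carlo estimate of d(A, B) = E[|A - B| / (1 + |A - B|)]."""
    t = np.abs(a - b)
    return np.mean(t / (1.0 + t))

# Symmetry holds exactly (|a - b| = |b - a| per sample).
assert abs(d_hat(X, Y) - d_hat(Y, X)) < 1e-12
# Triangle inequality: it holds per sample point (by Step 1),
# hence also for the empirical averages, with no Monte Carlo slack needed.
assert d_hat(X, Y) <= d_hat(X, Z) + d_hat(Z, Y)
print(d_hat(X, Y), d_hat(X, Z) + d_hat(Z, Y))
```

Because the subadditivity argument in Step 1 holds pointwise, the empirical triangle inequality is exact for any sample, not just approximately true in the limit.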
Step 2. Proof of sufficiency: If $d(X_n, X) \to 0$, then for any given $\varepsilon > 0$, consider the event $\{|X_n - X| > \varepsilon\}$. On this event, by the monotonicity of $f(x) = \frac{x}{1+x}$, we have $f(|X_n - X|) > f(\varepsilon) = \frac{\varepsilon}{1+\varepsilon}$. By a generalization of Chebyshev's inequality (Markov's inequality) or directly using properties of expectation: $d(X_n, X) = E[f(|X_n - X|)] \ge E\big[f(|X_n - X|)\,\mathbf{1}_{\{|X_n - X| > \varepsilon\}}\big] \ge \frac{\varepsilon}{1+\varepsilon}\, P(|X_n - X| > \varepsilon)$. Rearranging: $P(|X_n - X| > \varepsilon) \le \frac{1+\varepsilon}{\varepsilon}\, d(X_n, X)$. As $n \to \infty$, since $d(X_n, X) \to 0$, we get $P(|X_n - X| > \varepsilon) \to 0$. That is, $X_n$ converges to $X$ in probability.
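The Markov-type bound $P(|X_n - X| > \varepsilon) \le \frac{1+\varepsilon}{\varepsilon}\, d(X_n, X)$ holds sample-by-sample on an empirical measure, so it can be checked exactly in simulation. The shrinking-noise sequence $X_n = X + n^{-1/2} N$ below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 100_000
X = rng.normal(0.0, 1.0, m)
noise = rng.normal(0.0, 1.0, m)
eps = 0.5

for n in (1, 10, 100, 1000):
    Xn = X + noise / np.sqrt(n)         # X_n -> X as n grows
    t = np.abs(Xn - X)
    d_n = np.mean(t / (1.0 + t))        # empirical d(X_n, X)
    p_n = np.mean(t > eps)              # empirical P(|X_n - X| > eps)
    # The Chebyshev/Markov-type bound from Step 2 (exact per sample):
    assert p_n <= (1.0 + eps) / eps * d_n + 1e-9
    print(f"n={n:5d}  d={d_n:.4f}  P(|Xn-X|>{eps})={p_n:.4f}")
```

Both columns shrink toward 0 together, with the probability column always controlled by $\frac{1+\varepsilon}{\varepsilon}$ times the metric column, as Step 2 predicts.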
Step 3. Proof of necessity: If $X_n \to X$ in probability, then for any given $\varepsilon > 0$, $P(|X_n - X| > \varepsilon) \to 0$. Split the expectation into two parts: $d(X_n, X) = E\big[f(|X_n - X|)\,\mathbf{1}_{\{|X_n - X| > \varepsilon\}}\big] + E\big[f(|X_n - X|)\,\mathbf{1}_{\{|X_n - X| \le \varepsilon\}}\big]$. For the first term, the integrand is always at most 1, so the first term is at most $P(|X_n - X| > \varepsilon)$. For the second term, when $|X_n - X| \le \varepsilon$, $f(|X_n - X|) \le \frac{\varepsilon}{1+\varepsilon} \le \varepsilon$, so the second term is at most $\varepsilon$. That is, $d(X_n, X) \le P(|X_n - X| > \varepsilon) + \varepsilon$. Since $X_n \to X$ in probability, as $n \to \infty$, $P(|X_n - X| > \varepsilon) \to 0$. Therefore $\limsup_{n \to \infty} d(X_n, X) \le \varepsilon$. Since $\varepsilon > 0$ is arbitrary, $d(X_n, X) \to 0$.
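The decomposition bound $d(X_n, X) \le P(|X_n - X| > \varepsilon) + \varepsilon$ also holds exactly on any empirical measure, since the integrand bounds in Step 3 are pointwise. A quick check, using an illustrative "large spike on a shrinking event" sequence (which converges in probability but not uniformly):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 100_000
X = rng.normal(0.0, 1.0, m)
eps = 0.1

for n in (2, 10, 100, 1000):
    # X_n differs from X by a spike of size 5 on an event of probability ~1/n,
    # so X_n -> X in probability even though the worst-case error stays large.
    spike = (rng.random(m) < 1.0 / n) * 5.0
    Xn = X + spike
    t = np.abs(Xn - X)
    d_n = np.mean(t / (1.0 + t))        # empirical d(X_n, X)
    p_n = np.mean(t > eps)              # empirical P(|X_n - X| > eps)
    # Decomposition bound from Step 3 (exact per sample):
    assert d_n <= p_n + eps + 1e-9
    print(f"n={n:5d}  P(|Xn-X|>{eps})={p_n:.4f}  d={d_n:.4f}")
```

The example also illustrates why the truncation in Step 3 is needed: the spikes keep $\sup_\omega |X_n - X|$ large, yet the metric still vanishes because the spike event has vanishing probability.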
Final answer
Both claims are established: $d$ is a metric on the space of random variables (up to almost sure equality), and $X_n \to X$ in probability if and only if $d(X_n, X) \to 0$. QED.
Marking scheme
The following is the rubric based on the official solution (maximum 7 points).
1. Checkpoints (Total max 7)
Part 1: Prove that $d$ is a metric (max 2 pts)
- Positive definiteness and symmetry [additive]
- State that $d(X, Y) \ge 0$ and $d(X, Y) = 0 \Leftrightarrow X = Y$ a.s. (equal almost surely), and briefly explain symmetry $d(X, Y) = d(Y, X)$.
- 1 pt
- Triangle inequality [additive]
- Use the monotonicity or subadditivity of the function $f(x) = \frac{x}{1+x}$ (i.e., $f(u + v) \le f(u) + f(v)$) to derive $d(X, Y) \le d(X, Z) + d(Z, Y)$.
- *If only the triangle inequality formula is stated without proving the core algebraic inequality, no credit is awarded.*
- 1 pt
Part 2: Prove the equivalence with convergence in probability (max 5 pts)
- Sufficiency proof: [additive]
- Use Chebyshev's inequality (or Markov's inequality) to establish the connection between $P(|X_n - X| > \varepsilon)$ and $d(X_n, X)$.
- Core logic: derive $P(|X_n - X| > \varepsilon) \le \frac{1+\varepsilon}{\varepsilon}\, d(X_n, X)$, or note that on the event $\{|X_n - X| > \varepsilon\}$ the integrand has the lower bound $\frac{\varepsilon}{1+\varepsilon}$.
- 2 pts
- Necessity proof:
- Score exactly one chain: take the maximum subtotal among the chains; do not add points across chains.
- `Chain A (Truncation/decomposition method)`
- Decompose the expectation: Split $d(X_n, X)$ into integrals over $\{|X_n - X| > \varepsilon\}$ and $\{|X_n - X| \le \varepsilon\}$ (or a similar approach). [1 pt]
- Bounding and taking limits: Correctly bound both parts (first part $\le P(|X_n - X| > \varepsilon)$, second part $\le \frac{\varepsilon}{1+\varepsilon} \le \varepsilon$), let $n \to \infty$, and then let $\varepsilon \to 0$ to show the limit is 0. [2 pts]
- `Chain B (Convergence theorem method)`
- Transfer of convergence in probability: State that $\frac{|X_n - X|}{1 + |X_n - X|} \to 0$ in probability. [1 pt]
- Citing a theorem: Invoke the Dominated Convergence Theorem (DCT) (dominated by 1) or the Bounded Convergence Theorem to conclude $d(X_n, X) \to 0$. [2 pts]
Total (max 7)
2. Zero-credit items
- Merely copying the metric definition formula or the definition of convergence in probability from the problem, without performing any specific derivation.
- When proving the triangle inequality, directly asserting that $|X - Y| \le |X - Z| + |Z - Y|$ implies the corresponding inequality for expectations, without addressing the effect of the denominator $1 + |X - Y|$.
- In Part 2, merely stating "convergence of expectation implies convergence in probability" or vice versa, without proving it specifically for the nonlinear metric $d$.
3. Deductions
- Ignoring almost sure equality (a.s.): When proving positive definiteness, if it is not stated that $d(X, Y) = 0 \Rightarrow X = Y$ holds only in the "almost sure" sense (or $P(X = Y) = 1$), deduct 1 point.
- Confusing convergence concepts: In the necessity proof, if convergence in probability is incorrectly treated as pointwise or almost sure convergence to $X$ in order to interchange limits and integrals directly, without mentioning subsequences or properties of convergence in probability, deduct 1 point (in Chain B, if DCT is used, the version under convergence in probability must be made explicit or the subsequence principle must be invoked; otherwise this is treated as a logical gap).
- Circular reasoning: Using the conclusion to be proved as a basis in the proof (e.g., directly using the fact that $d$ is a metric to prove convergence); that part receives 0 points.