MathIsimple

Probability Theory – Problem 77: Prove that the random variables $X_n Y_n$ converge in distribution to $cX$

Question

The random variables $X_n$ converge in distribution to $X$, and $Y_n$ converge in distribution to a positive constant $c$. Prove that the random variables $X_n Y_n$ converge in distribution to $cX$.

Step-by-step solution

Step 1. By hypothesis, $Y_n$ converges in distribution to the constant $c$, i.e., $Y_n \xrightarrow{d} c$. By a standard result in probability theory, convergence in distribution to a constant implies convergence in probability to that same constant. Hence $Y_n \xrightarrow{p} c$.

Step 2. Consider the sequence of bivariate random vectors $(X_n, Y_n)$. We have $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$. By the lemma underlying Slutsky's theorem (the convergence theorem for multidimensional random vectors), when one component converges in distribution to a random variable and the other converges in probability to a constant, the joint vector converges in distribution to the vector formed by the respective limits, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.

Step 3. Define the bivariate function $g(x, y) = xy$, which is continuous everywhere on $\mathbb{R}^2$. By the Continuous Mapping Theorem, if a sequence of random vectors satisfies $Z_n \xrightarrow{d} Z$ and $g$ is continuous at almost every point with respect to the distribution of $Z$, then $g(Z_n) \xrightarrow{d} g(Z)$.

Step 4. Let $Z_n = (X_n, Y_n)$ and $Z = (X, c)$. Applying the theorem, $g(X_n, Y_n) \xrightarrow{d} g(X, c)$. Substituting the function expression yields $X_n Y_n \xrightarrow{d} cX$.
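The conclusion can also be checked numerically. Below is a minimal Monte Carlo sketch (an illustration, not part of the proof); the specific choices are assumptions made only for this demo: $X_n$ is taken as the standardized mean of $n$ Uniform(0,1) draws (which converges in distribution to $N(0,1)$ by the CLT), and $Y_n = c + \text{noise}/n$ (which converges in probability to $c$), so the limit $cX$ is $N(0, c^2)$.

```python
import numpy as np
from math import erf

# Monte Carlo illustration (not a proof) of the statement:
# if X_n ->d X and Y_n ->p c > 0, then X_n * Y_n ->d c * X.
# The distributional choices below are assumptions made only for this demo.

rng = np.random.default_rng(0)
c = 2.0      # the positive constant
n = 500      # index along the sequence
m = 10_000   # number of Monte Carlo replications

u = rng.uniform(size=(m, n))
x_n = (u.mean(axis=1) - 0.5) * np.sqrt(12 * n)  # standardized: mean 0, variance 1
y_n = c + rng.normal(size=m) / n                # tightly concentrated around c
prod = x_n * y_n

# Sup-distance between the empirical CDF of X_n * Y_n and the CDF of
# c * X ~ Normal(0, c^2), evaluated on a grid.
grid = np.linspace(-8.0, 8.0, 401)
ecdf = (prod[None, :] <= grid[:, None]).mean(axis=1)
limit_cdf = 0.5 * (1.0 + np.vectorize(erf)(grid / (c * np.sqrt(2.0))))
ks = np.abs(ecdf - limit_cdf).max()
print(f"sup-distance between empirical and limit CDF: {ks:.4f}")
```

For moderate $n$ and a large number of replications the sup-distance is already close to the pure sampling noise of the empirical CDF, consistent with $X_n Y_n \xrightarrow{d} cX$.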

Final answer

QED.

Marking scheme

The following is the detailed marking scheme for this problem (total: 7 points).


1. Checkpoints (max 7 pts total)

Choose one of the following three paths that fully matches the student's approach; do not accumulate points across paths.

Chain A: Continuous Mapping Theorem Path (Official Solution)

  • Convergence in probability conversion [2 pts]: Explicitly state or prove: since $Y_n$ converges in distribution to a constant $c$, it follows that $Y_n$ converges in probability to $c$ ($Y_n \xrightarrow{p} c$).
  • *Note: If the key condition "constant" is not mentioned, causing a logical gap, no credit is awarded for this item.*
  • Joint distributional convergence [2 pts]: Using $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$, conclude that the bivariate random vector converges in distribution to $(X, c)$, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.
  • Continuous Mapping Theorem (CMT) application [3 pts]:
  • Construct the function $g(x, y) = xy$ and state its continuity (on $\mathbb{R}^2$ or on the support of the limit) [1 pt].
  • Apply the Continuous Mapping Theorem to conclude $g(X_n, Y_n) \xrightarrow{d} g(X, c)$, i.e., $X_n Y_n \xrightarrow{d} cX$ [2 pts].
  • *Note: Simply writing the conclusion $X_n Y_n \xrightarrow{d} cX$ without mentioning continuity or the mapping theorem earns no credit for this step.*
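For the first checkpoint, the conversion from distributional to probabilistic convergence can be written out explicitly. A standard sketch: for any $\varepsilon > 0$,

```latex
P\bigl(|Y_n - c| > \varepsilon\bigr)
  \le P(Y_n \le c - \varepsilon) + P(Y_n > c + \varepsilon)
  = F_{Y_n}(c - \varepsilon) + 1 - F_{Y_n}(c + \varepsilon)
  \xrightarrow[n \to \infty]{} F(c - \varepsilon) + 1 - F(c + \varepsilon)
  = 0 + 1 - 1 = 0,
```

where $F(x) = \mathbf{1}_{[c,\infty)}(x)$ is the distribution function of the constant $c$, which is continuous at every point except $x = c$; hence both $c - \varepsilon$ and $c + \varepsilon$ are continuity points and the convergence of $F_{Y_n}$ applies there. This is exactly where the "limit is a constant" condition is used.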

Chain B: Direct Slutsky's Theorem Path

  • Condition verification [3 pts]: Explicitly state that Slutsky's theorem requires one variable to converge in probability to a constant, and derive or declare $Y_n \xrightarrow{p} c$ from the hypothesis $Y_n \xrightarrow{d} c$.
  • *Key point: The student must demonstrate awareness of the distinction between "convergence in distribution" and "convergence in probability"; they cannot assume the two are equivalent by default.*
  • Theorem application [4 pts]: Accurately cite Slutsky's theorem (product form) to directly conclude $X_n Y_n \xrightarrow{d} cX$.
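For reference, the form of Slutsky's theorem cited here (product form included):

```latex
X_n \xrightarrow{d} X,\quad Y_n \xrightarrow{p} c \ (\text{a constant})
\;\Longrightarrow\;
X_n + Y_n \xrightarrow{d} X + c,\qquad
X_n Y_n \xrightarrow{d} cX,\qquad
X_n / Y_n \xrightarrow{d} X / c \ \ (c \neq 0).
```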

Chain C: Characteristic Function or First-Principles Approach

  • Convergence in probability conversion [2 pts]: Obtain $Y_n \xrightarrow{p} c$.
  • Analytic proof [5 pts]: Use characteristic functions to decompose and estimate, or rigorously prove convergence of the product using probability metric inequalities.
  • *If the argument contains serious logical flaws (e.g., incorrectly interchanging limits), this part receives 0 points.*
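One possible shape of the Chain C argument (a sketch, not the only admissible decomposition): bound the gap between the characteristic functions of $X_n Y_n$ and $cX_n$ using $|e^{ia} - e^{ib}| \le \min\{2, |a - b|\}$,

```latex
\bigl|\varphi_{X_n Y_n}(t) - \varphi_{X_n}(ct)\bigr|
  \le \mathbb{E}\bigl|e^{itX_nY_n} - e^{itcX_n}\bigr|
  \le \mathbb{E}\bigl[\min\{2,\; |t|\,|X_n|\,|Y_n - c|\}\bigr].
```

Split the expectation on the event $\{|X_n| \le M,\ |Y_n - c| \le \delta\}$: on this event the integrand is at most $|t| M \delta$, while the complement has probability at most $P(|X_n| > M) + P(|Y_n - c| > \delta)$, which is small uniformly in large $n$ by tightness of $\{X_n\}$ (a consequence of $X_n \xrightarrow{d} X$) and by $Y_n \xrightarrow{p} c$. Since $\varphi_{X_n}(ct) \to \varphi_X(ct) = \varphi_{cX}(t)$, it follows that $\varphi_{X_n Y_n}(t) \to \varphi_{cX}(t)$ pointwise, and Lévy's continuity theorem yields $X_n Y_n \xrightarrow{d} cX$.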

Total (max 7)


2. Zero-credit items

  • Merely copying the problem conditions (e.g., "$X_n \to X$, $Y_n \to c$").
  • Asserting the conclusion based solely on deterministic calculus limit rules ("the limit of a product equals the product of the limits") without citing any probabilistic limit theorem (such as Slutsky or CMT).
  • Claiming $Y_n \to c$ is "almost sure convergence" without proof (the problem only gives convergence in distribution; this is an incorrect implication).

3. Deductions

  • Incorrectly assuming independence: If the proof assumes $X_n$ and $Y_n$ are mutually independent (the problem does not state this), and this assumption is central to the proof (e.g., directly factoring the joint probability $P(AB) = P(A)P(B)$): score capped at 3/7 (only the first step receives credit).
  • Logical gap: In Chain A/B, directly using $Y_n \xrightarrow{d} c$ as $Y_n \xrightarrow{p} c$ without mentioning "because the limit is a constant": deduct 1 point.
  • Notation error: Confusing random variables (uppercase $X$) with specific values (lowercase $x$) in a way that affects semantic understanding: deduct 1 point.