MathIsimple

Probability Theory – Problem 51: Prove that $X_n Y_n$ converges in distribution to $cX$

Question

The random variable $X_n$ converges in distribution to $X$, and $Y_n$ converges in distribution to a positive constant $c$. Prove that the random variable $X_n Y_n$ converges in distribution to $cX$.

Step-by-step solution

Step 1. By hypothesis, $Y_n$ converges in distribution to the constant $c$, i.e., $Y_n \xrightarrow{d} c$. By a well-known property in probability theory, when a sequence of random variables converges in distribution to a constant, it also converges in probability to that constant. Hence $Y_n \xrightarrow{p} c$.

Step 2. Consider the sequence of two-dimensional random vectors $(X_n, Y_n)$. We have $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$. By the preliminary lemma to Slutsky's theorem (the convergence theorem for multidimensional random variables), when one component converges in distribution to a random variable and the other converges in probability to a constant, the joint vector converges in distribution to the vector formed by the respective limits, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.

Step 3. Define the bivariate function $g(x, y) = xy$, which is continuous everywhere on $\mathbb{R}^2$. By the Continuous Mapping Theorem, if a sequence of random vectors satisfies $Z_n \xrightarrow{d} Z$ and $g$ is continuous almost surely with respect to the distribution of $Z$, then $g(Z_n) \xrightarrow{d} g(Z)$.

Step 4. Let $Z_n = (X_n, Y_n)$ and $Z = (X, c)$. Applying the theorem gives $g(X_n, Y_n) \xrightarrow{d} g(X, c)$, that is, $X_n Y_n \xrightarrow{d} cX$.
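The proof above is complete as it stands, but the conclusion can be sanity-checked numerically. The sketch below (plain Python; the specific choices $X \sim N(0,1)$, $X_n = X + N(0,1)/\sqrt{n}$, $Y_n = c + N(0,1)/\sqrt{n}$ are illustrative assumptions, not part of the problem) compares the empirical CDF of $X_n Y_n$ with that of the claimed limit $cX$:

```python
import math
import random

random.seed(0)
n, m, c = 10_000, 100_000, 2.0  # index n, number of replications m, the positive constant c

prod, target = [], []
for _ in range(m):
    x = random.gauss(0.0, 1.0)                       # X ~ N(0, 1)
    x_n = x + random.gauss(0.0, 1.0) / math.sqrt(n)  # X_n -> X in distribution
    y_n = c + random.gauss(0.0, 1.0) / math.sqrt(n)  # Y_n -> c (a constant)
    prod.append(x_n * y_n)                           # sample of X_n * Y_n
    target.append(c * x)                             # sample of the limit cX

def ecdf(samples, t):
    """Empirical CDF of the samples at point t."""
    return sum(s <= t for s in samples) / len(samples)

# For large n the two empirical CDFs should nearly coincide at every point.
for t in (-2.0, 0.0, 2.0):
    print(f"t={t:+.1f}  F_prod={ecdf(prod, t):.4f}  F_cX={ecdf(target, t):.4f}")
```

With $n = 10{,}000$ the perturbations are negligible, so the two printed CDF columns agree to roughly Monte Carlo accuracy; this illustrates, but of course does not prove, the theorem.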

Final answer

Therefore $X_n Y_n \xrightarrow{d} cX$. QED.

Marking scheme

The following is the detailed grading rubric for this problem (maximum 7 points).


1. Checkpoints (max 7 pts total)

Select exactly one of the following three paths that fully matches the student's approach; do not combine points across paths.

Chain A: Continuous Mapping Theorem Path (Official Solution)

  • Convergence in probability conversion [2 pts]: Explicitly stating or proving that since $Y_n$ converges in distribution to the constant $c$, it follows that $Y_n$ converges in probability to $c$ ($Y_n \xrightarrow{p} c$).
  • *Note: If the key condition "constant" is not mentioned, causing a logical gap, no credit for this item.*
  • Joint distribution convergence [2 pts]: Using $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ to conclude that the two-dimensional random vector converges in distribution to $(X, c)$, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.
  • Continuous Mapping Theorem (CMT) application [3 pts]:
    • Constructing the function $g(x, y) = xy$ and noting its continuity (on $\mathbb{R}^2$ or on the support of the limit) [1 pt].
    • Applying the Continuous Mapping Theorem to obtain $g(X_n, Y_n) \xrightarrow{d} g(X, c)$, i.e., $X_n Y_n \xrightarrow{d} cX$ [2 pts].
  • *Note: If only the conclusion $X_n Y_n \xrightarrow{d} cX$ is stated without mentioning continuity or the mapping theorem, no credit for this step.*

Chain B: Direct Citation of Slutsky's Theorem Path

  • Condition verification [3 pts]: Explicitly stating that Slutsky's theorem requires one variable to converge in probability to a constant, and deriving or asserting $Y_n \xrightarrow{p} c$ from the hypothesis $Y_n \xrightarrow{d} c$.
  • *Key point: The student must demonstrate awareness of the distinction between "convergence in distribution" and "convergence in probability"; they cannot treat the two as equivalent by default.*
  • Theorem application [4 pts]: Accurately citing Slutsky's theorem (product form) to directly conclude $X_n Y_n \xrightarrow{d} cX$.

Chain C: Characteristic Functions or First-Principles Approach

  • Convergence in probability conversion [2 pts]: Obtaining $Y_n \xrightarrow{p} c$.
  • Analytical proof [5 pts]: Using characteristic functions to decompose and estimate, or rigorously proving the convergence of the product via probability metric inequalities.
  • *If the argument contains serious logical flaws (e.g., incorrectly interchanging limits), this part receives 0 points.*
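For reference, the estimate expected in Chain C typically rests on the decomposition below (a sketch of one standard route, not the only acceptable one). Writing $\varphi_W(t) = \mathbb{E}\,e^{itW}$ for the characteristic function of $W$,

$$
\bigl|\varphi_{X_n Y_n}(t) - \varphi_{cX}(t)\bigr|
\le \mathbb{E}\bigl|e^{itX_n Y_n} - e^{itcX_n}\bigr|
  + \bigl|\varphi_{X_n}(ct) - \varphi_X(ct)\bigr|,
$$

where the second term vanishes because $X_n \xrightarrow{d} X$, and, using $|e^{ia} - e^{ib}| \le \min(2, |a-b|)$, the first term satisfies, for any $M, \delta > 0$,

$$
\mathbb{E}\bigl|e^{itX_n Y_n} - e^{itcX_n}\bigr|
\le 2\,P(|X_n| > M) + 2\,P(|Y_n - c| > \delta) + |t|\,M\,\delta,
$$

which can be made arbitrarily small using the tightness of $\{X_n\}$ (from $X_n \xrightarrow{d} X$), the convergence $Y_n \xrightarrow{p} c$, and the arbitrariness of $\delta$.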

Total (max 7)


2. Zero-credit items

  • Merely copying the problem conditions (e.g., "$X_n \to X$, $Y_n \to c$").
  • Asserting the conclusion based solely on deterministic calculus limit rules ("the limit of a product equals the product of the limits") without citing any probabilistic limit theorem (such as Slutsky's theorem or the CMT).
  • Claiming $Y_n \to c$ is "almost sure convergence" without proof (the problem only gives convergence in distribution; this implication is incorrect).

3. Deductions

  • Incorrectly assuming independence: If the proof assumes $X_n$ and $Y_n$ are mutually independent (a condition not given in the problem), and this assumption is central to the proof (e.g., directly factoring the joint probability as $P(AB) = P(A)P(B)$): score capped at 3/7 (only the first step receives credit).
  • Logical gap: In Chain A/B, if $Y_n \xrightarrow{d} c$ is used directly as $Y_n \xrightarrow{p} c$ without any mention of "because the limit is a constant": deduct 1 point.
  • Notation error: Confusing random variables (uppercase $X$) with specific values (lowercase $x$) in a way that affects semantic understanding: deduct 1 point.