MathIsimple

Probability Theory – Problem 22: Prove that the random variables $X_n Y_n$ converge in distribution to $cX$

Question

Suppose the random variables $X_n$ converge in distribution to $X$, and $Y_n$ converge in distribution to a positive constant $c$. Prove that the random variables $X_n Y_n$ converge in distribution to $cX$.

Step-by-step solution

Step 1. By hypothesis, $Y_n$ converges in distribution to the constant $c$, i.e., $Y_n \xrightarrow{d} c$. By a standard result in probability theory, whenever a sequence of random variables converges in distribution to a constant, it also converges in probability to that constant. Hence $Y_n \xrightarrow{p} c$.

Step 2. Consider the sequence of bivariate random vectors $(X_n, Y_n)$. We have $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$. By the preliminary lemma underlying Slutsky's theorem (or by the joint convergence theorem for random vectors), when one component converges in distribution to a random variable and the other converges in probability to a constant, the pair converges jointly in distribution to the vector formed by their respective limits, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.

Step 3. Define the bivariate function $g(x, y) = xy$, which is continuous everywhere on $\mathbb{R}^2$. By the Continuous Mapping Theorem, if a sequence of random vectors satisfies $Z_n \xrightarrow{d} Z$ and $g$ is continuous almost everywhere with respect to the distribution of $Z$, then $g(Z_n) \xrightarrow{d} g(Z)$.

Step 4. Set $Z_n = (X_n, Y_n)$ and $Z = (X, c)$. Applying the above theorem yields $g(X_n, Y_n) \xrightarrow{d} g(X, c)$. Substituting the expression for $g$ gives $X_n Y_n \xrightarrow{d} cX$.
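The conclusion can be checked numerically. The sketch below (not part of the proof) uses NumPy with illustrative choices of my own: $X_n$ is the standardized mean of $n$ Exp(1) draws, so $X_n \xrightarrow{d} X \sim N(0,1)$ by the CLT, and $Y_n$ is the mean of $n$ exponential draws with mean $c$, so $Y_n \xrightarrow{p} c$ by the LLN. The product should then be approximately distributed as $cX \sim N(0, c^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 2.0        # the positive constant limit of Y_n
n = 1_000      # sample size driving both limits
reps = 5_000   # Monte Carlo repetitions

# X_n: standardized mean of n Exp(1) draws; by the CLT, X_n -> N(0,1) in distribution.
samples_x = rng.exponential(1.0, size=(reps, n))
x_n = np.sqrt(n) * (samples_x.mean(axis=1) - 1.0)

# Y_n: mean of n exponential draws with mean c; by the LLN, Y_n -> c in probability.
samples_y = rng.exponential(c, size=(reps, n))
y_n = samples_y.mean(axis=1)

prod = x_n * y_n  # should be approximately distributed as cX ~ N(0, c^2)

print(f"mean of X_n Y_n : {prod.mean():+.3f}  (target 0)")
print(f"std  of X_n Y_n : {prod.std():.3f}   (target c = {c})")
```

With these choices the empirical mean is close to 0 and the empirical standard deviation close to $c$, consistent with $X_n Y_n \xrightarrow{d} cX$.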

Final answer

Therefore $X_n Y_n \xrightarrow{d} cX$. QED.

Marking scheme

The following is the detailed marking scheme for this problem (maximum 7 points).


1. Checkpoints (max 7 pts total)

Select exactly one of the following three chains that matches the student's approach; do not combine points across chains.

Chain A: Continuous Mapping Theorem Path (Official Solution)

  • Conversion to convergence in probability [2 pts]: Explicitly state or prove that since $Y_n$ converges in distribution to the constant $c$, it follows that $Y_n$ converges in probability to $c$ ($Y_n \xrightarrow{p} c$).
  • *Note: If the key condition that the limit is a constant is not mentioned, resulting in a logical gap, no credit is awarded for this item.*
  • Joint distributional convergence [2 pts]: Using $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$, conclude that the bivariate random vector converges in distribution to $(X, c)$, i.e., $(X_n, Y_n) \xrightarrow{d} (X, c)$.
  • Application of the Continuous Mapping Theorem (CMT) [3 pts]:
  • Construct the function $g(x, y) = xy$ and note its continuity (on $\mathbb{R}^2$ or on the support of the limit) [1 pt].
  • Apply the CMT to obtain $g(X_n, Y_n) \xrightarrow{d} g(X, c)$, i.e., $X_n Y_n \xrightarrow{d} cX$ [2 pts].
  • *Note: If the student merely states the conclusion $X_n Y_n \xrightarrow{d} cX$ without mentioning continuity or the mapping theorem, no credit is awarded for this step.*

Chain B: Direct Application of Slutsky's Theorem

  • Verification of conditions [3 pts]: Explicitly state that Slutsky's theorem requires one of the variables to converge in probability to a constant, and derive or assert $Y_n \xrightarrow{p} c$ from the hypothesis $Y_n \xrightarrow{d} c$.
  • *Key point: The student must demonstrate awareness of the distinction between convergence in distribution and convergence in probability; the two cannot be treated as equivalent by default.*
  • Application of the theorem [4 pts]: Correctly cite Slutsky's theorem (product form) to directly conclude $X_n Y_n \xrightarrow{d} cX$.

Chain C: Characteristic Functions or First-Principles Approach

  • Conversion to convergence in probability [2 pts]: Establish $Y_n \xrightarrow{p} c$.
  • Analytical proof [5 pts]: Use characteristic functions to decompose and estimate, or employ probability metric inequalities to rigorously prove convergence of the product.
  • *If the argument contains a serious logical flaw (e.g., incorrectly interchanging limits), award 0 points for this part.*
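For reference, the characteristic-function estimate in Chain C is typically organized as follows; this is one common way to set up the decomposition, not the only acceptable one:

```latex
\left|\varphi_{X_n Y_n}(t) - \varphi_{cX}(t)\right|
\;\le\;
\underbrace{\mathbb{E}\bigl|e^{itX_nY_n} - e^{itcX_n}\bigr|}_{(\mathrm{I})}
\;+\;
\underbrace{\bigl|\mathbb{E}\,e^{itcX_n} - \mathbb{E}\,e^{itcX}\bigr|}_{(\mathrm{II})}
```

Here $(\mathrm{II}) \to 0$ because $cX_n \xrightarrow{d} cX$ (the map $x \mapsto cx$ is continuous). For $(\mathrm{I})$, using $|e^{ia} - e^{ib}| \le \min(|a - b|, 2)$ and splitting on the events $\{|X_n| \le M\}$ and $\{|Y_n - c| \le \delta\}$ gives $(\mathrm{I}) \le |t| M \delta + 2P(|X_n| > M) + 2P(|Y_n - c| > \delta)$, where the tightness of $\{X_n\}$ (implied by $X_n \xrightarrow{d} X$) controls the middle term and $Y_n \xrightarrow{p} c$ controls the last.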

Total (max 7)


2. Zero-credit items

  • Merely copying the hypotheses (e.g., "$X_n \to X$, $Y_n \to c$").
  • Asserting the conclusion based solely on deterministic calculus limit rules ("the limit of a product equals the product of the limits") without invoking any probabilistic limit theorem (such as Slutsky's theorem or the CMT).
  • Claiming that $Y_n \to c$ holds almost surely without proof (the problem only gives convergence in distribution; this implication is incorrect).

3. Deductions

  • Incorrect independence assumption: If the proof assumes that $X_n$ and $Y_n$ are independent (a condition not given in the problem), and this assumption is central to the argument (e.g., directly factoring the joint probability as $P(AB) = P(A)P(B)$): cap the score at 3/7 (only the first step receives credit).
  • Logical gap: In Chain A/B, if the student directly uses $Y_n \xrightarrow{d} c$ as $Y_n \xrightarrow{p} c$ without mentioning that the limit is a constant: deduct 1 point.
  • Notational error: Confusing random variables (uppercase $X$) with deterministic values (lowercase $x$) in a way that affects semantic clarity: deduct 1 point.