MathIsimple

Probability Theory – Problem 14: Find the Distributions of $U$ and $V$

Question

Let $X_1, X_2 \overset{\text{i.i.d.}}{\sim} \mathcal{E}(1)$, and let $\displaystyle U = \frac{X_1}{X_1 + X_2}$, $V = X_1 + X_2$. Find the distributions of $U$ and $V$. Furthermore, determine whether $U$ and $X_1$ are independent.

Step-by-step solution

Given that $X_1, X_2$ are independent and identically distributed as the exponential distribution $\mathcal{E}(1)$, their probability density functions are
$$f_{X_1}(x) = e^{-x}, \quad x > 0, \qquad f_{X_2}(y) = e^{-y}, \quad y > 0.$$
Since $X_1, X_2$ are mutually independent, their joint probability density function is
$$f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2) = e^{-x_1} e^{-x_2} = e^{-(x_1+x_2)}, \quad x_1 > 0,\ x_2 > 0.$$
The transformation is given by
$$u = \frac{x_1}{x_1 + x_2}, \qquad v = x_1 + x_2.$$
We need the inverse transformation, i.e., $x_1, x_2$ expressed in terms of $u, v$: from $v = x_1 + x_2$ and $u = x_1/v$, we obtain $x_1 = uv$. Solving for $x_2$ gives $x_2 = v - x_1 = v - uv = v(1-u)$. The inverse transformation is therefore $x_1 = uv$, $x_2 = v(1-u)$.

Since $x_1 > 0$ and $x_2 > 0$, we have $v = x_1 + x_2 > 0$. Moreover, $u = \frac{x_1}{x_1 + x_2}$ satisfies $0 < u < 1$, because $x_1 > 0$ and $x_1 + x_2 > x_1$. Therefore, the support of the new variables $(U, V)$ is $\{(u,v) \mid 0 < u < 1,\ v > 0\}$. We compute the Jacobian determinant $J$:
$$J = \det \begin{pmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\[2pt] \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{pmatrix} = \det \begin{pmatrix} v & u \\ -v & 1-u \end{pmatrix} = v(1-u) - u(-v) = v - uv + uv = v.$$
Since $v > 0$, the absolute value of the Jacobian is $|J| = v$.
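As an optional check (not part of the required solution), the inverse transformation and the Jacobian determinant above can be verified symbolically, for instance with SymPy:

```python
import sympy as sp

u, v = sp.symbols("u v", positive=True)

# Inverse transformation: x1 = u*v, x2 = v*(1 - u)
x1 = u * v
x2 = v * (1 - u)

# Jacobian matrix of (x1, x2) with respect to (u, v)
J = sp.Matrix([[sp.diff(x1, u), sp.diff(x1, v)],
               [sp.diff(x2, u), sp.diff(x2, v)]])

detJ = sp.simplify(J.det())
print(detJ)  # should print: v

# Sanity check: the forward transformation recovers (u, v)
assert sp.simplify(x1 / (x1 + x2)) == u
assert sp.simplify(x1 + x2) == v
```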

By the change-of-variables formula for multivariate densities, $f_{U,V}(u,v) = f_{X_1,X_2}\bigl(x_1(u,v),\, x_2(u,v)\bigr)\,|J|$, we obtain the joint density of $(U,V)$:
$$f_{U,V}(u,v) = e^{-(uv + v(1-u))} \cdot v = v e^{-v}, \quad 0 < u < 1,\ v > 0.$$

For $U$, we integrate the joint density over $v$ on $(0, \infty)$:
$$f_U(u) = \int_0^\infty f_{U,V}(u,v)\, dv = \int_0^\infty v e^{-v}\, dv = \Gamma(2) = (2-1)! = 1, \quad 0 < u < 1.$$
Hence $f_U(u) = 1$ for $0 < u < 1$, which shows that $U$ follows the uniform distribution on $(0,1)$.
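A quick Monte Carlo sanity check of this result (an optional illustration, not required by the solution): simulating $U = X_1/(X_1+X_2)$ should reproduce the uniform distribution's mean $1/2$, variance $1/12$, and CDF $P(U \le t) = t$.

```python
import random

random.seed(0)
N = 200_000

# Simulate U = X1/(X1+X2) for X1, X2 i.i.d. Exponential(1)
us = []
for _ in range(N):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    us.append(x1 / (x1 + x2))

# For U ~ Uniform(0,1): mean 1/2, variance 1/12, P(U <= 0.25) = 0.25
mean = sum(us) / N
var = sum((u - mean) ** 2 for u in us) / N
p_quarter = sum(1 for u in us if u <= 0.25) / N
print(mean, var, p_quarter)
```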

For $V$, we integrate the joint density over $u$ on $(0, 1)$:
$$f_V(v) = \int_0^1 f_{U,V}(u,v)\, du = v e^{-v} \int_0^1 du = v e^{-v}, \quad v > 0.$$
This is the probability density function of the Gamma distribution with shape parameter $\alpha = 2$ and scale parameter $\beta = 1$.
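The $V$ marginal can be checked the same way (again an optional illustration): $\Gamma(2,1)$ has mean $\alpha\beta = 2$ and variance $\alpha\beta^2 = 2$, which a simulation of $V = X_1 + X_2$ should reproduce.

```python
import random

random.seed(1)
N = 200_000

# Simulate V = X1 + X2 for X1, X2 i.i.d. Exponential(1)
vs = [random.expovariate(1.0) + random.expovariate(1.0) for _ in range(N)]

# Gamma(alpha=2, beta=1) has mean alpha*beta = 2 and variance alpha*beta^2 = 2
mean = sum(vs) / N
var = sum((v - mean) ** 2 for v in vs) / N
print(mean, var)
```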

Independence of two random variables is equivalent to the conditional distribution of one given the other not depending on the conditioning value; in particular, if $U$ and $X_1$ were independent, the conditional expectation $\mathbb{E}[U \mid X_1 = x_1]$ would be constant in $x_1$. We compute this conditional expectation:
$$\mathbb{E}[U \mid X_1 = x_1] = \mathbb{E}\!\left[\frac{X_1}{X_1+X_2} \;\middle|\; X_1 = x_1\right] = \mathbb{E}\!\left[\frac{x_1}{x_1+X_2}\right],$$
where the last step uses the independence of $X_1$ and $X_2$, so the expectation is taken with respect to the distribution of $X_2$:
$$\mathbb{E}\!\left[\frac{x_1}{x_1+X_2}\right] = \int_0^\infty \frac{x_1}{x_1+x_2}\, f_{X_2}(x_2)\, dx_2 = \int_0^\infty \frac{x_1}{x_1+x_2}\, e^{-x_2}\, dx_2.$$
This integral is a function of $x_1$, not a constant: as $x_1 \to 0^+$ the integrand tends to $0$, so the expectation tends to $0$; as $x_1 \to \infty$ the integrand increases to $e^{-x_2}$, so by dominated convergence the expectation tends to $1$. Since $\mathbb{E}[U \mid X_1 = x_1]$ depends on $x_1$, $U$ and $X_1$ are not independent.
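The dependence between $U$ and $X_1$ is also visible numerically. The sketch below (an optional illustration; the cutoffs $0.1$ and $3.0$ are arbitrary choices) estimates the conditional mean of $U$ on the events $\{X_1 < 0.1\}$ and $\{X_1 > 3\}$; under independence the two estimates would both be close to $\mathbb{E}[U] = 1/2$.

```python
import random

random.seed(2)
N = 300_000

# Collect samples of U on two conditioning events for X1
small, large = [], []
for _ in range(N):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    u = x1 / (x1 + x2)
    if x1 < 0.1:       # condition on a small value of X1
        small.append(u)
    elif x1 > 3.0:     # condition on a large value of X1
        large.append(u)

# Under independence both conditional means would equal E[U] = 1/2
mean_small = sum(small) / len(small)
mean_large = sum(large) / len(large)
print(mean_small, mean_large)
```

The gap between the two conditional means mirrors the limiting argument above: small $x_1$ pushes $U$ toward $0$, large $x_1$ toward $1$.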

Final answer

The distribution of $U$ is the uniform distribution on $(0,1)$, i.e., $U \sim \mathcal{U}(0,1)$, with probability density function $f_U(u) = 1$, $0 < u < 1$. The distribution of $V$ is the Gamma distribution with parameters $\alpha = 2$, $\beta = 1$, i.e., $V \sim \Gamma(2,1)$, with probability density function $f_V(v) = v e^{-v}$, $v > 0$. $U$ and $X_1$ are not independent.

Marking scheme

The following is the detailed marking scheme for this probability theory problem (maximum score: 7 points).


I. Checkpoints (max 7 pts)

Part 1: Distributions of $U$ and $V$ (5 points total)

  • Change of variables and Jacobian determinant [1 point]
  • Write the correct inverse transformation $x_1 = uv$, $x_2 = v(1-u)$ AND compute the absolute value of the Jacobian $|J| = v$.
  • *If only the formula is stated without substitution or the result is incorrect, award 0 points.*
  • Joint probability density function of $(U, V)$ [2 points]
  • [1 point] Obtain the correct expression $f_{U,V}(u,v) = v e^{-v}$ (or equivalently $e^{-v} \cdot v$).
  • [1 point] Explicitly state the correct support: $0 < u < 1$ and $v > 0$.
  • Marginal distributions of $U$ and $V$ [2 points]
  • [1 point] Derive the marginal density $f_U(u) = 1$ (or identify it as the uniform distribution $\mathcal{U}(0,1)$), with the range $0 < u < 1$ stated.
  • [1 point] Derive the marginal density $f_V(v) = v e^{-v}$ (or identify it as the $\Gamma(2,1)$ distribution), with the range $v > 0$ stated.
  • *Note: If the student directly deduces the marginal distributions from the separable form of the joint density $f_{U,V}(u,v) = 1 \cdot v e^{-v}$ via the factorization theorem, award full marks.*

Part 2: Independence of $U$ and $X_1$ (2 points total)

  • Reasoning and derivation [1 point]
  • Establish a valid basis for the determination via any of the following approaches:
  • Approach A (Conditional expectation/probability): Set up the expression for $\mathbb{E}[U \mid X_1]$ or $P(U \le u \mid X_1)$.
  • Approach B (Joint distribution): Derive the joint distribution of $(U, X_1)$ and compare it with the product of the marginal distributions.
  • Approach C (Qualitative analysis): Use limiting arguments (e.g., as $X_1 \to 0$ vs. $X_1 \to \infty$, the behavior of $U$ differs) to show that the conditional distribution depends on $X_1$.
  • *Merely asserting that "the formula contains $X_1$" as an intuitive argument without mathematical justification earns 0 points for this item.*
  • Conclusion [1 point]
  • Based on the above reasoning, correctly conclude that "$U$ and $X_1$ are not independent."

II. Zero-Credit Items

  • Merely copying the density formulas of $X_1, X_2$ or the definition of independence without performing any change of variables or computation.
  • In Part 2, merely asserting "not independent" without providing any justification or computation (guessing the answer).
  • Claiming "since $U$ and $V$ are independent, therefore $U$ and $X_1$ are independent/not independent" (logically irrelevant).

III. Deductions

  • Missing domain/support (-1 point): When presenting the final probability density functions (joint or marginal), completely failing to specify the range of the variables (e.g., $u \in (0,1)$ or $v > 0$). *At most 1 point deducted across the entire paper.*
  • Computational error leading to reversed conclusion (-1 point): For example, an error in the Jacobian leading to a divergent integral or a normalization constant not equal to 1, but with the overall method and procedure being correct.
  • Logical confusion (-1 point): In the independence determination, confusing random variables (uppercase) with specific values (lowercase), resulting in meaningless expressions.

Total score verification (max 7): _______ / 7
