Question
Let $X_1$ and $X_2$ be independent random variables, each following the exponential distribution $\mathrm{Exp}(1)$, and let $U = \dfrac{X_1}{X_1+X_2}$, $V = X_1+X_2$. Find the distributions of $U$ and $V$. Furthermore, determine whether $U$ and $X_1$ are independent.
Step-by-step solution
Given that $X_1, X_2$ are independent and identically distributed as the exponential distribution $\mathrm{Exp}(1)$, their probability density functions are $f(x) = e^{-x}$ for $x > 0$ (and $0$ otherwise). Since $X_1, X_2$ are mutually independent, their joint probability density function is $f(x_1, x_2) = f(x_1)f(x_2) = e^{-(x_1+x_2)}$, where $x_1 > 0,\ x_2 > 0$. The transformation is given by: $u = \dfrac{x_1}{x_1+x_2},\quad v = x_1+x_2.$ We need to find the inverse transformation, i.e., express $x_1, x_2$ in terms of $u, v$: From $u = \dfrac{x_1}{x_1+x_2}$ and $v = x_1+x_2$, we obtain $x_1 = uv$. Solving for $x_2$, we get $x_2 = v - uv = v(1-u)$. The inverse transformation is $x_1 = uv,\ x_2 = v(1-u)$.
Since $x_1 > 0$ and $x_2 > 0$, we have $0 < u < 1$. Moreover, $v = x_1 + x_2$; since $x_1 > 0$ and $x_2 > 0$, it follows that $v > 0$. Therefore, the support of the new variables is $\{(u, v) : 0 < u < 1,\ v > 0\}$. We compute the Jacobian determinant $J = \dfrac{\partial(x_1, x_2)}{\partial(u, v)}$:
$$J = \begin{vmatrix} \partial x_1/\partial u & \partial x_1/\partial v \\ \partial x_2/\partial u & \partial x_2/\partial v \end{vmatrix} = \begin{vmatrix} v & u \\ -v & 1-u \end{vmatrix} = v(1-u) + uv = v.$$
Since $v > 0$, the absolute value of the Jacobian is $|J| = v$.
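As a sanity check, the inverse map and the Jacobian can be verified numerically. This is an illustrative sketch (the helper names `inverse` and `jacobian_fd` are my own, not part of the solution), assuming the transformation $u = x_1/(x_1+x_2)$, $v = x_1+x_2$:

```python
# Numerical sanity check of the inverse transformation and |J| = v
# (illustrative; assumes u = x1/(x1+x2), v = x1+x2 as above).

def inverse(u, v):
    """Inverse map (u, v) -> (x1, x2) = (u*v, v*(1-u))."""
    return u * v, v * (1.0 - u)

def jacobian_fd(u, v, h=1e-6):
    """Central finite-difference determinant of d(x1, x2)/d(u, v)."""
    dx1_du = (inverse(u + h, v)[0] - inverse(u - h, v)[0]) / (2 * h)
    dx2_du = (inverse(u + h, v)[1] - inverse(u - h, v)[1]) / (2 * h)
    dx1_dv = (inverse(u, v + h)[0] - inverse(u, v - h)[0]) / (2 * h)
    dx2_dv = (inverse(u, v + h)[1] - inverse(u, v - h)[1]) / (2 * h)
    return dx1_du * dx2_dv - dx1_dv * dx2_du

u, v = 0.3, 2.5
x1, x2 = inverse(u, v)
assert abs(x1 / (x1 + x2) - u) < 1e-12   # forward map recovers u
assert abs((x1 + x2) - v) < 1e-12        # forward map recovers v
print(jacobian_fd(u, v))                 # close to v = 2.5, as derived
```

Because the inverse map is linear in each coordinate separately, the finite-difference determinant matches the analytic value $v$ up to floating-point error.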
By the change-of-variables formula for multivariate densities, $f_{U,V}(u, v) = f\big(x_1(u,v),\, x_2(u,v)\big)\,|J|$, we obtain the joint density of $(U, V)$: $f_{U,V}(u, v) = e^{-(uv + v(1-u))} \cdot v = v e^{-v},$ defined on $0 < u < 1,\ v > 0$.
For $U$, we integrate the joint density over $v$ on $(0, \infty)$: $f_U(u) = \int_0^{\infty} v e^{-v}\,dv.$ This integral equals the Gamma function value $\Gamma(2) = 1! = 1$. Hence $f_U(u) = 1$ for $0 < u < 1$. This shows that $U$ follows the uniform distribution on $(0, 1)$.
For $V$, we integrate the joint density over $u$ on $(0, 1)$: $f_V(v) = \int_0^1 v e^{-v}\,du = v e^{-v}$ for $v > 0$. This is the probability density function of the Gamma distribution with shape parameter $2$ and scale parameter $1$, i.e., $V \sim \Gamma(2, 1)$.
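The two marginals can be cross-checked by simulation. A minimal sketch, assuming $X_1, X_2 \sim \mathrm{Exp}(1)$ i.i.d. as in the problem (variable names are illustrative): the sample moments should approach the $\mathrm{U}(0,1)$ and $\Gamma(2,1)$ values $E[U] = 1/2$, $\mathrm{Var}(U) = 1/12$, $E[V] = 2$, $\mathrm{Var}(V) = 2$.

```python
# Monte Carlo cross-check of the marginals (illustrative sketch):
# U should behave like U(0,1) and V like Gamma(2,1).
import random

random.seed(0)
n = 200_000
us, vs = [], []
for _ in range(n):
    x1 = random.expovariate(1.0)   # Exp(1) samples, as in the problem
    x2 = random.expovariate(1.0)
    us.append(x1 / (x1 + x2))
    vs.append(x1 + x2)

mean_u = sum(us) / n                               # theory: 1/2
var_u = sum((u - mean_u) ** 2 for u in us) / n     # theory: 1/12
mean_v = sum(vs) / n                               # theory: 2
var_v = sum((v - mean_v) ** 2 for v in vs) / n     # theory: 2
print(mean_u, var_u, mean_v, var_v)
```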
A necessary condition for independence of two random variables is that the conditional expectation of one given the other is constant (independence itself is equivalent to the stronger statement that the conditional distribution does not depend on the conditioning variable). We compute the conditional expectation of $U$ given $X_1 = x_1$: since $X_2$ is a random variable, this expectation is computed with respect to the distribution of $X_2$:
$$E[U \mid X_1 = x_1] = E\!\left[\frac{x_1}{x_1 + X_2}\right] = \int_0^{\infty} \frac{x_1}{x_1 + x_2}\, e^{-x_2}\,dx_2.$$
The result of this integral is a function that depends on $x_1$, rather than a constant. For instance, as $x_1 \to 0^+$, the expectation tends to $0$; as $x_1 \to \infty$, the expectation tends to $1$. Since the conditional expectation depends on $x_1$, $U$ and $X_1$ are not independent.
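The dependence can also be illustrated by a quick Monte Carlo experiment (an informal sketch with bucket names of my own choosing, under the same $\mathrm{Exp}(1)$ assumptions as above): conditioning on small versus large values of $X_1$ visibly shifts the mean of $U$, while conditioning on $V$ leaves it near $1/2$, consistent with $U$ being independent of $V$ but not of $X_1$.

```python
# Monte Carlo illustration (informal sketch, not part of the original
# solution): the conditional mean of U shifts with X1, but not with V.
import random

random.seed(1)
n = 200_000
u_small_x1, u_large_x1 = [], []   # U-samples where X1 is small / large
u_small_v, u_large_v = [], []     # U-samples where V is small / large
for _ in range(n):
    x1 = random.expovariate(1.0)
    x2 = random.expovariate(1.0)
    u = x1 / (x1 + x2)
    v = x1 + x2
    if x1 < 0.2:
        u_small_x1.append(u)
    elif x1 > 2.0:
        u_large_x1.append(u)
    if v < 0.5:
        u_small_v.append(u)
    elif v > 4.0:
        u_large_v.append(u)

def mean(xs):
    return sum(xs) / len(xs)

# Conditioning on X1 pulls the mean of U away from 1/2 ...
print(mean(u_small_x1), mean(u_large_x1))
# ... while conditioning on V leaves it near 1/2 (U and V independent).
print(mean(u_small_v), mean(u_large_v))
```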
Final answer
The distribution of $U$ is the uniform distribution on $(0, 1)$, i.e., $U \sim \mathrm{U}(0, 1)$, with probability density function $f_U(u) = 1$ for $0 < u < 1$. The distribution of $V$ is the Gamma distribution with parameters $(2, 1)$, i.e., $V \sim \Gamma(2, 1)$, with probability density function $f_V(v) = v e^{-v}$ for $v > 0$. $U$ and $X_1$ are not independent.
Marking scheme
The following is the detailed marking scheme for this probability theory problem (maximum score: 7 points).
I. Checkpoints (max 7 pts)
Part 1: Distributions of $U$ and $V$ (5 points total)
- Change of variables and Jacobian determinant [1 point]
- Write the correct inverse transformation $x_1 = uv,\ x_2 = v(1-u)$ AND compute the absolute value of the Jacobian $|J| = v$.
- *If only the formula is stated without substitution or the result is incorrect, award 0 points.*
- Joint probability density function of $(U, V)$ [2 points]
- [1 point] Obtain the correct expression $f_{U,V}(u, v) = v e^{-v}$ (or equivalently $f(x_1, x_2)\,|J| = e^{-(uv + v(1-u))} \cdot v$).
- [1 point] Explicitly state the correct support: $0 < u < 1$ and $v > 0$.
- Marginal distributions of $U$ and $V$ [2 points]
- [1 point] Derive the marginal density $f_U(u) = 1,\ 0 < u < 1$ (or identify it as the uniform distribution $\mathrm{U}(0, 1)$), with the range stated.
- [1 point] Derive the marginal density $f_V(v) = v e^{-v},\ v > 0$ (or identify it as the $\Gamma(2, 1)$ distribution), with the range stated.
- *Note: If the student directly deduces the marginal distributions from the separable form of the joint density via the Factorization Theorem, award full marks.*
Part 2: Independence of $U$ and $X_1$ (2 points total)
- Reasoning and derivation [1 point]
- Establish a valid basis for the determination via any of the following approaches:
- Approach A (Conditional expectation/probability): Set up the expression for $E[U \mid X_1 = x_1]$ or $P(U \le u \mid X_1 = x_1)$.
- Approach B (Joint distribution): Derive the joint distribution of $(U, X_1)$ and compare it with the product of the marginal distributions.
- Approach C (Qualitative analysis): Use limiting arguments (e.g., as $x_1 \to 0$ vs. $x_1 \to \infty$, the behavior of $E[U \mid X_1 = x_1]$ differs) to show that the conditional distribution depends on $x_1$.
- *Merely asserting that "the formula contains $x_1$" as an intuitive argument without mathematical justification earns 0 points for this item.*
- Conclusion [1 point]
- Based on the above reasoning, correctly conclude that "$U$ and $X_1$ are not independent."
II. Zero-Credit Items
- Merely copying the density formulas of $X_1, X_2$ or the definition of independence without performing any change of variables or computation.
- In Part 2, merely asserting "not independent" without providing any justification or computation (guessing the answer).
- Claiming "since $X_1$ and $X_2$ are independent, therefore $U$ and $X_1$ are independent/not independent" (logically irrelevant).
III. Deductions
- Missing domain/support (-1 point): When presenting the final probability density functions (joint or marginal), completely failing to specify the range of the variables (e.g., $0 < u < 1$ or $v > 0$). *At most 1 point deducted across the entire paper.*
- Computational error leading to reversed conclusion (-1 point): For example, an error in the Jacobian leading to a divergent integral or a normalization constant not equal to 1, but with the overall method and procedure being correct.
- Logical confusion (-1 point): In the independence determination, confusing random variables (uppercase) with specific values (lowercase), resulting in meaningless expressions.
Total score verification (Total max 7): \_\_\_\_\_\_\_ / 7