Understanding the mathematical foundation and classification of random variables
A random variable transforms random outcomes into numerical values for mathematical analysis
The fundamental tool for describing probability distributions
CDF F(x) = P(X ≤ x) is non-decreasing
Right-continuous: F(t) → F(x) as t → x from the right
Limits at infinity: F(x) → 0 as x → −∞ and F(x) → 1 as x → +∞
Discrete case: F(x) is the sum of probabilities P(X = k) over all k ≤ x
Continuous case: F(x) is the integral of the PDF up to x; the PDF is the derivative of the CDF
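As a minimal sketch, these CDF properties can be checked numerically for an assumed example distribution (Exp(λ) with λ = 2, our choice, where the CDF has the closed form 1 − e^(−λx)):

```python
import math

# Illustrative example: exponential distribution with rate lam = 2 (our choice).
lam = 2.0

def pdf(x):
    # f(x) = lam * exp(-lam * x) for x >= 0
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def cdf(x):
    # F(x) = 1 - exp(-lam * x): the integral of the PDF from -inf to x
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

# Non-decreasing, and F approaches 1 at infinity
assert cdf(0.5) <= cdf(1.0) <= cdf(5.0)
assert abs(cdf(50.0) - 1.0) < 1e-12

# PDF is the derivative of the CDF (central-difference approximation)
x, h = 0.7, 1e-6
numeric_derivative = (cdf(x + h) - cdf(x - h)) / (2 * h)
print(abs(numeric_derivative - pdf(x)) < 1e-5)  # True
```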
Essential discrete probability distributions and their applications
p ∈ (0,1) (success probability)
Single trial success/failure experiment
n ∈ ℕ (trials), p ∈ (0,1) (success probability)
Number of successes in n independent Bernoulli trials
λ > 0 (rate parameter)
Rare events occurrence (defects, arrivals, accidents)
p ∈ (0,1) (success probability)
Number of trials until first success
n (sample size), M (number of success states in the population), N (population size)
Sampling without replacement (defective items, card draws)
Fundamental continuous probability distributions and their properties
Equal probability over an interval (random timing, rounding errors)
Natural phenomena (heights, measurement errors, test scores)
Lifetime modeling, waiting times between events
Sum of independent exponential variables, reliability modeling
Sum of squares of independent standard normal variables
Understanding joint distributions and independence of multiple random variables
Methods for determining the distribution of functions of random variables
Essential distributions for statistical inference
Step-by-step solutions to typical random variable problems
In 10 independent trials with success probability 0.3, find P(X = 4)
Binomial distribution applies to fixed number of independent trials with constant success probability
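The worked example above reduces to plugging n = 10, p = 0.3, k = 4 into the binomial PMF:

```python
import math

# P(X = 4) for X ~ Binomial(n = 10, p = 0.3)
n, p, k = 10, 0.3, 4

# P(X = 4) = C(10, 4) * 0.3^4 * 0.7^6
prob = math.comb(n, k) * p**k * (1 - p)**(n - k)
print(round(prob, 4))  # 0.2001
```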
If X ~ N(100, 16) with variance 16 (so σ = 4), find P(96 < X < 108)
Standardization allows us to use the standard normal table for any normal distribution
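A sketch of the standardization step, using the error function in place of a normal table and reading the 16 in N(100, 16) as the variance (so σ = 4):

```python
import math

def phi(z):
    # Standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

mu, sigma = 100.0, 4.0  # N(100, 16) read as variance 16, an assumed convention
lo, hi = 96.0, 108.0

# Standardize: P(96 < X < 108) = Phi((108-100)/4) - Phi((96-100)/4) = Phi(2) - Phi(-1)
prob = phi((hi - mu) / sigma) - phi((lo - mu) / sigma)
print(round(prob, 4))  # 0.8186
```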
For X ~ Exp(λ), prove that P(X > s+t|X > s) = P(X > t)
Memoryless property means past waiting time doesn't affect future waiting time
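The memoryless identity follows from the exponential survival function P(X > x) = e^(−λx): the conditional survival ratio e^(−λ(s+t)) / e^(−λs) collapses to e^(−λt). A numerical check with illustrative values:

```python
import math

# X ~ Exp(lam); s and t are arbitrary illustrative values
lam, s, t = 1.5, 2.0, 0.7

def survival(x):
    # P(X > x) = exp(-lam * x)
    return math.exp(-lam * x)

# P(X > s + t | X > s) = P(X > s + t) / P(X > s)
conditional = survival(s + t) / survival(s)
print(abs(conditional - survival(t)) < 1e-12)  # True
```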
PMF is used for discrete random variables and gives exact point probabilities P(X = x). PDF is used for continuous random variables and gives density, so probabilities come from integration over intervals and P(X = x) = 0.
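The contrast can be made concrete: a PMF returns an exact point probability, while for a continuous variable only intervals carry probability. A sketch with a Bernoulli point mass versus a standard normal interval (values are our choices):

```python
import math

# Discrete: P(X = 1) for a Bernoulli(0.3) variable is an exact point probability
p = 0.3
pmf_at_1 = p

# Continuous (standard normal): P(X = 1) = 0 exactly, but P(0.9 < X < 1.1) > 0
def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

interval_prob = normal_cdf(1.1) - normal_cdf(0.9)
print(pmf_at_1, round(interval_prob, 4))
```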
Match the context to the defining mechanism: fixed independent trials suggest Binomial, time until first success suggests Geometric, rare counts suggest Poisson, waiting times suggest Exponential, and measurement-type phenomena often suggest Normal.
Random variables X and Y are independent if knowing one gives no information about the other. In formulas, the joint distribution factors into the product of marginals: P(X = x, Y = y) = P(X = x)P(Y = y) in the discrete case, or f(x, y) = fX(x)fY(y) for densities in the continuous case.
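The factorization test can be run mechanically on a discrete joint table. A sketch with a toy 2×2 joint PMF (the table values are our own, constructed to factor):

```python
# Toy joint PMF of (X, Y); checking P(X=x, Y=y) = P(X=x) * P(Y=y) for every cell
joint = {
    (0, 0): 0.12, (0, 1): 0.28,
    (1, 0): 0.18, (1, 1): 0.42,
}

# Marginals: sum the joint over the other variable
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# Independent iff every cell equals the product of its marginals
independent = all(
    abs(joint[(x, y)] - px[x] * py[y]) < 1e-12 for x in (0, 1) for y in (0, 1)
)
print(independent)  # True
```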
These sampling distributions describe how common statistics behave under repeated sampling. They support variance inference, small-sample mean inference, ANOVA, and many classical confidence intervals and hypothesis tests.
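The "behavior under repeated sampling" can be illustrated by Monte Carlo: summing squares of k independent standard normals produces a chi-square(k) statistic, whose sample mean should sit near k (its theoretical mean). A sketch with arbitrary k and sample count:

```python
import random

# Simulate the chi-square(k) statistic: sum of squares of k standard normals.
# Its theoretical mean is k and variance 2k.
random.seed(42)
k, n_samples = 5, 50_000

samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n_samples)]
sample_mean = sum(samples) / n_samples
print(abs(sample_mean - k) < 0.1)  # True
```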
For discrete variables, sum probabilities across all original values that map to the transformed value. For continuous monotone transforms, use the inverse function and derivative. For non-monotone or multivariate transforms, use CDF arguments or a Jacobian.
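Both the discrete and the monotone-continuous cases can be sketched in a few lines. The discrete part pools PMF mass over preimages for Y = X²; the continuous part applies the inverse-plus-derivative rule to Y = e^X with X standard normal, which yields the log-normal density (the example choices are ours):

```python
import math
from collections import defaultdict

# Discrete transform: Y = X^2 with X taking values -1, 0, 1.
# The PMF of Y sums the PMF of X over every x mapping to the same y.
pmf_x = {-1: 0.2, 0: 0.5, 1: 0.3}
pmf_y = defaultdict(float)
for x, p in pmf_x.items():
    pmf_y[x * x] += p  # x = -1 and x = 1 both contribute to y = 1
print(dict(pmf_y))  # {1: 0.5, 0: 0.5}

# Continuous monotone transform: Y = e^X with X ~ N(0, 1) gives the
# log-normal density f_Y(y) = f_X(ln y) * |d(ln y)/dy| = f_X(ln y) / y
def normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def lognormal_pdf(y):
    return normal_pdf(math.log(y)) / y if y > 0 else 0.0

# Sanity check: the transformed density still integrates to ~1 (midpoint rule)
h = 0.001
total = sum(lognormal_pdf((i + 0.5) * h) * h for i in range(100_000))
print(round(total, 3))  # 1.0
```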