Master the mathematical foundation of probability through random variables. Learn discrete and continuous distributions, joint distributions, independence, and sampling distributions essential for statistical inference.
Understanding the mathematical foundation and classification of random variables
A random variable maps each sample point ω in the sample space Ω to a real number, turning the outcomes of a random experiment into quantities that can be analyzed numerically. For example, for two coin tosses with Ω = {HH, HT, TH, TT}, the map X(ω) = number of heads gives X(HH) = 2, X(HT) = X(TH) = 1, X(TT) = 0
The cumulative distribution function F(x) = P(X ≤ x) is the fundamental tool for describing probability distributions
Monotonicity: F(x) never decreases as x moves right along the real line
Limits: F(x) → 0 as x → −∞ and F(x) → 1 as x → +∞
Right-continuity: F is continuous from the right at every point
Discrete case: F is a step function with jumps of size P(X = xᵢ) at each possible value xᵢ
Continuous case: F is continuous, with F'(x) = f(x) at every point where the density f is continuous (both cases are illustrated in the sketch below)
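As a minimal sketch of the two cases (assuming SciPy is available; the Binomial(3, 0.5) and standard normal choices are purely illustrative), the snippet below evaluates a discrete CDF, which changes only at the support points, and a continuous CDF, whose numerical derivative matches the density:

```python
from scipy import stats

# Discrete case: Binomial(3, 0.5) has a step-function CDF
X = stats.binom(n=3, p=0.5)
for x in [-0.5, 0, 0.5, 1, 2, 3, 3.5]:
    print(f"F({x}) = {X.cdf(x):.4f}")   # value jumps only at x = 0, 1, 2, 3

# Continuous case: the standard normal CDF is smooth and F'(x) = f(x)
Z = stats.norm()
x, h = 1.0, 1e-6
print((Z.cdf(x + h) - Z.cdf(x - h)) / (2 * h))  # numerical derivative ≈ 0.2420
print(Z.pdf(x))                                  # density at x = 1     ≈ 0.2420
```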
Essential discrete probability distributions and their applications
Bernoulli(p): p ∈ (0,1) (success probability); a single success/failure trial
Binomial(n, p): n ∈ ℕ (number of trials), p ∈ (0,1) (success probability); number of successes in n independent Bernoulli trials
Poisson(λ): λ > 0 (rate parameter); counts of rare events (defects, arrivals, accidents)
Geometric(p): p ∈ (0,1) (success probability); number of trials until the first success
Hypergeometric(N, M, n): N (population size), M (success states), n (sample size); sampling without replacement (defective items, card draws); see the code sketch below
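A minimal sketch evaluating one pmf value for each distribution listed above, using scipy.stats (an assumption; the parameter values are purely illustrative). Note that SciPy names the hypergeometric parameters differently from the (N, M, n) convention above:

```python
from scipy import stats

print(stats.bernoulli.pmf(1, p=0.3))           # Bernoulli:  P(X = 1) = 0.3
print(stats.binom.pmf(4, n=10, p=0.3))         # Binomial:   P(X = 4) ≈ 0.2001
print(stats.poisson.pmf(2, mu=1.5))            # Poisson:    P(X = 2) ≈ 0.2510
print(stats.geom.pmf(3, p=0.2))                # Geometric:  P(X = 3) = 0.8^2 * 0.2 = 0.128

# Hypergeometric: SciPy uses M = population size, n = success states, N = sample size.
# Example: a lot of 20 items with 5 defectives, draw 4 without replacement, P(exactly 1 defective)
print(stats.hypergeom.pmf(1, M=20, n=5, N=4))  # ≈ 0.4696
```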
Fundamental continuous probability distributions and their properties
Uniform(a, b): equal probability over an interval (random timing, rounding errors)
Normal(μ, σ²): natural phenomena (heights, measurement errors, test scores)
Exponential(λ): lifetime modeling, waiting times between events
Gamma(α, λ): sums of independent exponential variables, reliability modeling
Chi-squared(n): sum of squares of n independent standard normal variables; see the code sketch below
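A similar sketch for the continuous families, again using scipy.stats (an assumption; note that SciPy parameterizes the normal by its standard deviation and the exponential and gamma by scale = 1/λ):

```python
from scipy import stats

print(stats.uniform.pdf(0.3, loc=0, scale=1))   # Uniform(0, 1): density equals 1 on [0, 1]
print(stats.norm.cdf(108, loc=100, scale=4))    # Normal(100, σ = 4): P(X ≤ 108) ≈ 0.9772

lam = 2.0
print(stats.expon.sf(1.0, scale=1/lam))         # Exponential(λ = 2): P(X > 1) = e^(−2) ≈ 0.1353
print(stats.gamma.pdf(2.0, a=3, scale=1/lam))   # Gamma(α = 3, λ = 2): density at x = 2
print(stats.chi2.cdf(3.84, df=1))               # Chi-squared(1): P(X ≤ 3.84) ≈ 0.95
```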
Joint distributions, independence, and conditional distributions
Marginals (discrete): p_X(x) = Σ_y p(x, y), p_Y(y) = Σ_x p(x, y)
Marginals (continuous): f_X(x) = ∫ f(x, y) dy, f_Y(y) = ∫ f(x, y) dx; the discrete case is computed in the sketch below
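For the discrete case, a minimal NumPy sketch with a small hypothetical joint pmf table: the marginals are row and column sums, independence holds exactly when the joint pmf factors into the product of the marginals, and conditionals are the joint divided by the conditioning marginal:

```python
import numpy as np

# Hypothetical joint pmf p(x, y): rows index values of X, columns index values of Y
joint = np.array([[0.10, 0.20],
                  [0.15, 0.30],
                  [0.05, 0.20]])
assert np.isclose(joint.sum(), 1.0)

p_x = joint.sum(axis=1)    # p_X(x) = Σ_y p(x, y)
p_y = joint.sum(axis=0)    # p_Y(y) = Σ_x p(x, y)

# Independence: p(x, y) = p_X(x) · p_Y(y) for every (x, y)
print("independent:", np.allclose(joint, np.outer(p_x, p_y)))

# Conditional pmf of Y given X = x: p(y | x) = p(x, y) / p_X(x)
print(joint / p_x[:, None])
```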
Essential distributions for statistical inference
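As one concrete illustration (an assumption about which result this part emphasizes), the simulation below checks the basic sampling-distribution fact that for i.i.d. N(μ, σ²) observations the sample mean X̄ is distributed as N(μ, σ²/n):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 100.0, 4.0, 25, 10_000

# Draw 10,000 samples of size n = 25 and take the mean of each
means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

print(means.mean())  # ≈ μ = 100
print(means.std())   # ≈ σ/√n = 4/5 = 0.8
```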
Step-by-step solutions to typical random variable problems
In 10 independent trials, each with success probability 0.3, let X be the number of successes; find P(X = 4)
The binomial distribution applies because there is a fixed number of independent trials with a constant success probability, so X ~ B(10, 0.3)
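With X ~ B(10, 0.3), the standard computation gives P(X = 4) = C(10, 4) · 0.3⁴ · 0.7⁶ = 210 · 0.0081 · 0.117649 ≈ 0.2001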
If X ~ N(100, 16), find P(96 < X < 108)
Standardization allows us to use the standard normal table for any normal distribution
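Assuming the second parameter is the variance (σ² = 16, so σ = 4), standardize with Z = (X − 100)/4: P(96 < X < 108) = P(−1 < Z < 2) = Φ(2) − Φ(−1) ≈ 0.9772 − 0.1587 = 0.8185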
For X ~ Exp(λ), prove that P(X > s+t|X > s) = P(X > t)
The memoryless property means that the time already waited does not affect the distribution of the remaining waiting time
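One way to complete the proof, using the survival function P(X > x) = e^(−λx) for x ≥ 0 and the fact that {X > s + t} ⊆ {X > s}: P(X > s + t | X > s) = P(X > s + t) / P(X > s) = e^(−λ(s+t)) / e^(−λs) = e^(−λt) = P(X > t)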