MathIsimple
Formula Reference

Random Variables & Distributions Formulas

Comprehensive collection of essential formulas for random variables, probability distributions, expectation, variance, and sampling distributions

7 Categories · 52 Formulas
Random Variable Fundamentals
Basic definitions and properties of random variables
4 formulas

Random Variable Definition

\xi: \Omega \to \mathbb{R}, \quad \xi^{-1}(B) = \{\omega : \xi(\omega) \in B\} \in \mathscr{F}

Maps sample space Ω to real numbers, measurable with respect to σ-field ℱ

Distribution Function (CDF)

F(x) = P(\xi \leq x), \quad x \in \mathbb{R}

Cumulative distribution function gives probability that random variable ≤ x

CDF Properties

\text{Monotonic: } F(a) \leq F(b) \text{ if } a \leq b \\ \text{Limits: } F(-\infty) = 0, \quad F(+\infty) = 1 \\ \text{Right continuous: } F(x^+) = F(x)

Essential properties that any valid CDF must satisfy

Probability Intervals

P(a < \xi \leq b) = F(b) - F(a)

Calculate probability of interval using CDF difference
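
To see the interval formula in action, here is a minimal sketch assuming Python with scipy; the standard normal CDF stands in for a generic F, and the endpoints a and b are illustrative.

```python
# Sketch: P(a < xi <= b) = F(b) - F(a), using the standard normal CDF
# as a concrete example (a, b are hypothetical endpoints).
from scipy.stats import norm

a, b = -1.0, 2.0
prob = norm.cdf(b) - norm.cdf(a)              # F(b) - F(a)
print(f"P({a} < xi <= {b}) = {prob:.4f}")     # ~0.8186
```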

Discrete Random Variables
Probability mass functions and discrete distributions
7 formulas

Probability Mass Function

P(\xi = x_i) = p(x_i), \quad \sum_{i} p(x_i) = 1

PMF gives probability at each discrete value, must sum to 1

Discrete CDF

F(x) = \sum_{x_i \leq x} p(x_i)

CDF is sum of PMF values up to x (step function)

Bernoulli Distribution

\xi \sim \text{Ber}(p): \quad P(\xi = k) = p^k(1-p)^{1-k}, \quad k \in \{0,1\}

Single trial success/failure with success probability p

Binomial Distribution

\xi \sim B(n,p): \quad P(\xi = k) = \binom{n}{k} p^k (1-p)^{n-k}

Number of successes in n independent Bernoulli trials

Poisson Distribution

\xi \sim P(\lambda): \quad P(\xi = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0,1,2,\ldots

Models the count of rare events occurring at rate λ

Geometric Distribution

\xi \sim \text{Geo}(p): \quad P(\xi = k) = (1-p)^{k-1}p, \quad k = 1,2,3,\ldots

Number of trials until first success, memoryless property

Hypergeometric Distribution

\xi \sim H(n,M,N): \quad P(\xi = k) = \frac{\binom{M}{k}\binom{N-M}{n-k}}{\binom{N}{n}}

Sampling without replacement: k successes in n draws from N items with M successes
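
The four named PMFs above can be cross-checked numerically; the sketch below assumes scipy, with illustrative parameters. Note that scipy's hypergeom orders its arguments as (population size, success count, draws), which differs from the H(n, M, N) notation used here.

```python
# Sketch: evaluate each PMF and confirm it sums to 1 over its support
# (parameters n, p, lam, N, M, draws are illustrative).
import numpy as np
from scipy.stats import binom, poisson, geom, hypergeom

n, p, lam = 10, 0.3, 2.0
N, M, draws = 50, 10, 5                       # population, successes, sample size

print(binom.pmf(3, n, p))                     # P(xi = 3) for B(n, p)
print(poisson.pmf(3, lam))                    # P(xi = 3) for P(lam)
print(geom.pmf(3, p))                         # (1-p)^2 * p
print(hypergeom.pmf(2, N, M, draws))          # scipy order: (k, population, successes, draws)

print(binom.pmf(np.arange(n + 1), n, p).sum())   # 1.0 (finite support)
print(poisson.pmf(np.arange(200), lam).sum())    # ~1.0 (truncated infinite support)
```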

Continuous Random Variables
Probability density functions and continuous distributions
9 formulas

Probability Density Function

p(x) \geq 0, \quad \int_{-\infty}^{\infty} p(x) dx = 1

PDF is non-negative and integrates to 1; P(ξ = c) = 0 at any single point c for continuous variables

Continuous CDF

F(x) = \int_{-\infty}^x p(t) dt, \quad p(x) = F'(x) \text{ (where } F \text{ is differentiable)}

CDF is integral of PDF, PDF is derivative of CDF
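
The PDF/CDF duality is easy to verify numerically; a sketch assuming scipy, using the standard normal and an arbitrary point x:

```python
# Sketch: F'(x) = p(x) via a central difference, and F(x) as the integral
# of the PDF from -infinity to x (standard normal; x = 0.7 is arbitrary).
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

x, h = 0.7, 1e-6
deriv = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)
print(np.isclose(deriv, norm.pdf(x)))         # True: derivative of CDF is PDF

integral, _ = quad(norm.pdf, -np.inf, x)
print(np.isclose(integral, norm.cdf(x)))      # True: integral of PDF is CDF
```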

Probability Intervals

P(a < \xi < b) = P(a \leq \xi \leq b) = \int_a^b p(x) dx = F(b) - F(a)

Probability of interval equals area under PDF or CDF difference

Uniform Distribution

\xi \sim U(a,b): \quad p(x) = \begin{cases} \frac{1}{b-a} & a < x < b \\ 0 & \text{otherwise} \end{cases}

Constant probability density over the interval (a, b)

Normal Distribution

\xi \sim N(\mu, \sigma^2): \quad p(x) = \frac{1}{\sqrt{2\pi}\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}

Bell-shaped distribution with mean μ and variance σ²

Standard Normal

Z \sim N(0,1): \quad p(z) = \frac{1}{\sqrt{2\pi}} e^{-\frac{z^2}{2}}, \quad \Phi(z) = \int_{-\infty}^z p(t) dt

Standardized normal distribution, Φ(z) is standard normal CDF

Normal Standardization

\text{If } \xi \sim N(\mu, \sigma^2), \text{ then } Z = \frac{\xi - \mu}{\sigma} \sim N(0,1)

Transform any normal distribution to standard normal
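
Standardization means any normal probability can be computed from Φ alone; a sketch assuming scipy, with illustrative μ, σ, a, b:

```python
# Sketch: P(a < xi < b) for xi ~ N(mu, sigma^2), computed directly and
# via the standardized variable Z = (xi - mu) / sigma.
from scipy.stats import norm

mu, sigma, a, b = 5.0, 2.0, 3.0, 8.0
direct = norm.cdf(b, loc=mu, scale=sigma) - norm.cdf(a, loc=mu, scale=sigma)
standardized = norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma)
print(direct, standardized)                   # identical values
```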

Exponential Distribution

\xi \sim \text{Exp}(\lambda): \quad p(x) = \begin{cases} \lambda e^{-\lambda x} & x > 0 \\ 0 & x \leq 0 \end{cases}

Models waiting times and lifetimes, memoryless property
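
The memoryless property P(ξ > s + t | ξ > s) = P(ξ > t) can be checked directly; a sketch assuming scipy (which parametrizes the exponential by scale = 1/λ), with illustrative λ, s, t:

```python
# Sketch: memorylessness of the exponential distribution.
from scipy.stats import expon

lam, s, t = 0.5, 2.0, 3.0
dist = expon(scale=1 / lam)                   # scipy uses scale = 1/lambda
lhs = dist.sf(s + t) / dist.sf(s)             # sf(x) = P(xi > x)
print(lhs, dist.sf(t))                        # both equal exp(-lambda * t)
```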

Gamma Distribution

\xi \sim \Gamma(\alpha, \beta): \quad p(x) = \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}, \quad x > 0

Generalizes the exponential distribution; α is the shape parameter and β is the rate parameter
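
In particular, Γ(1, β) coincides with Exp(β); a one-line check assuming scipy (again, scipy takes scale = 1/β rather than the rate):

```python
# Sketch: Gamma(alpha=1, beta) has the same PDF as Exp(beta).
import numpy as np
from scipy.stats import gamma, expon

beta = 0.5
x = np.linspace(0.1, 10.0, 50)
print(np.allclose(gamma.pdf(x, a=1, scale=1 / beta),
                  expon.pdf(x, scale=1 / beta)))      # True
```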

Expectation and Moments
Expected values, variance, and moment calculations
8 formulas

Discrete Expectation

E[\xi] = \sum_{i} x_i p(x_i)

Expected value for discrete random variable

Continuous Expectation

E[\xi] = \int_{-\infty}^{\infty} x p(x) dx

Expected value for continuous random variable

Expectation of Function

E[g(\xi)] = \begin{cases} \sum_i g(x_i) p(x_i) & \text{discrete} \\ \int_{-\infty}^{\infty} g(x) p(x) dx & \text{continuous} \end{cases}

Expected value of function of random variable
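
This rule (the "law of the unconscious statistician") means no distribution for g(ξ) itself is needed; a minimal discrete sketch with a hypothetical three-point PMF and g(x) = x²:

```python
# Sketch: E[g(xi)] = sum of g(x_i) p(x_i) for a hypothetical discrete PMF.
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
ps = np.array([0.2, 0.5, 0.3])                # a valid PMF: sums to 1
print((xs ** 2 * ps).sum())                   # E[xi^2] = 0.5 + 4*0.3 = 1.7
```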

Variance Definition

\text{Var}(\xi) = E[(\xi - E[\xi])^2] = E[\xi^2] - (E[\xi])^2

Variance measures spread around the mean

Standard Deviation

\sigma(\xi) = \sqrt{\text{Var}(\xi)}

Standard deviation is square root of variance

Linearity of Expectation

E[a\xi + b\eta + c] = aE[\xi] + bE[\eta] + c

Expectation is linear (holds regardless of independence)

Variance Properties

\text{Var}(a\xi + b) = a^2 \text{Var}(\xi) \\ \text{Var}(\xi + \eta) = \text{Var}(\xi) + \text{Var}(\eta) + 2\text{Cov}(\xi, \eta)

Variance of a linear transformation and of a sum; the covariance term vanishes when ξ and η are independent
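
A Monte Carlo sketch of the sum rule, assuming numpy, with a hypothetical correlated normal pair (Var(ξ) = 1, Var(η) = 2, Cov = 0.6):

```python
# Sketch: Var(xi + eta) = Var(xi) + Var(eta) + 2 Cov(xi, eta).
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.6], [0.6, 2.0]])      # hypothetical covariance matrix
xi, eta = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T
lhs = np.var(xi + eta)
rhs = np.var(xi) + np.var(eta) + 2 * np.cov(xi, eta)[0, 1]
print(lhs, rhs)                               # both ~ 1 + 2 + 2*0.6 = 4.2
```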

k-th Moment

\mu_k = E[\xi^k] = \begin{cases} \sum_i x_i^k p(x_i) & \text{discrete} \\ \int_{-\infty}^{\infty} x^k p(x) dx & \text{continuous} \end{cases}

k-th moment about origin

Joint Distributions
Multidimensional random variables and independence
8 formulas

Joint PMF (Discrete)

P(\xi = x_i, \eta = y_j) = p_{ij}, \quad \sum_i \sum_j p_{ij} = 1

Joint probability mass function for discrete random vector

Joint PDF (Continuous)

p(x,y) \geq 0, \quad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p(x,y) dx dy = 1

Joint probability density function for continuous random vector

Marginal Distributions

\text{Discrete: } P(\xi = x_i) = \sum_j p_{ij} \\ \text{Continuous: } p_\xi(x) = \int_{-\infty}^{\infty} p(x,y) dy

Marginal distributions obtained by summing/integrating joint distribution

Independence Condition

\text{Discrete: } p_{ij} = p_i p_j \text{ for all } i,j \\ \text{Continuous: } p(x,y) = p_\xi(x) p_\eta(y) \text{ a.e.}

Random variables are independent if joint equals product of marginals
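
For a finite table the independence condition is a direct cell-by-cell check; a sketch with a hypothetical 2×3 joint PMF that happens to factor:

```python
# Sketch: check p_ij = p_i * p_j by comparing the joint table with the
# outer product of its marginals.
import numpy as np

joint = np.array([[0.10, 0.20, 0.10],
                  [0.15, 0.30, 0.15]])        # rows: xi values, cols: eta values
p_xi = joint.sum(axis=1)                      # marginal of xi
p_eta = joint.sum(axis=0)                     # marginal of eta
print(np.allclose(joint, np.outer(p_xi, p_eta)))   # True: independent here
```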

Conditional PMF

P(\eta = y_j | \xi = x_i) = \frac{p_{ij}}{p_i} \quad (p_i > 0)

Conditional probability mass function

Conditional PDF

p_{\eta|\xi}(y|x) = \frac{p(x,y)}{p_\xi(x)} \quad (p_\xi(x) > 0)

Conditional probability density function

Covariance

\text{Cov}(\xi, \eta) = E[(\xi - E[\xi])(\eta - E[\eta])] = E[\xi \eta] - E[\xi]E[\eta]

Measures linear dependence between random variables

Correlation Coefficient

\rho(\xi, \eta) = \frac{\text{Cov}(\xi, \eta)}{\sqrt{\text{Var}(\xi)\text{Var}(\eta)}}, \quad -1 \leq \rho \leq 1

Standardized measure of linear dependence
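
Both quantities are estimated from data in one line each; a simulation sketch assuming numpy, where η is constructed so that ρ(ξ, η) = 0.8:

```python
# Sketch: sample covariance and correlation of a correlated pair.
import numpy as np

rng = np.random.default_rng(1)
xi = rng.standard_normal(100_000)
eta = 0.8 * xi + 0.6 * rng.standard_normal(100_000)   # Var(eta) = 1, corr = 0.8
print(np.cov(xi, eta)[0, 1])                  # ~0.8
print(np.corrcoef(xi, eta)[0, 1])             # ~0.8, always within [-1, 1]
```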

Sampling Distributions
Chi-squared, t-distribution, and F-distribution
8 formulas

Chi-squared Distribution

\text{If } X_1, \ldots, X_n \sim N(0,1) \text{ independent, then } Y = \sum_{i=1}^n X_i^2 \sim \chi^2(n)

Sum of squares of independent standard normal variables

Chi-squared PDF

\chi^2(n): \quad p(x) = \frac{(1/2)^{n/2}}{\Gamma(n/2)} x^{n/2-1} e^{-x/2}, \quad x > 0

PDF of chi-squared distribution with n degrees of freedom

Chi-squared Properties

E[\chi^2(n)] = n, \quad \text{Var}[\chi^2(n)] = 2n \\ \chi^2(n_1) + \chi^2(n_2) = \chi^2(n_1 + n_2) \text{ (if independent)}

Mean, variance, and additivity property of chi-squared
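
A simulation sketch of the definition and the moment formulas, assuming numpy and scipy (n = 5 is illustrative):

```python
# Sketch: sums of n squared standard normals match chi^2(n) in mean and variance.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 5
y = (rng.standard_normal((100_000, n)) ** 2).sum(axis=1)
print(y.mean(), chi2.mean(n))                 # ~5 and 5 (mean = n)
print(y.var(), chi2.var(n))                   # ~10 and 10 (variance = 2n)
```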

t-Distribution Definition

\text{If } X \sim N(0,1),\ Y \sim \chi^2(n) \text{ independent, then } T = \frac{X}{\sqrt{Y/n}} \sim t(n)

Ratio of standard normal to square root of scaled chi-squared

t-Distribution PDF

t(n): \quad p(t) = \frac{\Gamma((n+1)/2)}{\sqrt{n\pi}\Gamma(n/2)} \left(1 + \frac{t^2}{n}\right)^{-(n+1)/2}

PDF of t-distribution, approaches standard normal as n → ∞
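
The convergence to the standard normal is visible numerically; a sketch assuming scipy:

```python
# Sketch: the max gap between the t(n) PDF and the standard normal PDF
# shrinks as the degrees of freedom grow.
import numpy as np
from scipy.stats import t, norm

x = np.linspace(-4.0, 4.0, 201)
print(np.abs(t.pdf(x, df=5) - norm.pdf(x)).max())     # noticeable gap
print(np.abs(t.pdf(x, df=500) - norm.pdf(x)).max())   # nearly zero
```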

F-Distribution Definition

\text{If } X \sim \chi^2(m),\ Y \sim \chi^2(n) \text{ independent, then } F = \frac{X/m}{Y/n} \sim F(m,n)

Ratio of two independent chi-squared variables, each divided by its degrees of freedom

F-Distribution PDF

F(m,n): \quad p(f) = \frac{\Gamma((m+n)/2)}{\Gamma(m/2)\Gamma(n/2)} \frac{m^{m/2} n^{n/2}}{(mf+n)^{(m+n)/2}} f^{m/2-1}

PDF of F-distribution with m and n degrees of freedom

Distribution Relationships

\text{If } T \sim t(n), \text{ then } T^2 \sim F(1,n) \\ \text{If } F \sim F(m,n), \text{ then } 1/F \sim F(n,m)

Key relationships between t and F distributions
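
The first relationship follows from P(T² ≤ c) = P(-√c ≤ T ≤ √c); a CDF-level check assuming scipy (n and c are illustrative):

```python
# Sketch: T ~ t(n) implies T^2 ~ F(1, n), checked through the CDFs.
import numpy as np
from scipy.stats import t, f

n, c = 7, 2.5
lhs = f.cdf(c, 1, n)                          # P(T^2 <= c) under F(1, n)
rhs = t.cdf(np.sqrt(c), n) - t.cdf(-np.sqrt(c), n)
print(np.isclose(lhs, rhs))                   # True
```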

Common Distribution Parameters
Mean and variance formulas for standard distributions
8 formulas

Bernoulli Parameters

\xi \sim \text{Ber}(p): \quad E[\xi] = p, \quad \text{Var}(\xi) = p(1-p)

Mean and variance for Bernoulli distribution

Binomial Parameters

\xi \sim B(n,p): \quad E[\xi] = np, \quad \text{Var}(\xi) = np(1-p)

Mean and variance for binomial distribution

Poisson Parameters

\xi \sim P(\lambda): \quad E[\xi] = \lambda, \quad \text{Var}(\xi) = \lambda

Mean equals variance for Poisson distribution

Geometric Parameters

\xi \sim \text{Geo}(p): \quad E[\xi] = \frac{1}{p}, \quad \text{Var}(\xi) = \frac{1-p}{p^2}

Mean and variance for geometric distribution

Uniform Parameters

\xi \sim U(a,b): \quad E[\xi] = \frac{a+b}{2}, \quad \text{Var}(\xi) = \frac{(b-a)^2}{12}

Mean and variance for uniform distribution

Normal Parameters

\xi \sim N(\mu, \sigma^2): \quad E[\xi] = \mu, \quad \text{Var}(\xi) = \sigma^2

Distribution parameters directly give mean and variance

Exponential Parameters

\xi \sim \text{Exp}(\lambda): \quad E[\xi] = \frac{1}{\lambda}, \quad \text{Var}(\xi) = \frac{1}{\lambda^2}

Mean and variance for exponential distribution

Gamma Parameters

\xi \sim \Gamma(\alpha, \beta): \quad E[\xi] = \frac{\alpha}{\beta}, \quad \text{Var}(\xi) = \frac{\alpha}{\beta^2}

Mean and variance for gamma distribution
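
All eight parameter pairs can be cross-checked against scipy's stats() in a few lines; a sketch with illustrative parameters (note scipy parametrizes the exponential and gamma by scale = 1/rate):

```python
# Sketch: scipy.stats returns (mean, variance) matching the formulas above.
from scipy.stats import (bernoulli, binom, poisson, geom, uniform, norm,
                         expon, gamma)

n, p, lam = 10, 0.3, 2.0
a, b, mu, sig, al, be = 1.0, 5.0, 0.0, 2.0, 3.0, 0.5
print(bernoulli.stats(p))                     # (p, p(1-p))
print(binom.stats(n, p))                      # (np, np(1-p)) = (3.0, 2.1)
print(poisson.stats(lam))                     # (lam, lam)
print(geom.stats(p))                          # (1/p, (1-p)/p^2)
print(uniform.stats(loc=a, scale=b - a))      # ((a+b)/2, (b-a)^2/12)
print(norm.stats(loc=mu, scale=sig))          # (mu, sigma^2)
print(expon.stats(scale=1 / lam))             # (1/lam, 1/lam^2)
print(gamma.stats(al, scale=1 / be))          # (al/be, al/be^2)
```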

📐 Quick Reference Guide

Essential tips for working with random variables and probability distributions

Distribution Identification

  • Discrete: Countable outcomes → Use PMF
  • Continuous: Interval values → Use PDF
  • Normal: Bell-shaped, symmetric data
  • Exponential: Waiting times, memoryless
  • Binomial: Fixed trials, success/failure
  • Poisson: Rare events, fixed rate

Key Relationships

  • Independence: Joint = Product of marginals
  • Standardization: (X-μ)/σ ~ N(0,1)
  • Variance: Var(X) = E[X²] - (E[X])²
  • Linear transform: E[aX+b] = aE[X]+b
  • Chi-squared: Sum of squared normals
  • t-distribution: Small sample inference
