Comprehensive collection of Bayesian statistical formulas including Bayes' theorem, conjugate priors, posterior distributions, credible intervals, and computational methods.
Posterior = (Likelihood × Prior) / Evidence
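In the usual notation (assumed here), with data y and parameter θ, this is

$$ p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)} $$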
Commonly used proportional form that omits the normalizing constant
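Dropping the evidence p(y), which does not depend on θ:

$$ p(\theta \mid y) \propto p(y \mid \theta)\, p(\theta) $$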
Evidence (marginal likelihood) obtained by integrating the likelihood times the prior over all parameter values
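For a continuous parameter this is the integral (a sum in the discrete case):

$$ p(y) = \int p(y \mid \theta)\, p(\theta)\, d\theta $$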
Beta prior is conjugate to Binomial likelihood
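Assuming y successes in n trials and a Beta(α, β) prior, the standard conjugate update is

$$ \theta \sim \mathrm{Beta}(\alpha, \beta), \quad y \mid \theta \sim \mathrm{Binomial}(n, \theta) \;\Longrightarrow\; \theta \mid y \sim \mathrm{Beta}(\alpha + y,\ \beta + n - y) $$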
Gamma prior is conjugate to Poisson likelihood
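Assuming counts y_1, …, y_n and a Gamma(α, β) prior in the rate parameterization (an assumption; the scale parameterization changes the second argument):

$$ \lambda \sim \mathrm{Gamma}(\alpha, \beta), \quad y_i \mid \lambda \sim \mathrm{Poisson}(\lambda) \;\Longrightarrow\; \lambda \mid y \sim \mathrm{Gamma}\!\Big(\alpha + \sum_{i=1}^{n} y_i,\ \beta + n\Big) $$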
Normal prior is conjugate for the mean of a Normal likelihood with known variance
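With a N(μ₀, τ₀²) prior on μ and observations y_i ~ N(μ, σ²), σ² known (notation assumed here), the posterior is Normal with a precision-weighted mean:

$$ \mu \mid y \sim \mathrm{N}\!\left( \frac{\mu_0/\tau_0^2 + n\bar{y}/\sigma^2}{1/\tau_0^2 + n/\sigma^2},\ \left(\frac{1}{\tau_0^2} + \frac{n}{\sigma^2}\right)^{-1} \right) $$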
Inverse Gamma prior is conjugate for the variance of a Normal likelihood with known mean
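With an Inverse-Gamma(α, β) prior on σ² and known mean μ, the conjugate update is

$$ \sigma^2 \mid y \sim \mathrm{Inv\text{-}Gamma}\!\left(\alpha + \frac{n}{2},\ \beta + \frac{1}{2}\sum_{i=1}^{n}(y_i - \mu)^2\right) $$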
The posterior mean minimizes expected squared loss
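Under squared-error loss the Bayes estimator is the posterior mean:

$$ \hat{\theta}_{\mathrm{mean}} = \mathrm{E}[\theta \mid y] = \int \theta\, p(\theta \mid y)\, d\theta $$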
The posterior median minimizes expected absolute loss
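Under absolute-error loss the Bayes estimator is the posterior median, the point that splits the posterior mass in half:

$$ \int_{-\infty}^{\hat{\theta}_{\mathrm{med}}} p(\theta \mid y)\, d\theta = \tfrac{1}{2} $$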
Maximum A Posteriori (MAP) estimator
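The MAP estimate is the posterior mode:

$$ \hat{\theta}_{\mathrm{MAP}} = \arg\max_{\theta}\ p(\theta \mid y) = \arg\max_{\theta}\ p(y \mid \theta)\, p(\theta) $$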
Equal-tailed credible interval with α/2 posterior probability in each tail
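Writing q_p for the posterior quantile function (notation assumed here), the 100(1 − α)% equal-tailed interval is

$$ \left[\, q_{\alpha/2},\ q_{1-\alpha/2} \,\right], \qquad \Pr(\theta < q_{\alpha/2} \mid y) = \Pr(\theta > q_{1-\alpha/2} \mid y) = \frac{\alpha}{2} $$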
Highest posterior density (HPD) interval, the shortest interval with a given probability content
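The HPD region collects the highest-density values until it holds 1 − α of the posterior probability; for a unimodal posterior this is the shortest such interval:

$$ C = \{\theta : p(\theta \mid y) \ge k_{\alpha}\}, \qquad k_{\alpha} \text{ chosen so that } \Pr(\theta \in C \mid y) = 1 - \alpha $$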
Large-sample normal approximation to the posterior
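A common form of this approximation (Laplace / Bernstein–von Mises type), with θ̂ the posterior mode and I the observed information:

$$ \theta \mid y \;\approx\; \mathrm{N}\!\left(\hat{\theta},\ I(\hat{\theta})^{-1}\right), \qquad I(\hat{\theta}) = -\left.\frac{\partial^2 \log p(\theta \mid y)}{\partial \theta^2}\right|_{\theta = \hat{\theta}} $$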
Posterior predictive distribution of future observations given past data
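The posterior predictive density for a new observation ỹ averages the sampling model over the posterior:

$$ p(\tilde{y} \mid y) = \int p(\tilde{y} \mid \theta)\, p(\theta \mid y)\, d\theta $$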
Prediction for k successes in m future Bernoulli trials
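Assuming the Beta–Binomial setting above with posterior Beta(α′, β′) for the success probability, the predictive for k successes in m future trials is Beta-Binomial (B is the Beta function):

$$ p(k \mid y) = \binom{m}{k} \frac{B(k + \alpha',\ m - k + \beta')}{B(\alpha',\ \beta')}, \qquad k = 0, 1, \ldots, m $$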
Prediction for Normal model with Normal prior
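Assuming the known-variance Normal model above with posterior μ | y ~ N(μ_n, τ_n²), the predictive adds the sampling and posterior variances:

$$ \tilde{y} \mid y \sim \mathrm{N}\!\left(\mu_n,\ \sigma^2 + \tau_n^2\right) $$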
Bayes factor, the ratio of marginal likelihoods comparing two models
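For models M₁ and M₂:

$$ \mathrm{BF}_{12} = \frac{p(y \mid M_1)}{p(y \mid M_2)} = \frac{\int p(y \mid \theta_1, M_1)\, p(\theta_1 \mid M_1)\, d\theta_1}{\int p(y \mid \theta_2, M_2)\, p(\theta_2 \mid M_2)\, d\theta_2} $$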
Bayesian model averaging weights
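The weights are the posterior model probabilities:

$$ p(M_k \mid y) = \frac{p(y \mid M_k)\, p(M_k)}{\sum_{j} p(y \mid M_j)\, p(M_j)} $$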
Bayesian model selection criterion
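The source does not say which criterion is meant; a common choice is the Bayesian information criterion (BIC), shown here as an assumption, with k parameters, n observations, and maximized likelihood L̂:

$$ \mathrm{BIC} = k \ln n - 2 \ln \hat{L} $$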
Metropolis–Hastings acceptance probability for MCMC
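For a Metropolis–Hastings move from the current state θ to a proposal θ′ drawn from q(θ′ | θ):

$$ \alpha(\theta, \theta') = \min\!\left\{1,\ \frac{p(\theta' \mid y)\, q(\theta \mid \theta')}{p(\theta \mid y)\, q(\theta' \mid \theta)}\right\} $$

As a minimal sketch of how this is used in practice, the random-walk sampler below targets the Beta–Binomial posterior from the conjugate example above; the data values, prior parameters, step size, and function names are illustrative assumptions, not part of the source.

```python
import numpy as np

# Illustrative example (assumed data y = 6 successes in n = 10 trials, Beta(1, 1) prior):
# target is the unnormalized posterior theta | y ~ Beta(alpha + y, beta + n - y).

def log_posterior(theta, y=6, n=10, alpha=1.0, beta=1.0):
    """Unnormalized log posterior for a Binomial likelihood with a Beta prior."""
    if not 0.0 < theta < 1.0:
        return -np.inf
    return (y + alpha - 1) * np.log(theta) + (n - y + beta - 1) * np.log(1 - theta)

def metropolis_hastings(n_iter=10_000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.5                                  # starting value
    samples = np.empty(n_iter)
    for i in range(n_iter):
        proposal = theta + step * rng.normal()   # symmetric random-walk proposal
        log_accept = log_posterior(proposal) - log_posterior(theta)
        if np.log(rng.uniform()) < log_accept:   # accept with probability min(1, ratio)
            theta = proposal
        samples[i] = theta
    return samples

samples = metropolis_hastings()
print("posterior mean ~", samples.mean())        # close to (6 + 1) / (10 + 2) = 0.583
```

With a symmetric random-walk proposal the q terms cancel, so only the unnormalized posterior ratio is needed.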
Gibbs sampling draws each parameter from its full conditional distribution
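Gibbs sampling cycles through the components; at iteration t, component j is drawn as

$$ \theta_j^{(t)} \sim p\!\left(\theta_j \,\middle|\, \theta_1^{(t)}, \ldots, \theta_{j-1}^{(t)},\ \theta_{j+1}^{(t-1)}, \ldots, \theta_d^{(t-1)},\ y\right) $$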
Approximate posterior via optimization
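Assuming this refers to variational inference, the posterior is approximated by a tractable q(θ) chosen to maximize the evidence lower bound (ELBO), which is equivalent to minimizing the KL divergence to the posterior:

$$ \mathrm{ELBO}(q) = \mathrm{E}_{q}\!\left[\log p(y, \theta)\right] - \mathrm{E}_{q}\!\left[\log q(\theta)\right] = \log p(y) - \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta \mid y)\right) $$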