Master exact probability computation methods that require no approximation. Learn variable elimination, belief propagation, and when exact inference is feasible in acyclic graphs.
In probabilistic graphical models, inference refers to computing probabilities of interest from the joint distribution. The two main types are:
Marginal inference: compute the probability of a variable set $X_A$ by summing over all possible values of the remaining variables $X_B$: $P(X_A) = \sum_{X_B} P(X_A, X_B)$.

Conditional inference: compute the posterior $P(X_A \mid E = e)$ of $X_A$ given evidence $E = e$ using Bayes' rule: $P(X_A \mid E = e) = P(X_A, E = e) / P(E = e)$.
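Both operations can be carried out directly on a small joint probability table. The sketch below uses a hypothetical two-variable joint (the numbers are made up for illustration):

```python
import numpy as np

# Hypothetical joint distribution over two binary variables;
# rows index A, columns index B.
joint = np.array([[0.3, 0.1],
                  [0.2, 0.4]])   # P(A, B), entries sum to 1

# Marginal inference: P(A) = sum over B of P(A, B)
p_a = joint.sum(axis=1)
print(p_a)                       # [0.4 0.6]

# Conditional inference via Bayes' rule: P(A | B=1) = P(A, B=1) / P(B=1)
p_a_given_b1 = joint[:, 1] / joint[:, 1].sum()
print(p_a_given_b1)              # [0.2 0.8]
```

Summing an axis of the table is exactly the marginalization sum, and the conditional is just a renormalized slice of the joint.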
Variable elimination is a general exact inference algorithm that removes variables one at a time by summing them out (for discrete variables) or integrating them out (for continuous variables).
Given the chain-structured joint distribution $P(A, B, C, D) = P(A)\,P(B \mid A)\,P(C \mid B)\,P(D \mid C)$, the marginal of $D$ is computed as:

Step 1: Eliminate $A$: $\tau_1(B) = \sum_A P(A)\,P(B \mid A)$

Step 2: Eliminate $B$: $\tau_2(C) = \sum_B \tau_1(B)\,P(C \mid B)$

Step 3: Eliminate $C$: $P(D) = \sum_C \tau_2(C)\,P(D \mid C)$

Result: $P(D)$ (the marginal probability of $D$).
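The three elimination steps can be sketched with matrix operations on a hypothetical binary chain (the probability tables below are invented for illustration; each elimination is a matrix-vector product):

```python
import numpy as np

# Hypothetical chain A -> B -> C -> D with binary variables.
p_a   = np.array([0.6, 0.4])                  # P(A)
p_b_a = np.array([[0.7, 0.3], [0.2, 0.8]])    # P(B|A), rows indexed by A
p_c_b = np.array([[0.9, 0.1], [0.4, 0.6]])    # P(C|B)
p_d_c = np.array([[0.5, 0.5], [0.1, 0.9]])    # P(D|C)

# Step 1: eliminate A -> tau1(B) = sum_A P(A) P(B|A)
tau1 = p_a @ p_b_a
# Step 2: eliminate B -> tau2(C) = sum_B tau1(B) P(C|B)
tau2 = tau1 @ p_c_b
# Step 3: eliminate C -> P(D) = sum_C tau2(C) P(D|C)
p_d = tau2 @ p_d_c

# Sanity check against brute-force summation over the full joint
joint = np.einsum('a,ab,bc,cd->abcd', p_a, p_b_a, p_c_b, p_d_c)
print(np.allclose(p_d, joint.sum(axis=(0, 1, 2))))  # True
```

Note that each intermediate factor ($\tau_1$, $\tau_2$) involves only one variable, which is why elimination along a chain stays cheap.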
Time complexity depends on the elimination order: a good order minimizes the size of the largest intermediate factor, and finding the optimal order is NP-hard in general. For tree structures, variable elimination is efficient; for graphs with cycles, its cost can grow exponentially.
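The effect of elimination order can be made concrete on a hypothetical star graph: a center variable $C$ connected to four leaves $X_1, \ldots, X_4$, each with $k$ states. Eliminating leaves first keeps every intermediate factor at size $k$; eliminating the center first couples all four leaves into one factor of size $k^4$:

```python
import numpy as np

# Hypothetical star graph: center C connected to leaves X1..X4,
# each variable with k = 10 states; factors f_i(C, X_i).
k = 10
rng = np.random.default_rng(0)
factors = [rng.random((k, k)) for _ in range(4)]

# Good order: eliminate each leaf first; every intermediate factor
# involves only C, so it has just k entries.
tau = np.ones(k)
for f in factors:
    tau *= f.sum(axis=1)        # sum out leaf X_i
print(tau.shape)                # (10,)

# Bad order: eliminate the center C first; the resulting factor
# couples all four leaves and has k**4 = 10,000 entries.
big = np.einsum('ca,cb,cd,ce->abde', *factors)
print(big.size)                 # 10000
```

The same joint, two orders, a 1000-fold difference in intermediate factor size; this gap is what makes elimination order the dominant cost factor.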
Belief Propagation (BP), also called the sum-product algorithm, is an efficient exact inference method for tree-structured graphs (acyclic graphs). It works by passing messages between nodes.
Message from node $i$ to node $j$:

$m_{i \to j}(x_j) = \sum_{x_i} \psi_i(x_i)\,\psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i)$

Where $\psi_i$ is the unary (node) potential, $\psi_{ij}$ is the pairwise (edge) potential, and $N(i) \setminus \{j\}$ denotes the neighbors of $i$ other than $j$.

After message passing, the marginal probability of node $i$ is:

$P(x_i) \propto \psi_i(x_i) \prod_{k \in N(i)} m_{k \to i}(x_i)$
Normalize to ensure probabilities sum to 1.
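A minimal sum-product sketch, assuming a hypothetical 3-node chain $X_1 - X_2 - X_3$ with invented potentials, shows the message update and the marginal computation; the result is verified against brute-force enumeration of the joint:

```python
import numpy as np

# Hypothetical chain X1 - X2 - X3 with binary states;
# psi are unary potentials, phi are pairwise potentials.
psi = {1: np.array([1.0, 2.0]),
       2: np.array([1.0, 1.0]),
       3: np.array([3.0, 1.0])}
phi = {(1, 2): np.array([[2.0, 1.0], [1.0, 2.0]]),
       (2, 3): np.array([[1.0, 3.0], [3.0, 1.0]])}

def message(i, j, incoming):
    # m_{i->j}(x_j) = sum_{x_i} psi_i(x_i) phi_ij(x_i, x_j) * (incoming msgs)
    f = phi[(i, j)] if (i, j) in phi else phi[(j, i)].T
    return (psi[i] * incoming) @ f

# Pass messages inward to node 2, then back out to the leaves
m12 = message(1, 2, np.ones(2))   # X1 is a leaf: no incoming messages
m32 = message(3, 2, np.ones(2))
m21 = message(2, 1, m32)
m23 = message(2, 3, m12)

def marginal(i, incoming_msgs):
    # P(x_i) proportional to psi_i(x_i) times all incoming messages; normalize
    b = psi[i] * np.prod(incoming_msgs, axis=0)
    return b / b.sum()

# Verify against brute-force enumeration of the joint distribution
joint = np.einsum('a,b,c,ab,bc->abc', psi[1], psi[2], psi[3],
                  phi[(1, 2)], phi[(2, 3)])
joint /= joint.sum()
print(np.allclose(marginal(2, [m12, m32]), joint.sum(axis=(0, 2))))  # True
```

Two sweeps (leaves-to-root, then root-to-leaves) suffice to give every node all of its incoming messages, so all marginals come out of a single pair of passes.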
Exact inference is most efficient for tree-structured graphs (no cycles): belief propagation computes all marginals in two message-passing sweeps, in time linear in the number of nodes.

For graphs with cycles, exact inference becomes intractable in general: the junction tree algorithm remains exact, but its cost grows exponentially with the graph's treewidth.

For small graphs (few variables, small state spaces), exact inference is always feasible: the full joint distribution can simply be enumerated and summed directly.
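As an illustration of the small-graph case, the sketch below enumerates the full joint of a hypothetical 10-variable binary model whose graph is a single cycle (the potentials are random placeholders). The loops rule out plain belief propagation, but with only $2^{10} = 1024$ joint entries, direct enumeration remains trivial:

```python
import itertools
import numpy as np

# Hypothetical cyclic model: 10 binary variables, one pairwise
# potential per edge of the cycle (random placeholder values).
n = 10
rng = np.random.default_rng(0)
pots = [rng.random((2, 2)) + 0.1 for _ in range(n)]

probs = np.zeros(2 ** n)
for idx, x in enumerate(itertools.product([0, 1], repeat=n)):
    p = 1.0
    for i in range(n):
        p *= pots[i][x[i], x[(i + 1) % n]]   # edge (X_i, X_{i+1 mod n})
    probs[idx] = p
probs /= probs.sum()                          # normalize the joint
print(probs.shape)                            # (1024,)

# Any marginal follows by summing the right entries, e.g. P(X_0):
p_x0 = probs.reshape(2, -1).sum(axis=1)
print(p_x0)
```

The same enumeration becomes hopeless around 30 to 40 binary variables, which is exactly where the structure-exploiting algorithms above earn their keep.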