Probability Theory Prerequisites Formulas

Complete mathematical reference for probability theory foundations including axioms, conditional probability, Bayes' theorem, and independence

Essential Foundations · Mathematical Precision · Practical Applications
📏 Probability Axioms & Basic Properties

Fundamental axioms and basic properties that define probability measures

Kolmogorov's Axioms
\begin{align}
&\text{Axiom 1: } P(A) \geq 0 \text{ for all events } A \\
&\text{Axiom 2: } P(S) = 1 \\
&\text{Axiom 3: } P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i) \text{ for disjoint } A_i
\end{align}

Explanation

These three axioms completely define probability as a mathematical concept. All other probability properties derive from these axioms.
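A minimal Python sketch of the axioms on a finite sample space; the fair six-sided die and the event choices are illustrative and not part of the reference above.

```python
# Hypothetical finite example: a fair six-sided die.
# Events are subsets of the sample space S; P sums outcome weights.
S = {1, 2, 3, 4, 5, 6}
p = {outcome: 1 / 6 for outcome in S}         # non-negative weights

def prob(event):
    """P(event) = sum of outcome probabilities (finite additivity)."""
    return sum(p[w] for w in event)

assert all(w >= 0 for w in p.values())        # Axiom 1: non-negativity
assert abs(prob(S) - 1.0) < 1e-12             # Axiom 2: P(S) = 1

# Axiom 3, checked here only for two disjoint events:
A, B = {1, 2}, {5}
assert abs(prob(A | B) - (prob(A) + prob(B))) < 1e-12
```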

Empty Set Property
P(\emptyset) = 0

Explanation

The probability of the impossible event (empty set) is zero. This follows from countable additivity (Axiom 3) applied to the sequence ∅, ∅, …, whose union is again ∅, which forces P(∅) = 0.

Complement Rule
P(A^c) = 1 - P(A)

Explanation

The probability of the complement of event A equals one minus the probability of A.

Monotonicity Property
\text{If } A \subseteq B, \text{ then } P(A) \leq P(B)

Explanation

If event A is contained in event B, then A cannot have higher probability than B.

Probability Bounds
0 \leq P(A) \leq 1 \text{ for all events } A

Explanation

All probabilities lie between 0 and 1, inclusive.

Addition & Subtraction Rules

Formulas for computing probabilities of unions and differences of events

General Addition Rule
P(A \cup B) = P(A) + P(B) - P(A \cap B)

Explanation

Probability of union equals sum of individual probabilities minus their intersection to avoid double counting.

Addition Rule (Disjoint Events)
\text{If } A \cap B = \emptyset, \text{ then } P(A \cup B) = P(A) + P(B)

Explanation

For mutually exclusive events, the union probability is simply the sum of individual probabilities.

Three Events Addition Rule
\begin{align}
P(A \cup B \cup C) &= P(A) + P(B) + P(C) \\
&\quad - P(A \cap B) - P(A \cap C) - P(B \cap C) \\
&\quad + P(A \cap B \cap C)
\end{align}

Explanation

Inclusion-exclusion principle for three events. This generalizes to n events.
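A short sketch verifying the three-event rule numerically; the die events A, B, C below are made up for illustration.

```python
from fractions import Fraction

# Fair die: every event's probability is just |event| / 6.
S = set(range(1, 7))
def prob(event): return Fraction(len(event), len(S))

A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 4, 5}

lhs = prob(A | B | C)
rhs = (prob(A) + prob(B) + prob(C)
       - prob(A & B) - prob(A & C) - prob(B & C)
       + prob(A & B & C))
assert lhs == rhs        # both sides equal 5/6
```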

Difference Rule
P(A - B) = P(A \cap B^c) = P(A) - P(A \cap B)

Explanation

Probability of A but not B equals probability of A minus probability of both A and B.

Symmetric Difference
P(A \triangle B) = P(A \cup B) - P(A \cap B) = P(A) + P(B) - 2P(A \cap B)

Explanation

Probability of exactly one of A or B occurring (but not both).

🔀 Conditional Probability

Formulas involving conditional probability and related concepts

Conditional Probability Definition
P(B|A) = \frac{P(A \cap B)}{P(A)}, \quad P(A) > 0

Explanation

Conditional probability of B given A is the ratio of their intersection to the probability of A.

Multiplication Rule
P(A \cap B) = P(A) \times P(B|A) = P(B) \times P(A|B)

Explanation

The probability that both events occur equals the probability of one event times the conditional probability of the other given the first.

Chain Rule (Multiple Events)
\begin{align}
&P(A_1 \cap A_2 \cap \cdots \cap A_n) \\
&= P(A_1) \times P(A_2|A_1) \times P(A_3|A_1 \cap A_2) \times \cdots \times P(A_n|A_1 \cap \cdots \cap A_{n-1})
\end{align}

Explanation

Generalization of multiplication rule to multiple events using sequential conditioning.
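A quick worked instance of the chain rule, assuming a standard 52-card deck and three sequential draws without replacement (an illustrative scenario, not from the page above).

```python
from fractions import Fraction

# P(three aces in a row) = P(A1) * P(A2|A1) * P(A3|A1 ∩ A2)
p_three_aces = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p_three_aces)      # 1/5525
```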

Conditional Complement
P(B^c|A) = 1 - P(B|A)

Explanation

For a fixed conditioning event A, P(· | A) is itself a probability measure, so the complement rule carries over unchanged.

Conditional Addition Rule
P(B \cup C|A) = P(B|A) + P(C|A) - P(B \cap C|A)

Explanation

Addition rule applies to conditional probabilities with the same conditioning event.

🔄 Law of Total Probability

Formulas for computing probabilities using partitions of the sample space

Law of Total Probability
\text{If } \{B_1, B_2, \ldots, B_n\} \text{ partitions } S, \text{ then } P(A) = \sum_{i=1}^{n} P(A|B_i) P(B_i)

Explanation

Any probability can be computed by conditioning on a partition of the sample space.

Two-Event Partition
P(A) = P(A|B)P(B) + P(A|B^c)P(B^c)

Explanation

Simplest form using event B and its complement as the partition.
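A numeric sketch of the two-event partition form; the calibration scenario and its probabilities are invented for illustration.

```python
# B = "machine is calibrated", A = "part passes inspection".
P_B = 0.90                  # P(B): prior probability the machine is calibrated
P_A_given_B = 0.99          # P(A | B)
P_A_given_Bc = 0.70         # P(A | B^c)

P_A = P_A_given_B * P_B + P_A_given_Bc * (1 - P_B)
print(P_A)                  # 0.99*0.90 + 0.70*0.10 = 0.961
```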

Continuous Version
P(A) = \int P(A|B = b) \, f_B(b) \, db

Explanation

When conditioning on a continuous random variable B with density f_B.

Conditional Total Probability
\text{If } \{B_1, \ldots, B_n\} \text{ partitions } S, \text{ then } P(A|C) = \sum_{i=1}^{n} P(A|B_i \cap C) P(B_i|C)

Explanation

The law of total probability can also be applied within a conditional probability framework, conditioning every term on C.

🎯 Bayes' Theorem

Formulas for updating probabilities based on new evidence

Bayes' Theorem (Basic Form)
P(B|A) = \frac{P(A|B) P(B)}{P(A)}, \quad P(A) > 0

Explanation

Updates prior probability P(B) to posterior probability P(B|A) using likelihood P(A|B).

Bayes' Theorem (Expanded Form)
P(B_j|A) = \frac{P(A|B_j) P(B_j)}{\sum_{i=1}^{n} P(A|B_i) P(B_i)}

Explanation

Full form when events {B₁, B₂, ..., Bₙ} form a partition of the sample space.
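A small sketch of the expanded form; the priors and likelihoods below are illustrative placeholders.

```python
priors      = [0.5, 0.3, 0.2]        # P(B_i), must sum to 1
likelihoods = [0.10, 0.40, 0.70]     # P(A | B_i)

# Denominator is the law of total probability: P(A)
evidence = sum(l * p for l, p in zip(likelihoods, priors))
posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]

print(posteriors)    # ≈ [0.161, 0.387, 0.452]; posteriors always sum to 1
```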

Odds Form of Bayes' Theorem
\frac{P(B|A)}{P(B^c|A)} = \frac{P(A|B)}{P(A|B^c)} \times \frac{P(B)}{P(B^c)}

Explanation

Bayes' theorem expressed in odds form: posterior odds = likelihood ratio × prior odds.

Sequential Bayes Updates
P(B|A_1 \cap A_2) = \frac{P(A_2|B \cap A_1) P(B|A_1)}{P(A_2|A_1)}

Explanation

Updating probability sequentially as new evidence becomes available.
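A sketch of sequential updating. It assumes the two pieces of evidence are conditionally independent given B and given B^c, which is an extra simplifying assumption not required by the general formula above; all numbers are illustrative.

```python
def bayes_update(prior, p_evidence_given_B, p_evidence_given_Bc):
    """Return P(B | evidence) from P(B) and the two likelihoods."""
    evidence = p_evidence_given_B * prior + p_evidence_given_Bc * (1 - prior)
    return p_evidence_given_B * prior / evidence

prior = 0.01
posterior_1 = bayes_update(prior, 0.95, 0.10)        # after evidence A1
posterior_2 = bayes_update(posterior_1, 0.95, 0.10)  # A1's posterior is A2's prior
print(posterior_1, posterior_2)                      # ≈ 0.088, ≈ 0.477
```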

Bayes Factor
\text{BF}_{12} = \frac{P(A|B_1)}{P(A|B_2)}

Explanation

Ratio of likelihoods comparing two hypotheses B₁ and B₂ given evidence A.

⚖️ Independence

Formulas defining and working with independent events

Independence Definition
\text{Events } A \text{ and } B \text{ are independent if } P(A \cap B) = P(A) P(B)

Explanation

Independence means the occurrence of one event doesn't affect the probability of the other.

Conditional Probability Criterion
A \text{ and } B \text{ independent} \Leftrightarrow P(B|A) = P(B) \text{ (when } P(A) > 0\text{)}

Explanation

Alternative characterization: A and B are independent if conditioning on A doesn't change P(B).
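A sketch checking both characterizations on fair-die events; the specific events are illustrative.

```python
from fractions import Fraction

# Fair die: A = "roll is even", B = "roll is at most 2".
S = set(range(1, 7))
def prob(event): return Fraction(len(event), len(S))

A = {2, 4, 6}
B = {1, 2}

print(prob(A & B) == prob(A) * prob(B))    # True: 1/6 == 1/2 * 1/3
print(prob(A & B) / prob(A) == prob(B))    # True: P(B|A) = P(B)
```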

Independence of Complements
\text{If } A \text{ and } B \text{ are independent, then } A \text{ and } B^c, \; A^c \text{ and } B, \; A^c \text{ and } B^c \text{ are all independent}

Explanation

Independence is preserved under complementation.

Mutual Independence (n events)
\text{Events } A_1, \ldots, A_n \text{ are mutually independent if } P\left(\bigcap_{i \in I} A_i\right) = \prod_{i \in I} P(A_i) \text{ for every subset } I \subseteq \{1, 2, \ldots, n\}

Explanation

Requires independence of all possible intersections, not just pairwise independence.

Pairwise Independence
\text{Events } A_1, \ldots, A_n \text{ are pairwise independent if } P(A_i \cap A_j) = P(A_i) P(A_j) \text{ for all } i \neq j

Explanation

Weaker than mutual independence: only the pairwise intersections are required to factor, not the higher-order ones.
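The classic two-coin example shows pairwise independence without mutual independence; a sketch, assuming two fair, independent flips.

```python
from fractions import Fraction

S = {('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')}
def prob(event): return Fraction(len(event), len(S))

A = {s for s in S if s[0] == 'H'}     # first flip heads
B = {s for s in S if s[1] == 'H'}     # second flip heads
C = {s for s in S if s[0] == s[1]}    # both flips match

# Pairwise independence holds: every pair factors.
assert prob(A & B) == prob(A) * prob(B)
assert prob(A & C) == prob(A) * prob(C)
assert prob(B & C) == prob(B) * prob(C)

# Mutual independence fails: the triple intersection does not factor.
assert prob(A & B & C) != prob(A) * prob(B) * prob(C)   # 1/4 vs 1/8
```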

Independence and Union
\text{If } A \text{ and } B \text{ are independent, then } P(A \cup B) = P(A) + P(B) - P(A)P(B) = 1 - P(A^c)P(B^c)

Explanation

Formula for union probability when events are independent.

🔧 Real-World Applications

See how probability theory prerequisites are applied in various fields

Medical Diagnosis
Using Bayes' theorem to calculate probability of disease given test results
P(\text{Disease}|\text{Positive}) = \frac{P(\text{Positive}|\text{Disease}) \times P(\text{Disease})}{P(\text{Positive})}

Example: Accounting for disease prevalence, test sensitivity, and specificity
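A sketch with illustrative numbers (1% prevalence, 99% sensitivity, 95% specificity; none of these values come from the page above).

```python
prevalence  = 0.01       # P(Disease)
sensitivity = 0.99       # P(Positive | Disease)
specificity = 0.95       # P(Negative | No Disease)

# Law of total probability gives the denominator P(Positive).
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(p_disease_given_positive)   # ≈ 0.167, despite the 99% sensitivity
```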

Quality Control
Calculating probability of defective items from multiple production lines
P(\text{Defective}) = \sum_{i} P(\text{Defective}|\text{Line}_i) \times P(\text{Line}_i)

Example: Using law of total probability across production facilities

Communication Systems
Error probability in digital transmission with multiple paths
P(\text{Error}) = 1 - \prod_{i} (1 - P(\text{Error}_i))

Example: When transmission paths fail independently

Risk Assessment
Combining independent risk factors in financial or engineering contexts
P(\text{System Failure}) = 1 - \prod_{i} (1 - P(\text{Component Failure}_i))

Example: When component failures are independent events
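A sketch of the independent-failure formula with made-up component probabilities (uses math.prod, available in Python 3.8+).

```python
import math

component_failure_probs = [0.01, 0.02, 0.005]   # illustrative values

# P(at least one component fails) = 1 - P(all components survive)
p_system_failure = 1 - math.prod(1 - p for p in component_failure_probs)
print(p_system_failure)    # ≈ 0.0347
```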

📋 Quick Reference Summary

Essential formulas you'll use most frequently in probability theory

Basic Rules
P(A^c) = 1 - P(A)
P(A \cup B) = P(A) + P(B) - P(A \cap B)
P(A - B) = P(A) - P(A \cap B)
Conditional Probability
P(B|A) = \frac{P(A \cap B)}{P(A)}
P(A \cap B) = P(A) \times P(B|A)
P(A) = \sum_i P(A|B_i) P(B_i)
Bayes & Independence
P(B|A) = \frac{P(A|B) P(B)}{P(A)}
P(A \cap B) = P(A) P(B) \quad \text{(independence)}
P(B|A) = P(B) \quad \text{(independence)}

🚀 Master These Formulas

These probability theory prerequisites are essential for understanding stochastic processes. Practice applying them to build strong foundations.
