MathIsimple
DM-06
Course 6

Algorithm Complexity

Learn to analyze algorithm efficiency using asymptotic notation. Understand Big-O, Big-Omega, and Big-Theta to compare and classify algorithms by their growth rates.

Learning Objectives
  • Understand the purpose of asymptotic analysis
  • Apply Big-O notation for upper bounds
  • Apply Big-Omega notation for lower bounds
  • Use Big-Theta for tight bounds
  • Analyze common algorithm patterns
  • Compare complexity classes

1. Introduction to Algorithm Analysis

Definition 6.1: Algorithm

An algorithm is a finite sequence of precise instructions for solving a problem or performing a computation.

Algorithm analysis studies the resources (time, space) an algorithm requires as a function of input size.

Remark

Why Asymptotic Analysis?

  • Exact running time depends on hardware, implementation details
  • We care about scalability: how does time grow with input size?
  • Asymptotic notation ignores constants and focuses on growth rate
  • Allows fair comparison between algorithms
Example: Comparing Growth Rates

Consider algorithms with running times:

  • Algorithm A: T_A(n) = 1000n
  • Algorithm B: T_B(n) = n²

For small n, A seems worse, but A overtakes B once n exceeds the crossover at n = 1000:

  • n = 100: A takes 100,000; B takes 10,000 (B wins)
  • n = 10,000: A takes 10,000,000; B takes 100,000,000 (A wins)

Algorithm A (linear) eventually beats B (quadratic) for large inputs.
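The crossover can be checked numerically. A quick sketch (the function names `t_a`/`t_b` are mine, standing in for the two running-time formulas above):

```python
def t_a(n):
    # Algorithm A: linear time with a large constant factor
    return 1000 * n

def t_b(n):
    # Algorithm B: quadratic time
    return n ** 2

# B is cheaper below the crossover at n = 1000; A is cheaper above it
print(t_a(100), t_b(100))        # 100000 10000 -> B wins at n = 100
print(t_a(10_000), t_b(10_000))  # 10000000 100000000 -> A wins at n = 10,000
```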

2. Big-O Notation (Upper Bound)

Definition 6.2: Big-O Notation

Let f, g: ℕ → ℝ⁺. We say f(n) = O(g(n)) if there exist positive constants c and n₀ such that:

f(n) ≤ c·g(n) for all n ≥ n₀

This means f(n) grows at most as fast as g(n) asymptotically.

Example: Proving Big-O

Show: 3n² + 5n + 7 = O(n²)

Proof of 3n² + 5n + 7 = O(n²)

We need to find c and n₀ such that 3n² + 5n + 7 ≤ c·n² for n ≥ n₀.

For n ≥ 1: 5n ≤ 5n² and 7 ≤ 7n²

So: 3n² + 5n + 7 ≤ 3n² + 5n² + 7n² = 15n²

Choose c = 15 and n₀ = 1. ✓
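The chosen witnesses can be sanity-checked over a finite range (a spot check, not a proof):

```python
def f(n):
    # the polynomial from the proof above
    return 3 * n**2 + 5 * n + 7

# Witnesses from the proof: c = 15, n0 = 1
c, n0 = 15, 1
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
print("f(n) <= 15 * n^2 holds for 1 <= n < 10000")
```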

Theorem 6.1: Big-O Rules
  • Constant factor: c·f(n) = O(f(n)) for any constant c > 0
  • Sum rule: O(f(n)) + O(g(n)) = O(max(f(n), g(n)))
  • Product rule: O(f(n)) · O(g(n)) = O(f(n) · g(n))
  • Transitivity: If f = O(g) and g = O(h), then f = O(h)

3. Big-Omega and Big-Theta

Definition 6.3: Big-Omega Notation (Lower Bound)

f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that:

f(n) ≥ c·g(n) for all n ≥ n₀

This means f(n) grows at least as fast as g(n) asymptotically.

Definition 6.4: Big-Theta Notation (Tight Bound)

f(n) = Θ(g(n)) if f(n) = O(g(n)) AND f(n) = Ω(g(n)).

Equivalently, there exist constants c₁, c₂, n₀ > 0 such that:

c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
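For the earlier polynomial 3n² + 5n + 7, both bounds can be spot-checked at once. The witnesses c₁ = 3 (drop the positive lower-order terms) and c₂ = 15 are my choice of constants, not the only valid ones:

```python
def f(n):
    # same polynomial as in the Big-O example
    return 3 * n**2 + 5 * n + 7

# Theta witnesses: 3n^2 <= f(n) <= 15n^2 for n >= 1
c1, c2, n0 = 3, 15, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("f(n) = Theta(n^2) witnesses check out for 1 <= n < 10000")
```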
Remark

Summary of Notations:

Notation | Meaning     | Analogy
O(g)     | Upper bound | ≤
Ω(g)     | Lower bound | ≥
Θ(g)     | Tight bound | =

4. Common Complexity Classes

Definition 6.5: Complexity Hierarchy

Common complexity classes in increasing order:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
Example: Complexity Classes Examples

O(1) - Constant:

Array access, hash table lookup (average)

O(log n) - Logarithmic:

Binary search, balanced tree operations

O(n) - Linear:

Linear search, single loop through array

O(n log n) - Linearithmic:

Merge sort, quicksort (average), heapsort

O(n²) - Quadratic:

Bubble sort, insertion sort, nested loops

O(2ⁿ) - Exponential:

Naive recursive Fibonacci, subset enumeration
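The gap between these classes is easy to see by tabulating rough step counts at a few input sizes (the `growth` dictionary below is just an illustration of each class's defining function):

```python
import math

# Rough step counts for each class at n = 8, 16, 32
growth = {
    "O(1)":       lambda n: 1,
    "O(log n)":   lambda n: math.log2(n),
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n ** 2,
    "O(2^n)":     lambda n: 2 ** n,
}

for name, f in growth.items():
    print(f"{name:>10}: {[round(f(n)) for n in (8, 16, 32)]}")
```

Doubling n barely moves the logarithmic row, quadruples the quadratic row, and squares the exponential row.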

5. Analyzing Algorithms

Algorithm 6.1: Linear Search Analysis

LinearSearch(A, x):
    for i = 1 to n:
        if A[i] == x: return i
    return -1

Analysis:

  • Best case: O(1) - found at first position
  • Worst case: O(n) - not found or at last position
  • Average case: O(n) - expected n/2 comparisons
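A direct Python translation of the pseudocode (0-based indexing instead of the pseudocode's 1-based):

```python
def linear_search(a, x):
    """Return the index of x in a, or -1 if absent (0-based)."""
    for i, v in enumerate(a):
        if v == x:
            return i   # best case: x at index 0, O(1)
    return -1          # worst case: all n elements checked, O(n)

print(linear_search([4, 2, 7, 9], 7))   # 2
print(linear_search([4, 2, 7, 9], 5))   # -1
```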
Algorithm 6.2: Binary Search Analysis

BinarySearch(A, x, low, high):
    if low > high: return -1
    mid = floor((low + high) / 2)
    if A[mid] == x: return mid
    if A[mid] > x:
        return BinarySearch(A, x, low, mid-1)
    else:
        return BinarySearch(A, x, mid+1, high)

Analysis: T(n) = T(n/2) + O(1) = O(log n)

Each step halves the search space.
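The pseudocode translates directly to Python (0-based, requires a sorted list; the default arguments are my convenience so callers can omit the bounds):

```python
def binary_search(a, x, low=0, high=None):
    """Recursive binary search on a sorted list; returns -1 if x is absent."""
    if high is None:
        high = len(a) - 1
    if low > high:
        return -1
    mid = (low + high) // 2                        # floor of the midpoint
    if a[mid] == x:
        return mid
    if a[mid] > x:
        return binary_search(a, x, low, mid - 1)   # search left half
    return binary_search(a, x, mid + 1, high)      # search right half

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1
```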

Note

Common Patterns:

  • Single loop over n elements → O(n)
  • Nested loops (both over n) → O(n²)
  • Divide in half each step → O(log n)
  • Divide and conquer with linear merge → O(n log n)
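Two of these patterns can be made concrete by counting iterations directly (a sketch; the helper names are mine):

```python
def halving_steps(n):
    # "divide in half each step" -> O(log n) iterations
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def nested_loop_steps(n):
    # two nested loops over n -> O(n^2) iterations
    return sum(1 for _ in range(n) for _ in range(n))

print(halving_steps(1024))     # 10, i.e. log2(1024)
print(nested_loop_steps(50))   # 2500, i.e. 50^2
```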

Practice Quiz

Algorithm Complexity Quiz (10 questions)

  1. (Easy) If f(n) = 3n² + 5n + 7, what is the Big-O complexity?
  2. (Easy) f(n) = O(g(n)) means:
  3. (Easy) Which complexity class is the fastest for large n?
  4. (Medium) A nested loop where both loops iterate n times has complexity:
  5. (Medium) Binary search has time complexity:
  6. (Medium) Θ(g(n)) means:
  7. (Medium) What is O(log n) + O(n)?
  8. (Hard) The Master Theorem is used for:
  9. (Hard) If T(n) = 2T(n/2) + n, what is T(n)?
  10. (Hard) Ω(n²) means the algorithm:

Frequently Asked Questions

Why do we ignore constants in Big-O?

Constants depend on implementation and hardware. What matters for scalability is the growth rate as input size increases.

A linear-time algorithm will always eventually beat a quadratic-time one, regardless of constants, because n² grows without bound relative to n.

When should I use Big-O vs Big-Theta?

Big-O: When you want an upper bound (worst-case guarantee).

Big-Theta: When you want to precisely characterize growth rate.

In practice, Big-O is more commonly used because we often care about worst-case behavior.

How do I analyze recursive algorithms?

Write a recurrence relation, then solve it:

  • Substitution method
  • Recursion tree method
  • Master theorem (for divide-and-conquer)

What's the difference between time and space complexity?

Time complexity: How running time grows with input size.

Space complexity: How memory usage grows with input size.

Often there's a trade-off: algorithms can use more space to save time, or vice versa.
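A classic instance of that trade-off is memoizing a recursive function: spend O(n) cache space to cut the naive O(2ⁿ) Fibonacci down to O(n) time. A minimal sketch using Python's standard `functools.lru_cache`:

```python
from functools import lru_cache

def fib_naive(n):
    # O(2^n) time: recomputes the same subproblems over and over
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # O(n) time, bought with O(n) cache space
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(30))   # 832040; fib_naive(30) gives the same answer far more slowly
```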

What complexity is considered 'good'?

It depends on context, but general guidelines:

  • O(log n), O(n): Excellent - scales well
  • O(n log n): Good - acceptable for most purposes
  • O(n²): Okay for small inputs
  • O(2ⁿ), O(n!): Only for very small inputs