
Recurrence Relations and the Master Theorem

How to solve recurrence relations using the Master Theorem, recursion tree method, and substitution method, with a detailed walkthrough of T(n) = 2T(n/2) + n.

2025-10-16
Tags: Theory of Computation, recurrences, master-theorem

Terminology

| Term | Definition |
| --- | --- |
| Recurrence relation | An equation that defines $T(n)$ in terms of $T$ on smaller inputs, e.g. $T(n) = 2T(n/2) + n$ |
| Base case | The value of $T(n)$ for the smallest input, typically $T(1) = O(1)$, which stops the recursion |
| Master Theorem | A formula that directly solves recurrences of the form $T(n) = aT(n/b) + f(n)$ by comparing $f(n)$ to $n^{\log_b a}$ |
| Recursion tree | A visual method that expands the recurrence into a tree, summing the work at each level to find the total |
| Substitution method | Guessing the form of the solution and proving it correct by mathematical induction |
| Branching factor ($a$) | The number of recursive subproblems spawned by each call |
| Shrinkage factor ($b$) | The factor by which the problem size decreases at each recursive call |
| Critical exponent | The value $\log_b a$, which captures how fast the number of subproblems grows across levels |
| Combine cost ($f(n)$) | The non-recursive work done at each level: splitting the problem and merging results |

What & Why

Recursive algorithms break a problem into smaller subproblems, solve each one, and combine the results. To analyze their running time, you write a recurrence relation that captures this structure. For example, merge sort splits an array in half, recursively sorts both halves, and merges them: $T(n) = 2T(n/2) + O(n)$.

The challenge is converting that recursive definition into a closed-form expression like $\Theta(n \log n)$. Three methods handle this: the Master Theorem (a direct formula for a common pattern), the recursion tree method (visual expansion), and the substitution method (guess and prove by induction). Each has strengths depending on the recurrence's shape.
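
To make the mapping from code to recurrence concrete, here is a minimal merge sort sketch in Python, with comments tying each part to a term of $T(n) = 2T(n/2) + O(n)$:

```python
def merge_sort(a):
    if len(a) <= 1:                # base case: T(1) = O(1)
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])     # first recursive call:  T(n/2)
    right = merge_sort(a[mid:])    # second recursive call: T(n/2)
    return merge(left, right)      # combine cost f(n) = O(n)

def merge(left, right):
    # Linear-time merge of two sorted lists.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 7, 3]))  # → [1, 2, 3, 5, 7, 9]
```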

How It Works

The Master Theorem

For recurrences of the form $T(n) = aT(n/b) + f(n)$, where $a \ge 1$ and $b > 1$:

Compute the critical exponent $c^* = \log_b a$. Then compare $f(n)$ to $n^{c^*}$:

Case 1: If $f(n) = O(n^{c^* - \epsilon})$ for some $\epsilon > 0$, then $T(n) = \Theta(n^{c^*})$. The leaf work dominates.

Case 2: If $f(n) = \Theta(n^{c^*} \log^k n)$ for some $k \ge 0$, then $T(n) = \Theta(n^{c^*} \log^{k+1} n)$. Work is evenly spread across levels.

Case 3: If $f(n) = \Omega(n^{c^* + \epsilon})$ for some $\epsilon > 0$, and the regularity condition $af(n/b) \le cf(n)$ holds for some constant $c < 1$, then $T(n) = \Theta(f(n))$. The root work dominates.

Master Theorem: Three Cases

  • Case 1 (leaf-heavy): $f(n)$ grows polynomially slower than $n^{c^*}$, so $T(n) = \Theta(n^{c^*})$. Example: $T(n) = 8T(n/2) + n = \Theta(n^3)$.
  • Case 2 (balanced): $f(n)$ matches $n^{c^*}$ (up to log factors), so $T(n) = \Theta(n^{c^*} \log n)$. Example: $T(n) = 2T(n/2) + n = \Theta(n \log n)$.
  • Case 3 (root-heavy): $f(n)$ grows polynomially faster than $n^{c^*}$, so $T(n) = \Theta(f(n))$. Example: $T(n) = 2T(n/2) + n^2 = \Theta(n^2)$.

Recursion Tree Method

Expand the recurrence level by level, compute the work at each level, and sum across all levels.

For T(n) = 2T(n/2) + n:

| Level | Nodes | Size per node | Cost at level |
| --- | --- | --- | --- |
| 0 | 1 | $n$ | $n$ |
| 1 | 2 | $n/2$ | $n$ |
| 2 | 4 | $n/4$ | $n$ |
| ... | ... | ... | ... |
| $\log_2 n$ | $n$ | 1 | $n$ |

Each level costs $n$. There are $\log_2 n + 1$ levels. Total: $n \cdot (\log_2 n + 1) = \Theta(n \log n)$.
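
The per-level accounting can be checked numerically. This sketch (with `tree_total` as an illustrative name) sums the tree for $T(n) = 2T(n/2) + n$ when $n$ is a power of two and compares it to $n(\log_2 n + 1)$:

```python
def tree_total(n):
    # Sum the recursion tree for T(n) = 2T(n/2) + n, n a power of two.
    # Level i has 2^i nodes, each doing n/2^i work, so every level costs n.
    total, size, nodes = 0, n, 1
    while size >= 1:
        total += nodes * size   # per-level cost: nodes * work per node = n
        size //= 2
        nodes *= 2
    return total

n = 1024
levels = 10 + 1                  # log2(1024) + 1 = 11 levels
print(tree_total(n))             # → 11264
print(n * levels)                # → 11264, matching n * (log2(n) + 1)
```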

Substitution Method

Guess the answer and prove it by induction.

Claim: $T(n) = 2T(n/2) + n$ has solution $T(n) \le cn \log n$ for some constant $c > 0$ and all $n \ge 2$.

Inductive step: Assume $T(k) \le ck \log k$ for all $2 \le k < n$.

$T(n) = 2T(n/2) + n \le 2c(n/2)\log(n/2) + n = cn(\log n - 1) + n = cn\log n - cn + n$

This is $\le cn \log n$ whenever $c \ge 1$, so the bound holds.
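
A quick numeric sanity check of the bound (not a substitute for the induction): assuming the concrete base case $T(1) = 1$, the constant $c = 2$ works for powers of two:

```python
import math

def T(n):
    # T(n) = 2T(n/2) + n with base case T(1) = 1, for n a power of two.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# log2(1) = 0, so the bound is checked from n = 2 upward; the constant
# c must be large enough to absorb the base case, and c = 2 suffices here.
c = 2
ok = all(T(n) <= c * n * math.log2(n) for n in [2, 4, 8, 16, 256, 4096])
print(ok)  # → True
```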

Complexity Analysis

| Recurrence | Master Case | Solution | Algorithm |
| --- | --- | --- | --- |
| $T(n) = 2T(n/2) + n$ | Case 2 | $\Theta(n \log n)$ | Merge sort |
| $T(n) = T(n/2) + O(1)$ | Case 2 | $\Theta(\log n)$ | Binary search |
| $T(n) = 2T(n/2) + O(1)$ | Case 1 | $\Theta(n)$ | Tree traversal |
| $T(n) = 8T(n/2) + n$ | Case 1 | $\Theta(n^3)$ | Naive matrix multiply |
| $T(n) = 7T(n/2) + n^2$ | Case 1 | $\Theta(n^{\log_2 7}) \approx \Theta(n^{2.81})$ | Strassen multiply |
| $T(n) = 2T(n/2) + n^2$ | Case 3 | $\Theta(n^2)$ | Certain divide-and-conquer |

The critical exponent $c^* = \log_b a$ is the key. For merge sort: $\log_2 2 = 1$, and $f(n) = n = \Theta(n^1)$, matching Case 2.
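
As a quick check of the Strassen row, the critical exponent can be computed directly:

```python
import math

# Strassen: a = 7 subproblems, b = 2, so c* = log2(7).
c_star = math.log2(7)
print(round(c_star, 3))  # → 2.807
# f(n) = n^2 grows polynomially slower than n^2.807, so Case 1 applies
# and T(n) = Theta(n^log2(7)).
```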

Implementation

ALGORITHM SolveMasterTheorem(a, b, f_exponent)
INPUT: a: number of subproblems, b: shrinkage factor, f_exponent: exponent of f(n) = n^f_exponent
OUTPUT: asymptotic solution class
BEGIN
  c_star <- LOG(a) / LOG(b)

  IF f_exponent < c_star THEN
    RETURN "Case 1: T(n) = Theta(n^c_star)"
  ELSE IF f_exponent = c_star THEN
    RETURN "Case 2: T(n) = Theta(n^c_star * log n)"
  ELSE IF f_exponent > c_star THEN
    RETURN "Case 3: T(n) = Theta(n^f_exponent)"
  END IF
END
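
The pseudocode above translates directly to Python for polynomial $f(n) = n^k$. For polynomial $f$, the Case 3 regularity condition holds automatically: $a(n/b)^k = (a/b^k) n^k$, and $a/b^k < 1$ exactly when $k > \log_b a$. A sketch (`solve_master` is an illustrative name; a tolerance handles the floating-point comparison):

```python
import math

def solve_master(a, b, k, tol=1e-9):
    # Classify T(n) = aT(n/b) + n^k by comparing k to c* = log_b(a).
    c_star = math.log(a) / math.log(b)
    if k < c_star - tol:
        return f"Case 1: T(n) = Theta(n^{c_star:.2f})"
    if abs(k - c_star) <= tol:
        return f"Case 2: T(n) = Theta(n^{c_star:.2f} log n)"
    return f"Case 3: T(n) = Theta(n^{k})"

print(solve_master(2, 2, 1))  # merge sort    → Case 2
print(solve_master(8, 2, 1))  # naive matmul  → Case 1
print(solve_master(2, 2, 2))  # root-heavy    → Case 3
```

Note that this only covers $f(n) = n^k$ with $k \ge 0$; recurrences with non-polynomial $f$ (e.g. $n \log n$) need the full Case 2 statement with its $\log^k n$ factor.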

ALGORITHM RecursionTreeSum(a, b, f, n)
INPUT: a: branching factor, b: shrinkage factor, f: cost function, n: problem size
OUTPUT: total work across all levels
BEGIN
  total <- 0
  level <- 0
  size <- n

  WHILE size >= 1 DO
    nodes_at_level <- a^level
    work_per_node <- f(size)
    total <- total + nodes_at_level * work_per_node
    size <- size / b
    level <- level + 1
  END WHILE

  RETURN total
END
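
A Python rendering of the same level-by-level summation, plus a spot check of Case 1 behavior: for $T(n) = 8T(n/2) + n$, doubling $n$ should scale the total by roughly $2^3 = 8$:

```python
def recursion_tree_sum(a, b, f, n):
    # Level i contributes a^i nodes, each doing f(n / b^i) work,
    # until the subproblem size drops below 1.
    total, level, size = 0, 0, float(n)
    while size >= 1:
        total += (a ** level) * f(size)
        size /= b
        level += 1
    return total

# Case 1 spot check: T(n) = 8T(n/2) + n is leaf-dominated, Theta(n^3).
t1 = recursion_tree_sum(8, 2, lambda s: s, 512)
t2 = recursion_tree_sum(8, 2, lambda s: s, 1024)
print(t2 / t1)  # ≈ 8, consistent with cubic growth
```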

ALGORITHM SubstitutionVerify(T, guess, c, n)
INPUT: T: recurrence function, guess: proposed bound function, c: constant, n: input size
OUTPUT: TRUE if T(n) <= c * guess(n) holds at this single n (a numeric sanity check, not an inductive proof)
BEGIN
  IF n <= 1 THEN
    RETURN T(1) <= c * guess(1)
  END IF

  actual <- T(n)
  bound <- c * guess(n)
  RETURN actual <= bound
END

Real-World Applications

  • Merge sort analysis: the recurrence $T(n) = 2T(n/2) + n$ directly gives the $\Theta(n \log n)$ guarantee that makes merge sort a go-to sorting algorithm
  • Strassen's matrix multiplication: reducing 8 recursive multiplications to 7 changes the recurrence from $T(n) = 8T(n/2) + n^2$ to $T(n) = 7T(n/2) + n^2$, dropping complexity from $\Theta(n^3)$ to $\Theta(n^{2.81})$
  • Binary search variants: every divide-and-conquer algorithm has a recurrence; understanding the Master Theorem lets you instantly classify new algorithms
  • Compiler optimization: compilers analyze recursive function call patterns to decide whether to apply tail-call optimization or loop unrolling
  • Parallel algorithm design: the span (critical path length) of parallel divide-and-conquer algorithms follows recurrences that determine parallelism efficiency

Key Takeaways

  • Recurrence relations capture the running time of recursive algorithms by expressing $T(n)$ in terms of smaller subproblems
  • The Master Theorem solves $T(n) = aT(n/b) + f(n)$ by comparing $f(n)$ to the critical exponent $n^{\log_b a}$
  • The recursion tree method is the most intuitive: expand, sum each level, and add up all levels
  • The substitution method is the most rigorous: guess the answer and prove it by induction
  • Most common algorithms (merge sort, binary search, Strassen) fall neatly into one of the three Master Theorem cases