Analysing Algorithms Using Master Theorem
Introduction
In medieval times, mighty rulers would expand their kingdoms by using a principle called divide and conquer. But have you ever wondered why their methods were so effective?
It was because dividing a large territory into smaller ones and conquering them was easier than trying to win over the big one. Similarly, while solving a problem, it is easier to divide the problem into smaller parts that we can solve individually.
Finding the time complexity of any algorithm is a significant part of writing good code, but finding the time complexity of an algorithm defined by a recurrence relation is tricky. This is where the divide-and-conquer idea helps, in the form of the master theorem (analysis of algorithms).
In this article, we will discuss what the master theorem is. We will also see how it is used with recurrence relations to find the time complexity.
What is the Master Theorem (Analysis of Algorithms)?
Recursion is an important technique used while writing code, but we often need to analyse a recursive algorithm asymptotically to find its efficiency in terms of time complexity.
There are three methods used to analyse recurrence relations:
Substitution Method: Using mathematical induction to prove an assumed solution.
Recurrence Tree Method: Adding the time taken by every level in a recurrence tree.
Master Method: A direct method to find the solution using formulas.
In this article, we will be focusing only on the master theorem. The master theorem is used to directly find the time complexity of recursive functions whose runtime can be expressed in the following form:
T(n) = a·T(n/b) + f(n), where a ≥ 1 and b > 1
where n = size of the problem,
a = number of subproblems,
b = the factor by which the problem size shrinks, so each subproblem has size n/b,
f(n) = work done outside the recursion, such as dividing the problem into subproblems and combining their solutions.
f(n) is of the form f(n) = Θ(n^k · log^p n).
So, on substituting f(n),
T(n) = a·T(n/b) + Θ(n^k · log^p n), where k ≥ 0 and p ∈ R
For example, for binary search the runtime expression is T(n) = T(n/2) + O(1). In this general form, different values of the parameters satisfy different conditions, and the time complexity is calculated based on which condition holds.
Let us see what those conditions are and how to calculate the time complexity.
How to use Master Theorem (Analysis of Algorithms)?
As we already saw, the runtime of an algorithm must be expressible in the following form to apply the master theorem:
T(n) = a·T(n/b) + Θ(n^k · log^p n), where a ≥ 1, b > 1, k ≥ 0 and p ∈ R
where n = size of the problem,
a = number of subproblems,
b = the factor by which the problem size is divided, so each subproblem has size n/b.
Now, let us consider three cases.
Case 1: log_b(a) > k
For this condition, the time complexity will be O(n^x), where x = log_b(a).
Example:

Given expression, T(n) = 2·T(n/2) + 1
Solution:
T(n) = 2·T(n/2) + 1
∴ a = 2, b = 2 and Θ(n^k · log^p n) = 1
1 = Θ(n^0 · log^0 n)
∴ k = 0 and p = 0
log_b(a) = log_2(2) = 1 > 0
∴ log_b(a) > k
By case 1,
x = log_2(2) = 1
Time complexity = O(n^1) = O(n)
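As a quick numerical sanity check of case 1, we can expand the recurrence directly. The sketch below (a minimal Python check with an assumed base case T(1) = 1) shows that T(n) = 2·T(n/2) + 1 grows linearly for powers of two, matching the O(n) bound:

```python
def T(n):
    """Expand T(n) = 2*T(n/2) + 1 directly, with assumed base case T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 1

# For powers of two this recurrence solves to T(n) = 2n - 1, so the
# ratio T(n)/n settles near the constant 2, confirming O(n) growth.
for j in (4, 8, 12, 16):
    n = 2 ** j
    print(n, T(n), T(n) / n)
```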
Case 2: log_b(a) = k
This condition is further divided into three more parts:
(a) If p > -1, the time complexity will be O(n^k · log^(p+1) n)
(b) If p = -1, the time complexity will be O(n^k · log(log n))
(c) If p < -1, the time complexity will be O(n^k)
Examples:

Given expression, T(n) = 2·T(n/2) + n
Solution:
T(n) = 2·T(n/2) + n
∴ a = 2, b = 2 and Θ(n^k · log^p n) = n
n = Θ(n^1 · log^0 n)
∴ k = 1 and p = 0
log_b(a) = log_2(2) = 1
∴ log_b(a) = k
p = 0 > -1
By case 2 (a),
Time complexity = O(n^k · log^(p+1) n)
= O(n^1 · log^(0+1) n)
= O(n · log^1 n)
= O(n·log n)
Given expression, T(n) = 2·T(n/2) + n/log n
Solution:
T(n) = 2·T(n/2) + n/log n
∴ a = 2, b = 2 and Θ(n^k · log^p n) = n/log n
n/log n = Θ(n^1 · log^(-1) n)
∴ k = 1 and p = -1
log_b(a) = log_2(2) = 1
∴ log_b(a) = k
p = -1
By case 2 (b),
Time complexity = O(n^k · log(log n))
= O(n^1 · log(log n))
= O(n·log(log n))
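We can also check case 2 (b) numerically. The sketch below (assuming a base case T(2) = 1 and restricting n to powers of two so that log₂ n is an integer) expands T(n) = 2·T(n/2) + n/log₂ n bottom-up and compares the result with the predicted Θ(n·log(log n)) growth:

```python
import math

t = 1.0                       # assumed base case: T(2) = 1
for j in range(2, 61):        # n = 2^j for j = 2 .. 60
    n = 2.0 ** j
    t = 2 * t + n / j         # log2(n) = j when n = 2^j

# Compare T(n) against n * log(log n); the ratio approaches 1 as n
# grows, matching the case 2 (b) bound O(n * log(log n)).
ratio = t / (2.0 ** 60 * math.log(60))
print(round(ratio, 3))
```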
Given expression, T(n) = 2·T(n/2) + n/log² n
Solution:
T(n) = 2·T(n/2) + n/log² n
∴ a = 2, b = 2 and Θ(n^k · log^p n) = n/log² n
n/log² n = Θ(n^1 · log^(-2) n)
∴ k = 1 and p = -2
log_b(a) = log_2(2) = 1
∴ log_b(a) = k
p = -2 < -1
By case 2 (c),
Time complexity = O(n^k)
= O(n^1)
= O(n)
Case 3: log_b(a) < k
This condition is again further divided into two more parts:
(a) If p ≥ 0, the time complexity will be O(n^k · log^p n)
(b) If p < 0, the time complexity will be O(n^k)
Examples:

Given expression, T(n) = T(n/2) + n^2
Solution:
T(n) = 1·T(n/2) + n^2
∴ a = 1, b = 2 and Θ(n^k · log^p n) = n^2
n^2 = Θ(n^2 · log^0 n)
∴ k = 2 and p = 0
log_b(a) = log_2(1) = 0
∴ log_b(a) < k
p = 0 ≥ 0
By case 3 (a),
Time complexity = O(n^k · log^p n)
= O(n^2 · log^0 n)
= O(n^2)
Given expression, T(n) = 4·T(n/2) + n^3/log n
Solution:
T(n) = 4·T(n/2) + n^3/log n
∴ a = 4, b = 2 and Θ(n^k · log^p n) = n^3/log n
n^3/log n = Θ(n^3 · log^(-1) n)
∴ k = 3 and p = -1
log_b(a) = log_2(4) = 2 < 3
∴ log_b(a) < k
p = -1 < 0
By case 3 (b),
Time complexity = O(n^k)
= O(n^3)
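The three cases (and their sub-cases) above can be folded into one small helper. The following Python sketch (the function name and output format are my own, not a standard API) classifies a recurrence T(n) = a·T(n/b) + Θ(n^k · log^p n) and returns its Θ-bound as a string:

```python
import math

def master(a, b, k, p, eps=1e-9):
    """Classify T(n) = a*T(n/b) + Theta(n^k * log^p n) by the master
    theorem and return the resulting Theta-bound as a string."""
    if a < 1 or b <= 1 or k < 0:
        raise ValueError("master theorem does not apply")

    def theta(kk, q):
        # Pretty-print Theta(n^kk * log^q n), dropping trivial factors.
        parts = []
        if kk:
            parts.append("n" if kk == 1 else f"n^{kk:g}")
        if q:
            parts.append("log n" if q == 1 else f"log^{q:g} n")
        return "Theta(" + " * ".join(parts or ["1"]) + ")"

    x = math.log(a, b)                      # critical exponent log_b(a)
    if abs(x - k) < eps:                    # Case 2: log_b(a) = k
        if p > -1 + eps:                    # (a): p > -1
            return theta(k, p + 1)
        if abs(p + 1) < eps:                # (b): p = -1
            return (theta(k, 0)[:-1] + " * log log n)") if k else "Theta(log log n)"
        return theta(k, 0)                  # (c): p < -1
    if x > k:                               # Case 1: log_b(a) > k
        return theta(x, 0)
    return theta(k, max(p, 0))              # Case 3: log_b(a) < k
```

For example, master(2, 2, 1, 0) reproduces the O(n·log n) result for T(n) = 2T(n/2) + n, and master(4, 2, 3, -1) reproduces the O(n^3) result derived above for case 3 (b).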
Practical Examples of Master Theorem (Analysis of Algorithms)
Now that we know how to use the master theorem (analysis of algorithms), let us see where it is used in practice.

Binary Search: The runtime of binary search (using recursion) is as follows:
T(n) = T(n/2) + O(1)
Solution:
T(n) = 1·T(n/2) + O(1)
∴ a = 1, b = 2 and Θ(n^k · log^p n) = O(1)
O(1) = Θ(n^0 · log^0 n)
∴ k = 0 and p = 0
log_b(a) = log_2(1) = 0
∴ log_b(a) = k
p = 0 > -1
By case 2 (a),
Time complexity = O(n^k · log^(p+1) n)
= O(n^0 · log^(0+1) n)
= O(1 · log^1 n)
= O(log n)
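The recurrence above comes from code along these lines, a minimal recursive binary search sketch in Python over a sorted list (the names are illustrative): one recursive call on half the range (a = 1, b = 2) plus constant work (f(n) = O(1)).

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursive binary search on a sorted list: T(n) = T(n/2) + O(1)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                 # empty range: target absent
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:       # recurse into the right half
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)   # left half
```

For example, binary_search([1, 3, 5, 7, 9], 7) returns index 3, and a missing target returns -1.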

Merge Sort: The runtime of the algorithm for merge sort using recursion is as follows:
T(n) = 2T(n/2) + O(n)
Solution:
T(n) = 2·T(n/2) + O(n)
∴ a = 2, b = 2 and Θ(n^k · log^p n) = O(n)
O(n) = Θ(n^1 · log^0 n)
∴ k = 1 and p = 0
log_b(a) = log_2(2) = 1
∴ log_b(a) = k
p = 0 > -1
By case 2 (a),
Time complexity = O(n^k · log^(p+1) n)
= O(n^1 · log^(0+1) n)
= O(n · log^1 n)
= O(n·log n)
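That recurrence corresponds to an implementation along the following lines, a minimal (not in-place) merge sort sketch in Python: two recursive calls on halves (a = 2, b = 2) plus a linear-time merge (f(n) = O(n)).

```python
def merge_sort(arr):
    """Sort a list via merge sort: T(n) = 2*T(n/2) + O(n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # first recursive call
    right = merge_sort(arr[mid:])    # second recursive call
    # Merge step: the O(n) work done outside the recursion.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```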
Limitations of Master Theorem (Analysis of Algorithms)
We have already seen various examples of analysing time complexity using the master theorem, but can we use it for every recurrence of the given form? No.
Here are the conditions under which we cannot use the master theorem:

If T(n) is not monotone, for example T(n) = sin n. The theorem requires T(n) to be a monotonically increasing function.

If f(n) is not a polynomial. For example, if T(n) = 2T(n/2) + 2^n, then f(n) = 2^n is exponential, not polynomial, so we cannot use the master theorem here.

If a is not a constant. For example, if T(n) = 2^n·T(n/3) + n·log n, here a = 2^n, which is not a constant. So, T(n) cannot be analysed using the master theorem.
How do you memorise the Master Theorem?
After seeing a few master theorem examples, are you puzzled? Don't be concerned; it's pretty normal. The best way to remember the cases is to practise applying them.
Given below are some practice problems, along with their answers. Hints are included too, in case you get stuck, but try to solve each problem before looking at them.
1. T(n) = 2^n·T(n/2) + n^n
Hint: Check the value of a
Answer: Does not apply
2. T(n) = 2T(n/4) + n^0.51
Hint: Use Case 3
Answer: T(n) = O(n^0.51)
3. T(n) = 0.5T(n/2) + 1/n
Hint: Check the value of a
Answer: Does not apply
4. T(n) = √2·T(n/2) + log n
Hint: Use Case 1
Answer: T(n) = O(√n)
5. T(n) = 64T(n/8) − n^2·log n
Hint: Check the value of f(n)
Answer: Does not apply
6. T(n) = T(n/2) + n(2 − cos n)
Hint: Check the value of f(n)
Answer: Does not apply
7. T(n) = 7T(n/2) + 3n^2 + 2
Hint: Use Case 1
Answer: T(n) = O(n^2.81)
8. T(n) = 3T(n/3) + n/2
Hint: Use Case 2
Answer: T(n) = O(n·log n)
9. T(n) = 3T(n/2) + n^2
Hint: Use Case 3
Answer: T(n) = O(n^2)
10. T(n) = 4T(n/2) + n^2
Hint: Use Case 2
Answer: T(n) = O(n^2·log n)
Frequently Asked Questions
What is Master Theorem?
The master theorem is a technique that gives us formulas to directly find the time complexity of an algorithm defined by a recurrence relation.
Why do we use Master Theorem?
We use the master theorem to find the time complexity of algorithms containing recurrence relations.
How many cases are there in Master Theorem?
There are three cases in Master Theorem. Any recurrence that falls into one of these three categories can be solved.
When can master theorem not be applied?
We cannot apply the master theorem if T(n) is not monotone, if f(n) is not a polynomial, or if a is not a constant.
What are the methods to analyse a recurrence relation?
There are three methods to analyse a recurrence relation: Substitution method, Recurrence tree method and Master method.
Conclusion
In this article, we learnt about the master theorem (analysis of algorithms) and how we use it to directly find the time complexity of an algorithm defined by a recurrence relation.
We also saw that it works as a shortcut that gives the answer in an instant, and we practised a few problems. Apart from this, there are other concepts that you can learn from the free video lectures on Coding Ninjas.
You can also try the DSA problems available on CodeStudio. The questions there are frequently asked in interview rounds and will help you master efficient coding techniques. They also come with a bonus of interview experiences of candidates at big product-based companies.
Happy learning!