**Introduction**

Recurrence relations are equations that define a sequence in terms of its earlier terms. We encounter recurrences when analyzing algorithms, especially those that follow the __Divide and Conquer__ paradigm. It's essential to have tools to solve these recurrences for time complexity analysis, and this is where the Master Method comes into the picture.


**Master Method**

In the Master Method, we have a known recurrence, and if it matches one of the formats the method covers, we get a straightforward solution to this usually tedious task. The method gives tight bounds on the complexity of our algorithm and is one of the most popular ways of analyzing recursive algorithms, mainly because the commonly encountered recurrences are covered by it.

The steps to use the Master method are as follows.

- Identify whether the recurrence fits the pattern the method expects.
- If it does, substitute the values obtained by comparing the pattern with the recurrence to get the answer.


**Master Method's statements**

In the equation below, the constants satisfy a >= 1 and b > 1, and f(n) is a function that is positive for all sufficiently large n (n > N > 0).

The recurrence must fit the pattern below.

T(n) = a * T(n / b) + f(n)

Whichever case below is satisfied determines our final answer.

- If f(n) = O(n^{logb(a) - ϵ}), then T(n) = Θ(n^{logb(a)}).
- If f(n) = Θ(n^{logb(a)}), then T(n) = Θ(n^{logb(a)} * log(n)).
- If f(n) = Ω(n^{logb(a) + ϵ}), then T(n) = Θ(f(n)).

Here ϵ > 0 is a constant. Case 3 additionally requires the regularity condition a * f(n / b) <= c * f(n) for some constant c < 1 and all sufficiently large n. Let's discuss a few examples below.
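As a rough sketch (not part of the method itself, and assuming f(n) is a plain power n^{k}), the case analysis above can be automated by comparing k against the critical exponent logb(a):

```python
import math

def master_theorem(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k using the Master Method.

    Assumes a >= 1, b > 1 and f(n) = n^k. Returns the Theta bound as a string.
    """
    c = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(k, c):
        return f"Theta(n^{c:.2f} * log n)"   # case 2: f(n) matches n^log_b(a)
    elif k < c:
        return f"Theta(n^{c:.2f})"           # case 1: f(n) grows polynomially slower
    else:
        return f"Theta(n^{k})"               # case 3: f(n) dominates polynomially

print(master_theorem(3, 2, 1))  # T(n) = 3T(n/2) + n -> Theta(n^1.58)
```

Note that this sketch cannot detect the gap cases (e.g., f(n) = n / log(n)), where the Master Method does not apply at all.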

**Examples of Master Method**

We'll take three examples below to help you get used to the Master's Method.

1. T(n) = 3 * T(n / 2) + n

In the above equation, a = 3 >= 1, b = 2 > 1, and f(n) = n once you compare it with the pattern above.

Here, n^{logb(a)} is n^{log2(3)}, and log2(3) ≈ 1.58. So f(n) = O(n^{log2(3) - ϵ}) for some ϵ > 0.

Therefore, we follow case 1, and our answer is Ө(n^{log2(3)}).

2. T(n) = 4 * T(n / 2) + n^{2}

In the above equation a = 4 >= 1, b = 2 > 1 and f(n) = n^{2}.

Here, n^{logb(a)} is n^{log2(4)} = n^{2}, and f(n) = Ө(n^{2}).

Therefore, we follow case 2, and our answer is Ө(n^{2} * log(n)).

3. T(n) = 3 * T(n / 2) + n^{2}

In the above equation a = 3 >= 1, b = 2 > 1 and f(n) = n^{2}.

Here, n^{logb(a)} is n^{log2(3)}, and log2(3) ≈ 1.58. So f(n) = Ω(n^{log2(3) + ϵ}) for some ϵ > 0.

Therefore, we follow case 3, and our answer is Ө(n^{2}).
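To sanity-check the first two bounds empirically (a rough sketch; base-case values are arbitrary, and exact constants don't affect the Θ bound), we can evaluate each recurrence with memoization and watch how T(n) grows when n doubles:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T1(n):  # example 1: T(n) = 3*T(n/2) + n -> Theta(n^log2(3))
    return 1 if n <= 1 else 3 * T1(n // 2) + n

@lru_cache(maxsize=None)
def T2(n):  # example 2: T(n) = 4*T(n/2) + n^2 -> Theta(n^2 * log n)
    return 1 if n <= 1 else 4 * T2(n // 2) + n * n

n = 2**16
# Doubling n should multiply T1 by about 2^log2(3) = 3 ...
print(T1(2 * n) / T1(n))  # close to 3
# ... and T2 by roughly 4 (the extra log factor grows slowly)
print(T2(2 * n) / T2(n))  # close to 4
```

The observed ratios match n^{log2(3)} and n^{2} * log(n) growth respectively.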

**Limitations of Master Method**

- T(n) must be monotonic. For example, if T(n) = sin(n), we can't use the Master Method.
- f(n) must be polynomially comparable with n^{logb(a)}. For example, the Master Method can't solve the recurrence T(n) = 2 * T(n / 2) + n / log(n), because n / log(n) is neither polynomially smaller nor polynomially larger than n^{log2(2)} = n. There's a catch for this one though, and that's when f(n) = Ө(n^{logb(a)} * log^{k}(n)) with k >= 0 (this is an additional case). Then we have our answer as T(n) = Ө(n^{logb(a)} * log^{k+1}(n)). Note that n / log(n) corresponds to k = -1, which even this extended case doesn't cover.
- "a" and "b" must be constants, i.e., they can't depend on the input size. We can't solve the recurrence T(n) = n * T(n / 2) + 1 using the Master Method, as a = n depends on n.
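The extended case can be checked numerically. As a rough sketch, take T(n) = 2 * T(n / 2) + n * log2(n): here f(n) = Ө(n^{log2(2)} * log^{1}(n)), so k = 1 and the prediction is T(n) = Ө(n * log^{2}(n)) (the base-case value below is arbitrary):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):  # extended case with k = 1: T(n) = 2*T(n/2) + n*log2(n)
    return 1 if n <= 1 else 2 * T(n // 2) + n * math.log2(n)

n = 2**16
# Compare against the predicted Theta(n * log^2 n) shape:
print(T(n) / (n * math.log2(n) ** 2))  # ratio stabilizes near 0.5
```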


**FAQs**

**1. What are recurrences and their uses?**

A recurrence defines each term of a sequence in terms of earlier terms. Recurrences are used in Computer Science for time complexity analysis (finding bounds on the number of basic operations performed by an algorithm) as well as in various branches of mathematics, such as Number Theory (the Fibonacci sequence) and Calculus.

An example of a recurrence,

```
T(n) = T(n / 2) + 1, (n > 1)
T(1) = 1
```
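Unrolling this recurrence, n halves at every step, so it takes Θ(log n) steps to reach the base case (this is, for instance, the cost recurrence of binary search). A quick sketch:

```python
def T(n):
    # T(n) = T(n // 2) + 1, with base case T(1) = 1
    return 1 if n <= 1 else T(n // 2) + 1

print(T(1024))  # 11, i.e. log2(1024) + 1
```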

**2. Why should we learn about different methods for solving recurrences?**

Different methods make particular recurrences easier to solve. Depending on the requirement (whether we need a tight bound or just an upper bound), one method may suit better than another, and there are recurrences to which specific methods simply don't apply.

For example, the Substitution method can be applied to essentially any recurrence, but it requires guessing the form of the answer first.

**3. What does O(g(n)), Ө(g(n)), Ω(g(n)) mean?**

Here,

If f(n) = O(g(n)), then f(n) < c * g(n), ∀ n > N > 0 and some c > 0.

If f(n) = Ө(g(n)), then c2 * g(n) < f(n) < c1 * g(n), ∀ n > N > 0 and some c1, c2 > 0.

If f(n) = Ω(g(n)), then f(n) > c * g(n), ∀ n > N > 0 and some c > 0.
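As a concrete illustration (the function and constants below are chosen by hand for this example), f(n) = 2n + 3 satisfies all three bounds with g(n) = n, i.e., f(n) = Ө(n):

```python
def f(n):
    return 2 * n + 3

# f(n) = O(n): f(n) < c * n for all n > N, with c = 3 and N = 3
assert all(f(n) < 3 * n for n in range(4, 10000))

# f(n) = Omega(n): f(n) > c * n for all n > 0, with c = 2
assert all(f(n) > 2 * n for n in range(1, 10000))

# Both together give f(n) = Theta(n)
print("bounds hold")
```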

**4. What is a polynomial function?**

Polynomial functions are of the form f(x) = a_{n}x^{n} + a_{n-1}x^{n-1} + … + a_{2}x^{2} + a_{1}x + a_{0}, where each a_{k} is a real number (0 <= k <= n) and n is a whole number (a non-negative integer). Therefore, f(n) = sqrt(n) isn't a polynomial and neither is f(n) = 1/n. However, f(n) = 0 is a polynomial.

**5. What is a monotonic function?**

It refers to a function that is either entirely non-increasing or entirely non-decreasing.

**Key Takeaways**

This article covers the Master Method, one of the easiest methods to find an asymptotic bound on a recurrence. Other methods to achieve similar objectives are Substitution, Iteration and Recursion Tree. To understand the blog better, refer to the article __here__ about Understanding of Analysis of Algorithms and __here__ about Asymptotic Notations.

Learn more about the C++ STL library from __here__ if you have trouble understanding some part of the code. Visit the link __here__ for carefully crafted courses on campus placement and interview preparation on Coding Ninjas.

Happy learning!