**Introduction**

You may have heard Space Complexity mentioned in discussions about the analysis of algorithms. It is an essential topic for interview preparation and for building a deeper understanding of algorithms. This article will help you understand why Space Complexity matters when designing or learning a new algorithm.

Also see, __Morris Traversal for Inorder__ and __Rabin Karp Algorithm__

**What is Space Complexity?**

Space Complexity is the measure of memory consumed by a program to operate on an input of a given size. Hence, space complexity is essentially the sum of the auxiliary memory used and the memory used by the input. However, this definition isn't popularly used for comparing algorithms; otherwise, the space complexity of bubble sort and merge sort would be the same, O(n).

This is why it's best to say that an algorithm like bubble sort requires O(1) extra memory while merge sort requires O(n) extra memory when the discussion is about the complexity analysis of algorithms.

Read More About, __Data Structures and Algorithms__ and __Application of Graph in Data Structure__

**Discussion and examples**

**1. Consider the code given below -**

```
#include <bits/stdc++.h>
using namespace std;

int32_t main(){
    int a = 1, b = 2;
    int prod = a * b;
    cout << prod << endl;
    return 0;
}
```

**Space Complexity:** O(1)

**Explanation:**

We create 3 variables in the above program and print the product. The exact memory consumed by this program (assuming int = 4 bytes; it can be 2 bytes on some old 16-bit computers) will be equal to 3 * 4 = 12 bytes.

**2. Consider the code given below -**

```
#include <bits/stdc++.h>
using namespace std;

int32_t main(){
    int n = 10;
    int a[n];
    for(int i = 0; i < n; i++){
        // the loop runs n times (here, 10)
        a[i] = i;
    }
    return 0;
}
```

**Space Complexity: **O(n)

**Explanation:**

We create an array of integers 'a' of size n, which consumes 4 * n bytes. In addition to this, we create two variables, 'i' and 'n', contributing 4 bytes each, plus a few bytes of bookkeeping for the array itself. In total, we get 4 * n bytes plus a constant. The constant part does not grow with the input, and thus the memory consumed by the above program can be said to be O(n) in terms of big-O notation.

**3. Consider the code given below -**

```
#include <bits/stdc++.h>
using namespace std;

int stack_memory(int n){
    if(n == 1) return 1;
    return n + stack_memory(n / 2);
}

int32_t main(){
    int n = 10;
    cout << stack_memory(n) << endl;
    return 0;
}
```

**Space Complexity:** O(log₂(n))

**Explanation:**

This case is slightly different. Because of recursion, the previous state has to be saved on the stack memory at every call of the function stack_memory(). The function is called floor(log₂(n)) + 1 times in total, so we store floor(log₂(n)) + 1 frames on the stack. The rest of the memory, consumed by temporary variables, is constant; therefore, the space complexity is O(log₂(n)). As shown below, the same computation can be implemented with a while loop in O(1) space.

```
#include <bits/stdc++.h>
using namespace std;

int32_t main(){
    int n = 10;
    int ans = 1;
    while(n != 1){
        ans += n;
        n /= 2;
    }
    cout << ans << endl;
    return 0;
}
```

The recursive version performs the same task with the same result, but the iterative approach doesn't consume the extra O(log₂(n)) space. There is a catch in the case of tail recursion, where recursive functions don't store previous calls on the stack memory; refer __here__ for more about tail recursion.

Also see, __Longest Common Substring__

**FAQs**

**List an example where increasing the Space complexity led to improvements in time complexity.**

A famous example is merge sort, which consumes O(n) memory and O(n * log(n)) time, while bubble sort takes O(1) memory but pays with O(n²) time. This tradeoff allows a significant boost from an O(n²) time algorithm to an O(n * log(n)) time algorithm. There are many such cases; for example, most dynamic programming algorithms trade extra memory for time.

**What is Auxiliary Memory?**

In the context of algorithm analysis, auxiliary memory (or auxiliary space) is the extra memory an algorithm uses beyond what is needed to store the input itself. (In computer architecture, the same term also refers to cheap, slow, large secondary storage such as HDDs, where data is kept long-term when not in use.)

**If a recursive approach is always slower and consumes memory, should one ever use it?**

The recursive method is often slower and consumes stack memory on every function call compared to an iterative approach. It shines when you need to code faster and write smaller functions that are less prone to bugs, as with the Tower of Hanoi problem, which would otherwise be difficult to solve without recursion. This can matter in a contest with limited time; for example, it's often more straightforward to code a dynamic programming solution using recursion, with fewer missed edge cases.

**What is stack memory?**

It's a memory usage scheme that stores and retrieves chunks of memory temporarily in a last-in-first-out manner, i.e., the last inserted chunk of data is the first one accessed.

**What is Tail Recursion?**

Tail recursion occurs when the last action a function performs is the recursive call itself. In such a case, there is no need to store the previous state of the function, and as such, no extra stack memory is consumed (the compiler takes care of this optimization). The recursive implementation of Euclid's GCD algorithm is an excellent example of tail recursion.

**Key Takeaways**

This article covers the basics of space complexity and discusses its importance and the tradeoffs between space and time complexity when choosing an algorithm for our needs. To understand the blog better, refer to the content __here__ about time complexity analysis, and refer __here__ for a quick revision on asymptotic notations.

Read More - __Time Complexity of Sorting Algorithms__

Learn more about the C++ STL library from __here__ if you have trouble understanding some part of the code. Visit the link __here__ for carefully crafted courses on campus placement and interview preparation on Coding Ninjas.

Happy learning!