Discussion and examples
1. Consider the code given below:
#include <bits/stdc++.h>
using namespace std;

int32_t main() {
    int a = 1, b = 2;
    int prod = a * b;
    cout << prod << endl;
    return 0;
}
Space Complexity: O(1)
Explanation:
We create three variables in the above program and print their product. Assuming an int takes 4 bytes (it can be 2 bytes on some old 16-bit machines), the exact memory consumed by this program is 3 * 4 = 12 bytes. Since this is a constant that doesn't grow with any input, the space complexity is O(1).
2. Consider the code given below:
#include <bits/stdc++.h>
using namespace std;

int32_t main() {
    int n = 10;
    int a[n];
    for (int i = 0; i < n; i++) {
        // the loop runs n times, filling the array
        a[i] = i;
    }
    return 0;
}
Space Complexity: O(n)
Explanation:
We create an array of integers “a” of size n, which occupies 4 * n bytes. In addition, the variables “i” and “n” contribute 4 bytes each. In total, the program consumes roughly 4 * n + 8 bytes. The constant part doesn't grow with the input, so the memory consumed by the above program is O(n) in terms of big-O notation.
3. Consider the code given below:
#include <bits/stdc++.h>
using namespace std;

int stack_memory(int n) {
    if (n == 1) return 1;
    return n + stack_memory(n / 2);
}

int32_t main() {
    int n = 10;
    cout << stack_memory(n) << endl;
    return 0;
}
Space Complexity: O(log_{2}(n))
Explanation:
This case is slightly different. Because of recursion, the state of each call of stack_memory() has to be stored on the stack until the calls below it return. Since n is halved on every call, the function is called floor(log_{2}(n)) + 1 times in total, so that many stack frames exist at once. The rest of the memory, consumed by temporary variables, is a constant; therefore, the space complexity is O(log_{2}(n)). As shown below, the same computation can be implemented with a while loop in O(1) space.
#include <bits/stdc++.h>
using namespace std;

int32_t main() {
    int n = 10;
    int ans = 1;
    while (n != 1) {
        ans += n;
        n /= 2;
    }
    cout << ans << endl;
    return 0;
}
The iterative version performs the same task with the same result, but it doesn't consume the extra O(log_{2}(n)) stack space. There is a catch, however, in the case of tail recursion, where recursive calls don't need to keep previous frames on the stack; refer here for more about tail recursion.
FAQs

List an example where increasing the Space complexity led to improvements in time complexity.
A famous example is merge sort, which consumes O(n) extra memory and O(n * log(n)) time, while bubble sort takes O(1) memory but pays with O(n^{2}) time. The tradeoff buys a significant speedup, from an O(n^{2}) algorithm to an O(n * log(n)) one. There are countless such cases; dynamic programming algorithms, for example, trade memory for time in exactly this way.

What is Auxiliary Memory?
Auxiliary memory refers to the cheap, slow, and large storage units in a computer, such as HDDs. Most of the data related to a program is kept there for long-term storage when not in use.

If a recursive approach is always slower and consumes memory, should one ever use it?
The recursive approach is often slower and consumes stack memory on every function call compared to an iterative approach. It shines when you need to code quickly and write smaller functions that are less prone to bugs, such as the Tower of Hanoi problem, which would be difficult to solve without recursion. This may be exactly what you need in a contest with limited time; for example, it's often more straightforward to code a dynamic programming solution using recursion, with fewer missed edge cases.

What is stack memory?
It's a region of memory that stores data temporarily in a last-in-first-out (LIFO) manner, i.e., the last inserted chunk of data is the first one that can be accessed. Each function call gets a frame on the stack that is freed when the call returns.

What is Tail Recursion?
Tail recursion occurs when the last operation a function performs is the recursive call itself. In such a case, there is no need to store the previous states of the function, and so no extra stack memory is consumed (the compiler takes care of this optimization). The recursive implementation of Euclid's GCD algorithm is an excellent example of tail recursion.
Key Takeaways
This article covered the basics of space complexity, its importance, and the tradeoffs between space and time complexity needed to choose an algorithm for our needs. To understand the blog better, refer to the content here about time complexity analysis, and refer here for a quick revision of asymptotic notations.
Read more: Time Complexity of Sorting Algorithms
If you have trouble understanding some part of the code, learn more about the C++ STL library from here. Visit the link here for carefully crafted courses on campus placement and interview preparation on Coding Ninjas.
Happy learning!