Introduction
Quick Sort is a widely used sorting algorithm in computer science. It's an efficient, comparison-based sorting algorithm that follows a divide-and-conquer approach to sort a list or an array of elements.
We humans love to simplify our work by dividing it into smaller parts. Keeping this fact in mind, people have developed computer algorithms that follow the divide-and-conquer principle. One of these is the quicksort algorithm, used to sort the elements of an array.
Now you may be wondering: you already know some other sorting techniques, so what's the use of learning another? Different algorithms have different time and space complexities, so knowing quicksort is as vital as learning any other sorting algorithm. If you don't know quicksort already, just hang in there, because together we will learn all about it in this article.
What is Quick Sort?
QuickSort is a sorting method that works by dividing a list into smaller groups, sorting those groups, and then putting them back together in order. It's like organizing a messy room by making smaller piles of stuff, arranging each pile, and then stacking them neatly.
QuickSort is like sorting a deck of cards. Imagine we have a mixed-up deck and want to arrange the cards in order. We pick one card as a "pivot" (a reference card), put all smaller cards on one side, and put all larger cards on the other. We then do the same for the two smaller groups of cards, and keep going until every group has just one card. The deck is sorted! It's fast because we compare and move cards only a few times, instead of checking each card against all the others as some other methods do. That's QuickSort: sorting by dividing and conquering!
Working of QuickSort
As we already know, quicksort is used to sort the elements of an array in ascending or descending order, but then how does it implement the divide and conquer principle?
In quicksort, an element is chosen as the pivot. Comparisons with it divide the array into two partitions, where the left side of the pivot has elements smaller than it while the right side has elements larger than it.
Quicksort is then applied to both partitions, and the process continues until the array is sorted.
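The divide-and-conquer idea can be sketched in a few lines of Python. This is an illustrative, out-of-place version; the real algorithm partitions in place, as described in the sections that follow:

```python
def quicksort(arr):
    """Return a new sorted list using the divide-and-conquer idea."""
    if len(arr) <= 1:           # a list of 0 or 1 elements is already sorted
        return arr
    pivot = arr[0]              # choose a pivot (here, simply the first element)
    smaller = [x for x in arr[1:] if x <= pivot]   # left partition
    larger = [x for x in arr[1:] if x > pivot]     # right partition
    # sort each partition recursively and place the pivot between them
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([8, 6, 3, 0, 9, 5]))  # [0, 3, 5, 6, 8, 9]
```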
Now that we know how quicksort works, you are probably thinking, how do we choose the pivot?
There are four ways to choose the pivot:
- Consider the first element as the pivot.
- Consider the last element as the pivot.
- Consider any random element as the pivot.
- Use the median-of-three method.
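These four strategies can be written as small helpers that return a pivot index for the sub-array arr[lo..hi]. This is an illustrative sketch; the function names are our own:

```python
import random

def first_pivot(arr, lo, hi):
    return lo                      # pivot is the first element

def last_pivot(arr, lo, hi):
    return hi                      # pivot is the last element

def random_pivot(arr, lo, hi):
    return random.randint(lo, hi)  # pivot is any element, chosen at random

def median_of_three_pivot(arr, lo, hi):
    # pivot is the index holding the median of arr[lo], arr[mid], arr[hi]
    mid = (lo + hi) // 2
    candidates = sorted([(arr[lo], lo), (arr[mid], mid), (arr[hi], hi)])
    return candidates[1][1]
```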
Let us now consider a quicksort example and understand the four different methods.
First Element as the Pivot
After choosing a pivot, our next step is to swap the pivot with the last element and set up two loop counters, i and j, as shown below:
We mentioned that i and j are loop counters, but for which loops?
In quicksort, we will keep on incrementing i till we find an element larger than the pivot, and we will keep on decrementing j till we find an element smaller than the pivot.
When we find two such elements, we will swap them.
To understand this, let us consider a quicksort illustration.
Here, the element at position i is not greater than the pivot, but the element at position j is already less than the pivot, so we keep incrementing i until our condition is met.
Finally, the element at position i is greater than the pivot, so our next step is to swap the elements at positions i and j.
We keep incrementing i and decrementing j in this way until i crosses j (that is, i > j).
Now that i has crossed j, we will swap the element in the i^{th} position and the pivot.
Here, we can see that the elements to the left of the pivot are less than it, while the elements to the right of the pivot are greater than it. So we have got two partitions and the sorted position of the pivot (8).
Now, we will again apply quicksort to the partitions. But the right partition has only one element, so we'll apply quicksort to the left partition only.
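The walkthrough above can be turned into code. Below is an illustrative Python sketch of the first-element scheme: the pivot is swapped to the end, i scans right past smaller elements, j scans left past larger ones, out-of-place pairs are swapped, and finally the pivot is swapped into position i:

```python
def partition_first(arr, lo, hi):
    """Partition arr[lo..hi] using the first element as the pivot.
    Returns the pivot's final (sorted) index."""
    arr[lo], arr[hi] = arr[hi], arr[lo]      # move the pivot to the end
    pivot = arr[hi]
    i, j = lo, hi - 1
    while i <= j:
        while i <= j and arr[i] < pivot:     # advance i past smaller elements
            i += 1
        while i <= j and arr[j] > pivot:     # retreat j past larger elements
            j -= 1
        if i < j:                            # both counters stuck: swap the pair
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
            j -= 1
        elif i == j:                         # counters met on an element equal
            break                            # to the pivot; nothing to swap
    arr[i], arr[hi] = arr[hi], arr[i]        # put the pivot into position i
    return i

def quicksort_first(arr, lo, hi):
    """Sort arr[lo..hi] in place, recursing on both partitions."""
    if lo < hi:
        p = partition_first(arr, lo, hi)
        quicksort_first(arr, lo, p - 1)
        quicksort_first(arr, p + 1, hi)

data = [8, 6, 3, 0, 9, 5]
quicksort_first(data, 0, len(data) - 1)
print(data)  # [0, 3, 5, 6, 8, 9]
```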
Last Element as the Pivot
In the left partition, we can see that 0 is the smallest element, so it is already in its sorted position. Therefore, considering 0 as the pivot would be a waste of time, so let's use the last element as the pivot here. We will also initialise the counters i and j.
We already know how the counters are used in quicksort, but let's see it once more for the sake of our understanding.
Here, again two partitions are formed where the elements to the left of the pivot are smaller than it, while the elements to the right are larger.
Since we can see that the elements in the left partition are already sorted, we will ignore it. A computer program will, however, sort it again using quicksort.
Random Element as the Pivot
To sort the right partition we got, let us use the third method of considering the pivot.
Instead of any random element, let us choose the element at the middle index as the pivot.
Here, the middle element is 3. So, we will consider that to be the pivot.
Now that we have determined the pivot, we will continue the process as we have before.
Thus, the array is now like this.
The last three elements left can quickly be sorted by choosing the pivot following any of the methods mentioned above to give the sorted array as follows:
The Median of 3 Method
There is another method to choose the pivot element: the median-of-three method.
Previously, we saw that we might encounter a situation where the element we're choosing as a pivot is already in its sorted position. So, considering that element as the pivot would unnecessarily increase the complexity of the algorithm.
To avoid this, the median-of-three method is used to choose the pivot.
Let us consider the same array as an example to understand how this method works.
Here, the first element is 8, the last element is 0, and the median element (at index (0 + 9) / 2 ≈ 4) is 6.
In the median-of-three rule, we first sort these three elements in ascending order and place them in the first, median, and last positions, respectively, as follows:
As shown below, we will swap the element in the median position, i.e., 6, with the element in the second to the last position. The element in the second to the last position will now be our pivot.
Our next step is to assign the counters. i and j will be assigned a little differently from how we were doing it before, but why?
The reason is that we have already sorted the first, last, and median (now the pivot) elements. So, the first and last elements are already in their correct partitions.
Therefore i and j will be assigned as follows:
Therefore, the array is now as follows:
A new pivot is obtained for the left and the right partitions, and the process of quicksort continues to sort the entire array in ascending order.
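The pivot setup described in this section can be sketched as follows. This illustrative helper sorts the first, median, and last elements in place and then tucks the median (the pivot) into the second-to-last position, so the counters only need to scan the elements in between:

```python
def median_of_three(arr, lo, hi):
    """Order arr[lo], arr[mid], arr[hi], then move the median (the pivot)
    to position hi - 1. Returns the pivot value."""
    mid = (lo + hi) // 2
    if arr[mid] < arr[lo]:                       # sort the three elements so that
        arr[lo], arr[mid] = arr[mid], arr[lo]    # arr[lo] <= arr[mid] <= arr[hi]
    if arr[hi] < arr[lo]:
        arr[lo], arr[hi] = arr[hi], arr[lo]
    if arr[hi] < arr[mid]:
        arr[mid], arr[hi] = arr[hi], arr[mid]
    arr[mid], arr[hi - 1] = arr[hi - 1], arr[mid]  # tuck the pivot at hi - 1
    return arr[hi - 1]

a = [8, 1, 6, 9, 0]
print(median_of_three(a, 0, 4), a)  # 6 [0, 1, 9, 6, 8]
```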
Algorithm of QuickSort
Now that we know how quicksort works, we can easily figure out its algorithm.
Step 1: Choose the pivot following any of the four methods (the last element is commonly used as the pivot).
Step 2: Assign the counter i to the leftmost element's index and the counter j to the rightmost element's index, excluding the pivot (which sits at the last position).
Step 3: If i is less than j, proceed.
Step 4: If the element in position i is less than the pivot, increment i.
Step 5: If the element in position j is greater than the pivot, decrement j.
Step 6: If the element in position i is greater than the pivot and the element in position j is less than the pivot, swap the elements in positions i and j.
Step 7: If i is greater than j, swap the element in position i and the pivot.
Step 8: Continue the process for the left and the right partitions formed.
With this, we have completed the partition.
Now, for the quicksort part,
Step 1: Declare a function with three parameters: an array (say arr) and two integers (say i and j) marking the first and last indices of the portion to sort.
Step 2: If i < j, partition the array between positions i and j.
Step 3: Recursively call the function with three parameters, an array (arr) and two integers (i and (partition position - 1)).
Step 4: Recursively call the function with three parameters, an array (arr) and two integers ((partition position + 1), j).
Knowing the algorithm, we can easily use the syntax of different programming languages to write the code for quicksort in C, quicksort in Java or quicksort in Python.
Code implementation of the Quick Sort
The complete code for quicksort, considering the last element as the pivot, is given below:
Implementation in C++
Output
Implementation in C
Output
Implementation in Java
Output
Implementation in Python
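A minimal sketch of quicksort in Python with the last element as the pivot (the Lomuto partition scheme), consistent with the algorithm above:

```python
def partition(arr, low, high):
    """Place arr[high] (the pivot) into its sorted position and
    return that position."""
    pivot = arr[high]
    i = low - 1                              # boundary of the "smaller" region
    for j in range(low, high):
        if arr[j] <= pivot:                  # grow the smaller region
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]   # pivot between the regions
    return i + 1

def quick_sort(arr, low, high):
    """Sort arr[low..high] in place."""
    if low < high:
        p = partition(arr, low, high)        # pivot's sorted position
        quick_sort(arr, low, p - 1)          # left partition
        quick_sort(arr, p + 1, high)         # right partition

arr = [8, 6, 3, 0, 9, 5]
quick_sort(arr, 0, len(arr) - 1)
print(arr)  # [0, 3, 5, 6, 8, 9]
```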
Output
QuickSort Time Complexity Analysis
Now that we know about quicksort, our next job would be to analyse the time complexity.
To do that, let us again consider an array.
Best Case Complexity
Let us consider that each time, the partitioning occurs from the middle. We can visualise it like this:
Here, the number of iterations required to partition the array always sums up to n for each level.
To calculate the number of levels, we have to determine the number of times we divide the size of the initial array by two until we get 1.
Therefore,
n / 2^{k} = 1
=> n = 2^{k}
=> log_{2}(n) = log_{2}(2^{k})
=> k = log_{2}(n)
Hence, with n work per level across log_{2}(n) levels, the total time complexity is O(n log n).
In practice, however, the partitioning rarely happens exactly at the middle. The perfect middle split is the ideal case, which is why it gives the best-case complexity.
Average Case Complexity
For average-case complexity, let us consider the partitioning to be in a 3:1 ratio.
The total number of iterations for partitioning in each level will again be n.
For the number of levels, we have to determine the number of times we divide the size of the initial array by four till we get 1.
Therefore,
n / 4^{k} = 1
=> n = 4^{k}
=> log_{4}(n) = log_{4}(4^{k})
=> k = log_{4}(n)
Since log_{4}(n) = log_{2}(n) / 2, the number of levels is still proportional to log n. Hence, the average-case time complexity is again O(n log n).
Worst Case Complexity
The worst-case scenario occurs when the partitions are maximally unbalanced, for example, when we sort an already sorted array using the first or last element as the pivot. So, the partitioning diagram will be as follows:
Therefore, for large values of n, the time complexity will be,
(n − 1) + (n − 2) + (n − 3) + … + 2 + 1 = n(n − 1) / 2 = (n^{2} − n) / 2
Hence, the worst-case complexity is O(n^{2}).
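As an illustrative sanity check, we can count the pivot comparisons made by a last-element-pivot quicksort on an already sorted array; the count matches n(n − 1)/2 exactly:

```python
def quicksort_counting(arr, lo, hi, counter):
    """Last-element-pivot quicksort that counts comparisons with the pivot."""
    if lo < hi:
        pivot = arr[hi]
        i = lo - 1
        for j in range(lo, hi):
            counter[0] += 1                  # one comparison against the pivot
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
        p = i + 1
        quicksort_counting(arr, lo, p - 1, counter)
        quicksort_counting(arr, p + 1, hi, counter)

n = 10
counter = [0]
quicksort_counting(list(range(n)), 0, n - 1, counter)  # already sorted input
print(counter[0], n * (n - 1) // 2)  # 45 45
```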
If you've come this far, don't quit now. Try out QuickSort on Coding Ninjas Studio and get it accepted right away.
Advantages of Merge Sort Over QuickSort
Like quicksort, merge sort is another divide and conquer algorithm following a similar method to sort arrays.
Then why do we need two different algorithms?
To understand this, let us compare and contrast the two algorithms.
| Property | QuickSort | Merge Sort |
| --- | --- | --- |
| Partition of elements | Partitions can be made in any ratio. | Partition is made into two halves. |
| Worst-case complexity | O(n^{2}) | O(n log n) |
| Suitability | Works well on small arrays. | Works well on arrays of any size. |
| Speed | Comparatively faster for small data sets. | Consistent speed across data set sizes. |
| Storage requirements | Less | More |
| Efficiency | Can become inefficient on large data sets if pivots are chosen poorly. | Comparatively more efficient for large data sets. |
| Sorting method | Internal | External |
| Stability | Not stable | Stable |
| Preferred use | Arrays | Linked lists |
Therefore, we can conclude that merge sort is often more helpful than quicksort when stability or a guaranteed O(n log n) worst case is needed, especially for large data sets or linked lists.
Advantages of QuickSort
There are several advantages of the QuickSort algorithm:
- It is generally faster than many other sorting algorithms, especially for large datasets. It has an average and best-case time complexity of O(n log n), making it efficient for most scenarios
- It is an in-place sorting algorithm, which means it doesn't require additional memory for sorting. It sorts the elements within the original array, making it memory-efficient
- It exhibits good performance on average, especially when the pivot selection and partitioning methods are optimized
- It's one of the most widely used sorting algorithms and is often the default sorting algorithm in programming libraries
Disadvantages of QuickSort
Along with the advantages, the QuickSort algorithm has some disadvantages as well:
- In the worst-case scenario, it can have a time complexity of O(n^{2}). This occurs when the pivot selection consistently leads to unbalanced partitions
- It is not a stable sorting algorithm. It may change the relative order of equal elements in the sorted array
- The choice of the pivot element significantly affects its performance. Poor pivot selection can lead to inefficient sorting
- It is less suitable for sorting linked lists compared to arrays because it relies on random access to elements
- It uses a recursive approach, which may lead to stack overflow errors for very large datasets
Frequently Asked Questions
How do you perform quicksort?
In quicksort, we first select a pivot and compare the remaining elements with it to form two partitions. The same process is then repeated on both partitions until the array is sorted.
What is meant by quicksort?
Quicksort is a divide and conquer algorithm used to sort the elements in an array.
Why is quicksort called Quick?
Quicksort is called quick because it is faster than other sorting algorithms when sorting small data sets.
Why is quicksort the best sorting algorithm?
QuickSort is favored for its efficiency, in-place sorting, and good average-case performance with O(n log n) time complexity. Its adaptability via pivot selection strategies helps mitigate worst-case scenarios. However, choosing the best sorting algorithm depends on specific requirements, and other algorithms may be more suitable in some cases.
Conclusion
In this article, we learned about quicksort, how it works and how to implement it, but this isn't the end.
Keeping the theoretical knowledge at our fingertips helps us get about half the work done. To gain complete understanding, practice is a must. A variety of coding questions from interviews are available here.
Also check out - Strong Number
Apart from this, you can also check Coding Ninjas Studio, where youβll find thousands of other coding questions to practice and interview experiences of scholars in renowned product-based companies.
Happy learning!