Last Updated: Sep 22, 2023

# 6 Sorting Algorithms, Features and Functions

Yukti Kumari

## Introduction

Sorting algorithms are used to put the elements of a list in a particular order. The order can be alphabetical or numerical, and either ascending or descending.

Consider a phone book directory: we may arrange the contacts alphabetically by name, or numerically by age. Almost every list we get from a computer has been sorted in some way. Some of this sorting is invisible to users, but it is still performed by a sorting algorithm.

## Why are sorting algorithms important?

Sorting algorithms are important because we rely on sorting on a daily basis, including the following:

• The way we arrange our books on the bookshelf

• The way our clothes are organized

• Dishes in the kitchen are arranged in some sort of order

• The way the rows are arranged at the time of morning prayers in the school

Sorting of all these things may include any sorting order as mentioned below:

• Arranging the books on the basis of chronology

• Arranging clothes on the basis of new to old

• Students lining up for morning prayers in ascending order of height

## Classification of sorting algorithm

Sorting may be classified into different types. The major sorting algorithms covered here are Bubble Sort, Selection Sort, Insertion Sort, Quick Sort, Merge Sort, and Heap Sort.

Let’s explain them with the help of examples.

## Bubble Sort

In bubble sort, adjacent elements that are in the wrong order are swapped repeatedly until the correct order is achieved. The disadvantage of bubble sort is that it is quite slow.

For Example: Consider an unordered list [4, 6, 2, 1].

Pass 1

● 4 < 6 : no change [4, 6, 2, 1]

● Now move next, 6 > 2 : swap the elements [4, 2, 6, 1]

● Now 6 > 1 : swap the elements [4, 2, 1, 6]

Pass 2

● 4 > 2 : swap the elements [2, 4, 1, 6]

● 4 > 1 : swap the elements [2, 1, 4, 6]

● 4 < 6 : no change is needed [2, 1, 4, 6]

Pass 3

● 2 > 1 : swap the elements [1, 2, 4, 6]

● 2 < 4 : no change is needed [1, 2, 4, 6]

● 4 < 6 : no change is needed [1, 2, 4, 6]

Pass 4

● 1 < 2 : no change is needed [1, 2, 4, 6]

● 2 < 4 : no change is needed [1, 2, 4, 6]

● 4 < 6 : no change is needed [1, 2, 4, 6]
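The passes above can be sketched in Python. This is an illustrative implementation, not code from the article; the `swapped` early-exit flag is an optional optimization on top of the basic passes:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order elements until sorted."""
    arr = list(items)
    n = len(arr)
    for i in range(n - 1):           # one pass per outer iteration
        swapped = False
        for j in range(n - 1 - i):   # the last i elements are already in place
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:              # a pass with no swaps means we are done
            break
    return arr

print(bubble_sort([4, 6, 2, 1]))  # [1, 2, 4, 6]
```

Running it on the example list `[4, 6, 2, 1]` reproduces the four passes traced above.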

### Bubble Sort Complexity

Bubble sort has a worst-case and average time complexity of O(n²), which makes it quite slow. Hence, for a large set of data, this sorting algorithm is impractical.

## Selection Sort

Selection sort repeatedly finds the minimum element from an unsorted array and puts it at the beginning of the array. It is an in-place comparison-based sorting algorithm.

Two sub-lists are maintained within the same array in selection sort:

• The unsorted sub-list
• The sorted sub-list

Initially, the sorted sub-list is empty and the unsorted sub-list contains all the elements. On each pass, the smallest element is selected from the unsorted sub-list and placed at its leftmost position, where it becomes part of the sorted sub-list.

Steps of Selection Sort:

• The first step is to iterate over the unsorted part of the array

• When we reach the end, we know the minimum element of the unsorted part

• Swap that minimum element with the leftmost element of the unsorted part

• That leftmost element now becomes part of the sorted sub-list and is excluded from the unsorted part in the next iteration

• The steps are repeated until all the elements are sorted
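These steps can be sketched in Python; this is an illustrative implementation, not code from the article:

```python
def selection_sort(items):
    """Repeatedly move the minimum of the unsorted part to its front."""
    arr = list(items)
    n = len(arr)
    for i in range(n - 1):              # arr[:i] is the sorted sub-list
        min_idx = i
        for j in range(i + 1, n):       # scan the unsorted part for the minimum
            if arr[j] < arr[min_idx]:
                min_idx = j
        # swap the minimum to the leftmost unsorted position
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([29, 10, 14, 37, 13]))  # [10, 13, 14, 29, 37]
```

Note that each outer iteration performs exactly one swap, which is why selection sort is useful when writes are expensive.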


### Selection Sort Complexity

Selection sort makes O(n²) comparisons in every case, since the whole unsorted sub-list is scanned on each pass, but it performs at most n − 1 swaps.

## Insertion Sort

Insertion sort is simple and easy to implement, though its performance is not outstanding. It builds the sorted list one item at a time. In each iteration, an item is taken from the array and inserted at its correct position by comparing it with its neighbours in the sorted part. The process is repeated until there are no more unsorted items in the list.

Steps of Insertion Sort:

• Leave the first element of the list in place, and move to the next element in the array.

• Now compare with all the elements in the sorted sub-list.

• Shift all the elements in the sorted sub-list that are greater than the element to be inserted.

• Insert the element at that position.

• Repeat all the steps until the list is sorted.

For Example:

Consider a list of items as 7, 4, 5, 2

Step 1: There is no element on the left side of 7, so leave it as it is.

Step 2: Take 4; since 7 > 4, shift 7 right and insert 4 before it. The new list is 4, 7, 5, 2.

Step 3: Take 5; since 7 > 5, shift 7 right and insert 5 before it. The new list is 4, 5, 7, 2.

Step 4: Take 2; since 4, 5, and 7 are all greater, shift them right and insert 2 at the front. The new list is 2, 4, 5, 7.
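The shift-and-insert procedure can be sketched in Python (an illustrative implementation, not the article's own code):

```python
def insertion_sort(items):
    """Insert each element into its correct place in the sorted prefix."""
    arr = list(items)
    for i in range(1, len(arr)):        # arr[:i] is already sorted
        key = arr[i]                    # element to be inserted
        j = i - 1
        while j >= 0 and arr[j] > key:  # shift larger elements one slot right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                # drop the element into the gap
    return arr

print(insertion_sort([7, 4, 5, 2]))  # [2, 4, 5, 7]
```

Running it on the example list `7, 4, 5, 2` reproduces the four steps above.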

### Complexity of Insertion Sort

Importance of Insertion Sort:

• It is efficient for smaller data sets, but inefficient for large data sets.

• It has low space complexity: it requires only O(1) extra memory.

• It has an overall time complexity of O(n²), though it runs in O(n) on an already sorted list.

## Quick Sort

Quick sort is a commonly used sorting algorithm. It follows the divide-and-conquer approach, and partitioning can be done in one of two ways:

1. Using two extra arrays, in which
a) the first array stores the elements that are smaller than the pivot element, and

b) the second array stores the elements that are larger than the pivot element.

2. Partitioning the array in place by swapping elements.

Steps of Quick Sort:

• Choose an element of the array as the “pivot”; it is the basis of comparison.

• Take two pointers: one starting from the left end of the array and the other from the right end.

• While the value at the left pointer is less than the pivot element, move the left pointer right by 1.

• While the value at the right pointer is greater than the pivot element, move the right pointer left by 1.

• If the left pointer has not passed the right pointer, swap the values at these two locations in the array.

• Then move the left pointer right by 1 and the right pointer left by 1.

• If the two pointers have not crossed, repeat from the third step.
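The two-pointer scheme above can be sketched as an in-place, Hoare-style partition in Python. This is an illustrative version; choosing the middle element as the pivot is an assumption, not something the article specifies:

```python
def quick_sort(arr, lo=0, hi=None):
    """In-place quick sort using two pointers that scan toward each other."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:                        # zero or one element: nothing to do
        return arr
    pivot = arr[(lo + hi) // 2]         # pivot choice: middle element
    left, right = lo, hi
    while left <= right:
        while arr[left] < pivot:        # advance left past smaller values
            left += 1
        while arr[right] > pivot:       # retreat right past larger values
            right -= 1
        if left <= right:               # pointers have not crossed: swap
            arr[left], arr[right] = arr[right], arr[left]
            left += 1
            right -= 1
    quick_sort(arr, lo, right)          # recurse on the two partitions
    quick_sort(arr, left, hi)
    return arr

print(quick_sort([4, 6, 2, 1]))  # [1, 2, 4, 6]
```

The recursion sorts each partition independently, which is the divide-and-conquer part of the algorithm.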

Importance of Quick Sort:

• It is very useful for sorting arrays.

• Quick Sort is not a stable sorting algorithm.

• Its average time complexity is O(n log n); the worst case is O(n²).

• It is an in-place sorting algorithm.

## Merge Sort

Merge sort is a sorting algorithm that follows the divide-and-conquer methodology. It recursively breaks a problem down into two or more sub-problems, continuing until the sub-problems are small enough to be solved directly. The solved sub-problems are then merged together to form the sorted array.

Steps of Merge Sort:

• If the array contains only one element, return it as it is.

• Otherwise, divide the complete array into two halves, and keep dividing until it cannot be divided further.

• Merge the smaller sorted lists into a new list in sorted order.
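A minimal Python sketch of these steps (illustrative; the `<=` comparison in the merge is what makes the sort stable):

```python
def merge_sort(items):
    """Recursively split the list in half, then merge the sorted halves."""
    if len(items) <= 1:                 # base case: already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= preserves order of equal keys
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])             # append whichever half has leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```

The extra `merged` list on each level is where the O(n) space complexity noted below comes from.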

Importance of Merge Sort:

• It is best used for sorting the linked list.

• It is a stable sorting algorithm.

• It has an overall complexity of O(n log n).

• It has a space complexity of O(n).

## Heap Sort

Heap sort is a comparison-based sorting algorithm, often described as an improved version of selection sort. The list is divided into a sorted and an unsorted region, and the unsorted region is repeatedly shrunk by moving its largest element into the sorted region. Heap sort first builds a max heap; then the first (largest) element is swapped with the last element.

Steps of Heap Sort:

• Call heapify(), which builds a max heap from the list in O(n) operations.

• Swap the first element (the maximum) with the last element of the heap.

• Call shiftDown() to move the new first element down to its appropriate position.

• Repeat the steps, shrinking the heap by one each time, until the list is sorted.
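A sketch of these steps in Python, with a local `sift_down` helper standing in for the `shiftDown()` named above (an illustrative implementation, not the article's own code):

```python
def heap_sort(items):
    """Build a max heap, then repeatedly move the root to the sorted tail."""
    arr = list(items)

    def sift_down(root, end):
        # move arr[root] down until the max-heap property holds up to `end`
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1                      # pick the larger child
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    n = len(arr)
    for start in range(n // 2 - 1, -1, -1):     # heapify: O(n) overall
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]     # move the max to the sorted tail
        sift_down(0, end - 1)                   # restore the heap property
    return arr

print(heap_sort([4, 6, 2, 1]))  # [1, 2, 4, 6]
```

Because the heap lives inside the array itself, heap sort needs only O(1) extra space.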

## Frequently Asked Questions

### What is the main purpose of sorting?

Sorting is the process of arranging data into meaningful order so that you can analyze it more effectively.

### Which sorting algorithm is the fastest?

In practice, Quick Sort is usually the fastest sorting algorithm. Its running time is O(N log N) in the average case.

### What is the stability of sorting algorithms?

A stable sorting algorithm maintains the relative order of the items with equal sort keys.

### What is the best case in sorting?

The best case is the input of n elements on which the algorithm performs the minimum number of steps.