Sorting Algorithms Overview and Analysis

The document discusses various sorting algorithms. It begins by introducing sorting and defining internal and external sorting techniques. Some common internal sorting algorithms discussed include bubble sort, selection sort, insertion sort, shell sort, quicksort, mergesort, and heapsort. For each algorithm, the document outlines the sorting approach, provides pseudocode for the algorithm, and discusses time complexity. External sorting is also introduced. The document provides a detailed overview of different sorting techniques and algorithms.


Table of Contents

Unit 8 Sorting

8.1 Introduction
8.2 Types of Sorting Techniques
8.2.1 External Sorting
8.3 Bubble Sort
8.3.1 Algorithm
8.3.2 Working of Bubble Sort
8.3.3 Bubble Sort Complexity
8.3.4 Optimized Bubble Sort Algorithm
8.4 Quick Sort
8.4.1 Choosing the pivot
8.4.2 Algorithm
8.4.3 Working of Quick Sort Algorithm
8.4.4 Quicksort Complexity
8.5 Merge Sort
8.5.1 Algorithm
8.5.2 Merge Sort Complexity
8.6 Selection Sort
8.6.1 Algorithm
8.6.2 Working of Selection Sort Algorithm
8.6.3 Selection Sort Complexity
8.7 Insertion Sort
8.7.1 Algorithm
8.7.2 Working of Insertion Sort
8.7.3 Insertion Sort Complexity
8.8 Shell Sort
8.8.1 Algorithm
8.8.2 Working of Shell Sort Algorithm
8.8.3 Shell Sort Complexity
8.9 Radix Sort
8.9.1 Algorithm
8.9.2 Working of Radix Sort Algorithm
8.9.3 Radix Sort Complexity
8.10 Heap Sort as Priority Queue
8.10.1 Algorithm
8.10.2 Working of Heap Sort Algorithm
8.10.3 Heap Sort Complexity
8.11 Binary Sort
8.12 Exchange Sort


Narayan Sapkota, [Link].
9/24/22
Lecture Notes on Data Structures and Algorithms

Unit 8 Sorting
8.1 Introduction
Sorting is the technique of organizing data in either ascending or descending order, i.e., imposing some order on the data. Sort methods are very important in data structures.

Sorting can be performed on any one attribute, or a combination of attributes, present in each record. Searching is much easier and more efficient when the data is stored in sorted order. Sorting is performed according to the key value of each record. Depending on the makeup of the key, records can be sorted either numerically or alphanumerically. In numerical sorting, the records are arranged in ascending or descending order according to the numeric value of the key.

Let A be a list of n elements a1, a2, a3, ..., an in memory. Sorting A refers to the operation of rearranging the contents of A so that they are in increasing order, that is, so that a1 <= a2 <= a3 <= ... <= an. Since A has n elements, there are n! ways the contents can appear in A. These ways correspond precisely to the n! permutations of 1, 2, 3, ..., n. Accordingly, each sorting algorithm must be able to handle all n! possibilities.

Example: Suppose an array DATA contains 8 elements: DATA = [70, 30, 40, 10, 80, 20, 60, 50]

After sorting, DATA must appear in memory as: DATA = [10, 20, 30, 40, 50, 60, 70, 80]

Since DATA consists of 8 elements, there are 8! = 40320 ways that the numbers 10, 20, 30, 40, 50, 60, 70, 80 can appear in DATA.
The factors to be considered while choosing sorting techniques are:
• Programming Time
• Execution Time
• Number of Comparisons
• Memory Utilization
• Computational Complexity

8.2 Types of Sorting Techniques:


Sorting techniques are categorized into two types: internal sorting and external sorting.

Internal Sorting
The internal sorting method is used when a small amount of data has to be sorted. In this method, the data to be sorted fits entirely in main memory (RAM), so records can be accessed randomly. Examples: bubble sort, insertion sort, selection sort, shell sort, quick sort, radix sort, heap sort.

External Sorting
The external sorting method is used when a large amount of data has to be sorted. In this method, the data to be sorted is stored partly in main memory and partly in secondary memory such as disk. External sorting methods can access records only in sequential order. Examples: merge sort, multiway merge sort.


8.3 Bubble Sort


Bubble sort works by repeatedly swapping adjacent elements until they are in the intended order. It is called bubble sort because the movement of array elements resembles the movement of air bubbles in water: bubbles rise to the surface, and similarly the largest unsorted element moves to the end of the array in each iteration.

Although it is simple to use, it serves primarily as an educational tool because its real-world performance is poor. It is not suitable for large data sets. The average and worst-case complexity of bubble sort is O(n²), where n is the number of items. Bubble sort is mainly used where
• complexity does not matter
• simple and short code is preferred

Algorithm

In the algorithm given below, suppose arr is an array of n elements. The assumed swap function in the algorithm will swap the values of the given array elements.
BubbleSort (arr)
{
for i = 0 to n-2
for j = 0 to n-2-i
if (arr[j] > arr[j+1])
swap(arr[j], arr[j+1])
return arr
}
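The pseudocode above can be turned into a runnable function. Python is used here purely for illustration (the notes themselves use pseudocode); a minimal sketch:

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):           # after pass i, the i+1 largest values sit at the end
        for j in range(n - 1 - i):   # compare only within the still-unsorted prefix
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```

For example, bubble_sort([70, 30, 40, 10, 80, 20, 60, 50]) (the DATA array from the introduction) yields [10, 20, 30, 40, 50, 60, 70, 80].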

Working of Bubble Sort

First Iteration (Compare and Swap)


• Starting from the first index, compare the first and the second elements.
• If the first element is greater than the second element, they are swapped.
• Compare the second and the third elements. Swap them if they are not in order.
• Repeat the above process until the last element.


Remaining Iteration
The same process goes on for the remaining iterations. After each iteration, the largest element
among the unsorted elements is placed at the end. In each iteration, the comparison takes place up
to the last unsorted element.


The array is sorted when all the unsorted elements are placed at their correct positions.

Bubble Sort Complexity

Time Complexity

Case Time Complexity


Best Case O(n)
Average Case O(n²)
Worst Case O(n²)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted.

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly
ascending and not properly descending.

Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order.
That means suppose you have to sort the array elements in ascending order, but its elements are in
descending order.

Space Complexity
The space complexity of bubble sort is O(1), because only a single extra variable is required for swapping, and the sort is stable. Optimized bubble sort also runs in O(1) space; the two extra variables it uses are still a constant amount of memory.

Optimized Bubble Sort Algorithm

In the basic bubble sort algorithm, comparisons are made even when the array is already sorted, which increases the execution time. To solve this, we can use an extra variable swapped. It is set to true if a swap occurs during a pass; otherwise, it remains false. If, after an iteration, no swap was required, the value of swapped will be false, meaning the elements are already sorted and no further iterations are needed. This reduces the execution time and optimizes bubble sort.


Algorithm
bubbleSort(array)
n = length(array)
repeat
swapped = false
for i = 1 to n - 1
if array[i - 1] > array[i], then
swap(array[i - 1], array[i])
swapped = true
end if
end for
n=n-1
until not swapped
end bubbleSort
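A runnable sketch of the optimized algorithm above, with the swapped flag providing the early exit (Python for illustration):

```python
def bubble_sort_optimized(array):
    """Bubble sort that stops as soon as a full pass performs no swap."""
    n = len(array)
    swapped = True
    while swapped:
        swapped = False
        for i in range(1, n):
            if array[i - 1] > array[i]:
                array[i - 1], array[i] = array[i], array[i - 1]
                swapped = True
        n -= 1          # the largest remaining value is now in its final place
    return array
```

On an already sorted input the inner loop makes one pass, finds no swap, and the function returns after O(n) work, which is where the best-case O(n) bound comes from.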

8.4 Quick Sort


The quick sort algorithm follows the principle of divide and conquer. Divide and conquer is a technique of breaking a problem down into subproblems, solving the subproblems, and then combining the results to solve the original problem.

Divide: In divide, first pick a pivot element. After that, partition or rearrange the array into two sub-
arrays such that each element in the left sub-array is less than or equal to the pivot element and each
element in the right sub-array is larger than the pivot element.

Conquer: Recursively, sort two subarrays with quicksort.

Combine: Combine the already sorted subarrays.

Quicksort picks an element as pivot, and then it partitions the given array around the picked pivot
element. In quick sort, a large array is divided into two arrays in which one holds values that are smaller
than the specified value (Pivot), and another array holds the values that are greater than the pivot.

After that, the left and right sub-arrays are also partitioned using the same approach. This continues until only a single element remains in each sub-array.


Choosing the pivot

Picking a good pivot is necessary for a fast implementation of quicksort. However, it is difficult to determine a good pivot in advance. Some of the ways of choosing a pivot are as follows -
• Pivot can be random, i.e. select the random pivot from the given array.
• Pivot can either be the rightmost element or the leftmost element of the given array.
• Select median as the pivot element.

Algorithm

QUICKSORT (array A, start, end)


{
if (start < end)
{
p = partition(A, start, end)
QUICKSORT (A, start, p - 1)
QUICKSORT (A, p + 1, end)
}
}
Partition Algorithm
The partition algorithm rearranges the sub-arrays in place.

PARTITION (array A, start, end)


{
pivot = A[end]
i = start - 1
for j = start to end - 1
{
if (A[j] < pivot)
{
i = i + 1
swap A[i] with A[j]
}
}
swap A[i+1] with A[end]
return i+1
}
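The two routines above translate directly into runnable code; a minimal Python sketch of the same partition scheme with the last element as pivot:

```python
def partition(A, start, end):
    """Place the pivot A[end] at its final sorted position and return that index."""
    pivot = A[end]
    i = start - 1                     # boundary of the "less than pivot" region
    for j in range(start, end):
        if A[j] < pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[end] = A[end], A[i + 1]
    return i + 1

def quicksort(A, start, end):
    """Recursively sort A[start..end] in place."""
    if start < end:
        p = partition(A, start, end)
        quicksort(A, start, p - 1)
        quicksort(A, p + 1, end)
    return A
```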

Working of Quick Sort Algorithm

Let the elements of the array be [24, 9, 29, 14, 19, 27].

In the given array, we consider the leftmost element as pivot. So, in this case, a[left] = 24, a[right] = 27 and a[pivot] = 24. Since the pivot is at the left, the algorithm starts from the right and moves towards the left.


Now, a[pivot] < a[right], so the algorithm moves one position towards the left:

Now, a[left] = 24, a[right] = 19, and a[pivot] = 24. Because a[pivot] > a[right], the algorithm swaps a[pivot] with a[right], and the pivot moves to the right:

Now, a[left] = 19, a[right] = 24, and a[pivot] = 24. Since the pivot is at the right, the algorithm starts from the left and moves to the right. As a[pivot] > a[left], the algorithm moves one position to the right:

Now, a[left] = 9, a[right] = 24, and a[pivot] = 24. As a[pivot] > a[left], the algorithm again moves one position to the right:


Now, a[left] = 29, a[right] = 24, and a[pivot] = 24. As a[pivot] < a[left], swap a[pivot] and a[left]; the pivot is now at the left:

Since the pivot is at the left, the algorithm starts from the right and moves to the left. Now, a[left] = 24, a[right] = 29, and a[pivot] = 24. As a[pivot] < a[right], the algorithm moves one position to the left:

Now, a[pivot] = 24, a[left] = 24, and a[right] = 14. As a[pivot] > a[right], swap a[pivot] and a[right]; the pivot is now at the right:

Now, a[pivot] = 24, a[left] = 14, and a[right] = 24. The pivot is at the right, so the algorithm starts from the left and moves to the right.

Now, a[pivot] = 24, a[left] = 24, and a[right] = 24, i.e., pivot, left, and right all point to the same element. This marks the termination of the procedure. Element 24, the pivot, is now placed at its exact position: elements to the right of 24 are greater than it, and elements to the left of 24 are smaller than it.


Now, in a similar manner, the quick sort algorithm is applied separately to the left and right sub-arrays. After sorting is done, the array will be [9, 14, 19, 24, 27, 29].

Quicksort Complexity

Time Complexity

Case Time Complexity


Best Case O(n*log n)
Average Case O(n*log n)
Worst Case O(n²)

Best Case Complexity - In quicksort, the best case occurs when the pivot element is the middle element or near the middle element. The best-case time complexity of quicksort is O(n*log n).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly ascending and not properly descending. The average case time complexity of quicksort is O(n*log n).

Worst Case Complexity - In quick sort, the worst case occurs when the pivot element is either the greatest or the smallest element. For example, if the pivot is always chosen as the last element of the array, the worst case occurs when the given array is already sorted in ascending or descending order. The worst-case time complexity of quicksort is O(n²). Though the worst-case complexity of quicksort is higher than that of other sorting algorithms such as merge sort and heap sort, it is still faster in practice. The worst case rarely occurs, because quicksort can be implemented in different ways by changing the choice of pivot; the worst case can be avoided by choosing a good pivot element.

Space Complexity
The space complexity of quicksort is O(log n) on average, for the recursion stack, and it is not stable.

Advantages of Quick Sort


• It is one of the fastest sorting techniques in practice.
• Its efficiency is also relatively good.
• It requires only a small amount of memory.

Disadvantages
• It is a somewhat complex method of sorting.
• It is a little harder to implement than other sorting methods.
• It does not perform well on small groups of elements.


8.5 Merge Sort


Merge sort is similar to the quick sort algorithm in that it uses the divide and conquer approach to sort the elements. It is one of the most popular and efficient sorting algorithms. It divides the given list into two equal halves, calls itself on the two halves, and then merges the two sorted halves. We have to define the merge() function to perform the merging.

The sub-lists are divided again and again into halves until lists of one element remain. Then we combine pairs of one-element lists into two-element lists, sorting them in the process. The sorted two-element pairs are merged into four-element lists, and so on, until we get the fully sorted list.

Suppose we had to sort an array A. A subproblem would be to sort a sub-section of this array starting
at index p and ending at index r, denoted as A[p..r].

Divide
If q is the half-way point between p and r, then we can split the subarray A[p..r] into two arrays A[p..q] and A[q+1..r].

Conquer
In the conquer step, we try to sort both the subarrays A[p..q] and A[q+1..r]. If we haven't yet reached the base case, we again divide both these subarrays and try to sort them.

Combine
When the conquer step reaches the base case and we get two sorted subarrays A[p..q] and A[q+1..r] for array A[p..r], we combine the results by creating a sorted array A[p..r] from the two sorted subarrays.

Algorithm

In the following algorithm, arr is the given array, beg is the starting element, and end is the last
element of the array.

MERGE_SORT(arr, beg, end)


if beg < end
set mid = (beg + end)/2
MERGE_SORT(arr, beg, mid)
MERGE_SORT(arr, mid + 1, end)
MERGE (arr, beg, mid, end)
end of if
END MERGE_SORT

The important part of the merge sort is the MERGE function. This function performs the merging of
two sorted sub-arrays that are A[beg…mid] and A[mid+1…end], to build one sorted
array A[beg…end]. So, the inputs of the MERGE function are A[], beg, mid, and end.

Merge Function Working Step by Step


A lot is happening in this function, so let's take an example to see how this would work.


Merging two consecutive subarrays of array

The array A[0..5] contains two sorted subarrays A[0..3] and A[4..5]. Let us see how the merge function
will merge the two arrays.

Step 1: Create duplicate copies of sub-arrays to be sorted.


L ← A[p..q] and M ← A[q+1..r]

Step 2: Maintain current index of sub-arrays and main array.

Step 3: Until we reach the end of either L or M, pick the smaller among the elements of L and M and place it in the correct position at A[p..r]


Comparing individual elements of sorted subarrays until we reach end of one

Step 4: When we run out of elements in either L or M, pick up the remaining elements and put them in A[p..r]


Copy the remaining elements from the first array to main subarray

Copy remaining elements of second array to main subarray

This step would only have been needed if the size of M were greater than that of L. At the end of the merge function, the subarray A[p..r] is sorted.
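The four merge steps above, together with the MERGE_SORT recursion, can be sketched as runnable code (Python for illustration):

```python
def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] back into A[p..r]."""
    L = A[p:q + 1]          # Step 1: duplicate copies of the two runs
    M = A[q + 1:r + 1]
    i = j = 0               # Step 2: current indices into L, M, and A
    k = p
    while i < len(L) and j < len(M):   # Step 3: pick the smaller element each time
        if L[i] <= M[j]:               # <= keeps the sort stable
            A[k] = L[i]; i += 1
        else:
            A[k] = M[j]; j += 1
        k += 1
    while i < len(L):       # Step 4: copy any leftovers from L
        A[k] = L[i]; i += 1; k += 1
    while j < len(M):       # ...or from M
        A[k] = M[j]; j += 1; k += 1

def merge_sort(A, beg, end):
    """Recursively split A[beg..end] in half, sort each half, then merge."""
    if beg < end:
        mid = (beg + end) // 2
        merge_sort(A, beg, mid)
        merge_sort(A, mid + 1, end)
        merge(A, beg, mid, end)
    return A
```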


Merge sort complexity

Time Complexity

Case Time Complexity


Best Case O(n*log n)
Average Case O(n*log n)
Worst Case O(n*log n)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted. The best-case time complexity of merge sort is O(n*log n).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly ascending and not properly descending. The average case time complexity of merge sort is O(n*log n).


Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order. That means suppose you have to sort the array elements in ascending order, but its elements are in descending order. The worst-case time complexity of merge sort is O(n*log n).

Space Complexity
The space complexity of merge sort is O(n), because the merge step requires an auxiliary array to hold the elements being merged. Merge sort is stable.

Advantages
• Merge sort is a stable sort.
• It is easy to understand.
• It gives consistently good performance.

Disadvantages
• It requires extra memory space.
• Elements must be copied to a temporary array.
• It requires an additional array.
• It is slower than in-place sorts for small inputs.

8.6 Selection Sort


In selection sort, the smallest value among the unsorted elements of the array is selected in every pass and moved to its appropriate position in the array. It is also one of the simplest algorithms. It is an in-place comparison sorting algorithm. In this algorithm, the array is divided into two parts: the sorted part and the unsorted part. Initially, the sorted part of the array is empty, and the unsorted part is the given array. The sorted part is placed at the left, while the unsorted part is placed at the right.

In selection sort, the smallest element is selected from the unsorted array and placed at the first position. After that, the second smallest element is selected and placed in the second position. The process continues until the array is entirely sorted.

The average and worst-case complexity of selection sort is O(n²), where n is the number of items. Due to this, it is not suitable for large data sets. Selection sort is generally used when

• A small array is to be sorted


• Swapping cost doesn't matter
• It is compulsory to check all elements

Algorithm

SELECTION SORT(arr, n)
Step 1: Repeat Steps 2 and 3 for i = 0 to n-1
Step 2: Call SMALLEST(arr, i, n, pos)
Step 3: Swap arr[i] with arr[pos]
[END OF LOOP]
Step 4: EXIT


SMALLEST (arr, i, n, pos)


Step 1: [INITIALIZE] SET SMALL = arr[i]
Step 2: [INITIALIZE] SET pos = i
Step 3: Repeat for j = i+1 to n
if (SMALL > arr[j])
SET SMALL = arr[j]
SET pos = j
[END OF if]
[END OF LOOP]
Step 4: RETURN pos
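The SELECTION SORT and SMALLEST routines above can be sketched as runnable code (Python for illustration):

```python
def smallest(arr, i, n):
    """Return the position of the smallest element in arr[i..n-1]."""
    pos = i
    for j in range(i + 1, n):
        if arr[j] < arr[pos]:
            pos = j
    return pos

def selection_sort(arr):
    """Repeatedly swap the smallest unsorted element into position i."""
    n = len(arr)
    for i in range(n - 1):
        pos = smallest(arr, i, n)
        arr[i], arr[pos] = arr[pos], arr[i]
    return arr
```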

Working of Selection Sort Algorithm

Now, for the first position in the sorted array, the entire array is scanned sequentially. At present, 12 is stored at the first position; after searching the entire array, it is found that 8 is the smallest value.

So, swap 12 with 8. After the first iteration, 8 will appear at the first position in the sorted array.

For the second position, where 29 is currently stored, we again sequentially scan the rest of the unsorted array. After scanning, we find that 12 is the second lowest element in the array and should appear at the second position.

Now, swap 29 with 12. After the second iteration, 12 will appear at the second position in the sorted
array. So, after two iterations, the two smallest values are placed at the beginning in a sorted way.

The same process is applied to the rest of the array elements. Now, we are showing a pictorial
representation of the entire sorting process.


Selection Sort Complexity

Time Complexity

Case Time Complexity


Best Case O(n²)
Average Case O(n²)
Worst Case O(n²)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted. The best-case time complexity of selection sort is O(n²).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly ascending and not properly descending. The average case time complexity of selection sort is O(n²).

Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order. That means suppose you have to sort the array elements in ascending order, but its elements are in descending order. The worst-case time complexity of selection sort is O(n²).


Space Complexity
The space complexity of selection sort is O(1), because only a single extra variable is required for swapping. Note that the typical implementation, which swaps elements across the array, is not stable.

8.7 Insertion Sort


Insertion sort works like sorting playing cards in your hands. It is assumed that the first card is already sorted; then we select an unsorted card. If the selected card is greater than the first card, it is placed on the right side; otherwise, it is placed on the left side. Similarly, all unsorted cards are taken and put in their exact place.

The same approach is applied in insertion sort: take one element at a time and insert it into its correct position within the already-sorted part of the array. Although it is simple to use, it is not appropriate for large data sets, as the time complexity of insertion sort in the average and worst case is O(n²), where n is the number of items. Insertion sort is less efficient than other sorting algorithms like heap sort, quick sort, and merge sort.

Insertion sort is typically about twice as fast as bubble sort, because it performs fewer comparisons: scanning stops as soon as an element smaller than the value being inserted is found, which means all earlier elements are already smaller. Insertion sort is a good choice for small arrays and for nearly sorted data.

Algorithm

Insertion_sort (ARR, SIZE)

i = 1
While (i < SIZE)
    Temp = ARR[i]
    j = i - 1
    While (j >= 0 and Temp < ARR[j])
        ARR[j+1] = ARR[j]
        j = j - 1
    End While
    ARR[j+1] = Temp
    Print ARR after ith pass
    i = i + 1
End While
Print number of passes i-1
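The same algorithm as a runnable Python sketch (the printing of intermediate passes is omitted):

```python
def insertion_sort(arr):
    """Insert each element into its correct position within the sorted prefix."""
    for i in range(1, len(arr)):
        temp = arr[i]             # the value being inserted
        j = i - 1
        while j >= 0 and arr[j] > temp:
            arr[j + 1] = arr[j]   # shift larger elements one position to the right
            j -= 1
        arr[j + 1] = temp
    return arr
```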

Working of Insertion Sort


The first element in the array is assumed to be sorted. Take the second element and store it separately in key. Compare key with the first element. If the first element is greater than key, then key is placed in front of the first element.

Now, the first two elements are sorted. Take the third element and compare it with the elements to its left. Place it just behind the element smaller than it. If there is no smaller element, place it at the beginning of the array.

Place 1 at the beginning

Similarly, place every unsorted element at its correct position.


Insertion Sort Complexity

Time Complexity

Case Time Complexity


Best Case O(n)
Average Case O(n²)
Worst Case O(n²)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted. The
best-case time complexity of insertion sort is O(n).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly ascending and not properly descending. The average case time complexity of insertion sort is O(n²).

Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order. That means suppose you have to sort the array elements in ascending order, but its elements are in descending order. The worst-case time complexity of insertion sort is O(n²).
Space Complexity
The space complexity of insertion sort is O(1), because only a single extra variable is required to hold the value being inserted, and the sort is stable.


Advantages of Insertion Sort


• It is a simple sorting algorithm that sorts the elements by considering one item at a
time, and its implementation is straightforward.
• It is efficient for small data sets and for data sets that are already substantially sorted.
• It does not change the relative order of elements with equal keys (it is stable).
• It avoids unnecessary passes through the array.
• It requires only a constant amount of extra memory.

Disadvantages
• It is less efficient on lists containing a large number of elements.
• As the number of elements increases, performance degrades quadratically.

8.8 Shell Sort

Shell sort is the generalization of insertion sort, which overcomes the drawbacks of insertion sort by
comparing elements separated by a gap of several positions.

It is a sorting algorithm that is an extended version of insertion sort. Shell sort has improved the
average time complexity of insertion sort. As similar to insertion sort, it is a comparison-based and in-
place sorting algorithm. Shell sort is efficient for medium-sized data sets.

In insertion sort, at a time, elements can be moved ahead by one position only. To move an element
to a far-away position, many movements are required that increase the algorithm's execution time.
But shell sort overcomes this drawback of insertion sort. It allows the movement and swapping of far-
away elements as well.

This algorithm first sorts the elements that are far away from each other, then it subsequently reduces
the gap between them. This gap is called as interval.

Interval calculation in Shell Sort


Interval in shell sort can be reduced by using an optimal sequence. Some optimal sequences that can
be used in shell sort are:
• Shell’s original sequence: N/2, N/4, …, 1
• Knuth’s sequence: 1, 4, 13, 40, …, (3^k − 1) / 2
• Sedgewick’s sequence: 1, 8, 23, 77, 281, 1073, 4193, 16577, …, i.e., 4^(k+1) + 3·2^k + 1
• Hibbard’s sequence: 1, 3, 7, 15, 31, 63, 127, 255, 511, …, i.e., 2^k − 1
• Papernov & Stasevich sequence: 1, 3, 5, 9, 17, 33, 65, …, i.e., 2^k + 1
• Pratt’s sequence: 1, 2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 27, 36, 54, 81, … (numbers of the form 2^p · 3^q)

The performance of the shell sort depends on the type of sequence used for a given input array.
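As an illustrative sketch (not part of the notes' pseudocode), Knuth's sequence can be generated iteratively with the recurrence h = 3h + 1; stopping the growth at the array size n is one common convention:

```python
def knuth_gaps(n):
    """Generate Knuth's gap sequence 1, 4, 13, 40, ... below n, largest gap first."""
    gaps = []
    h = 1
    while h < n:
        gaps.append(h)
        h = 3 * h + 1  # next term of (3^k - 1) / 2
    return gaps[::-1]  # shell sort applies the gaps in decreasing order
```

For an array of 100 elements this yields the gaps [40, 13, 4, 1].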

Algorithm

ShellSort(a, n)    // 'a' is the given array, 'n' is the size of the array
    for (interval = n/2; interval > 0; interval /= 2)
        for (i = interval; i < n; i += 1)
            temp = a[i];
            for (j = i; j >= interval && a[j - interval] > temp; j -= interval)
                a[j] = a[j - interval];
            a[j] = temp;
End ShellSort
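The pseudocode above translates directly into a runnable sketch using Shell's original sequence N/2, N/4, …, 1:

```python
def shell_sort(a):
    """Sort list a in place using Shell's original gap sequence n/2, n/4, ..., 1."""
    n = len(a)
    interval = n // 2
    while interval > 0:
        # Gapped insertion sort: each slice a[k::interval] becomes sorted.
        for i in range(interval, n):
            temp = a[i]
            j = i
            while j >= interval and a[j - interval] > temp:
                a[j] = a[j - interval]
                j -= interval
            a[j] = temp
        interval //= 2
    return a
```

Running shell_sort([33, 31, 40, 8, 12, 17, 25, 42]) returns [8, 12, 17, 25, 31, 33, 40, 42], matching the worked example in this section.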

Working of Shell Sort Algorithm

Let the elements of the array be {33, 31, 40, 8, 12, 17, 25, 42}.

We will use Shell's original sequence, i.e., N/2, N/4, …, 1, as the intervals. In the first loop, n is
equal to 8 (the size of the array), so the elements lie at an interval of 4 (n/2 = 4). Elements will be
compared and swapped if they are not in order.

Here, in the first loop, the element at the 0th position will be compared with the element at
4th position. If the 0th element is greater, it will be swapped with the element at 4th position.
Otherwise, it remains the same. This process will continue for the remaining elements.

At the interval of 4, the sublists are {33, 12}, {31, 17}, {40, 25}, {8, 42}.

Now, we have to compare the values in every sublist and swap them, if required, in the original
array. After comparing and swapping, the updated array is {12, 17, 25, 8, 33, 31, 40, 42}.


In the second loop, the elements lie at an interval of 2 (n/4 = 2), where n = 8. With an interval of 2,
two sublists are generated: {12, 25, 33, 40} and {17, 8, 31, 42}.

Now, we again compare the values in every sublist and swap them, if required, in the original array.
After comparing and swapping, the updated array is {12, 8, 25, 17, 33, 31, 40, 42}.

In the third loop, the elements lie at an interval of 1 (n/8 = 1), where n = 8. At last, we use the
interval of 1 to sort the rest of the array; in this step, shell sort behaves exactly like insertion sort,
producing the sorted array {8, 12, 17, 25, 31, 33, 40, 42}.


Shell Sort Complexity

Time Complexity

Case Time Complexity


Best Case O(n log n)
Average Case O(n·(log n)²)
Worst Case O(n²)

Best Case Complexity - It occurs when there is no sorting required, i.e., the array is already sorted.
The best-case time complexity of Shell sort is O(n log n).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly
ascending and not properly descending. The average case time complexity of Shell sort is O(n·(log n)²).


Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order.
That means suppose you have to sort the array elements in ascending order, but its elements are in
descending order. The worst-case time complexity of Shell sort is O(n²).

Space Complexity
The space complexity of Shell sort is O(1), and the algorithm is not stable.

8.9 Radix Sort


Radix sort is a linear-time sorting algorithm used for integers. In radix sort, digit-by-digit sorting is
performed, starting from the least significant digit and moving to the most significant digit.

The process of radix sort is similar to sorting students' names in alphabetical order. In that case,
there are 26 radixes, one for each letter of the English alphabet. In the first pass, the names of the
students are grouped according to the ascending order of the first letter of their names. In the
second pass, their names are grouped according to the ascending order of the second letter, and the
process continues until the list is sorted.

Algorithm

radixSort(arr)
    max = largest element in the given array
    d = number of digits in the largest element (or, max)
    Now, create 10 buckets, one for each digit 0 - 9
    for i -> 0 to d - 1
        sort the array elements using counting sort (or any stable sort)
        according to the digits at the ith place
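A concrete sketch of this scheme, using counting sort as the stable per-digit pass (the function names are illustrative):

```python
def counting_sort_by_digit(a, exp):
    """Stable counting sort of a by the digit at place value exp (1, 10, 100, ...)."""
    count = [0] * 10
    for x in a:
        count[(x // exp) % 10] += 1
    for d in range(1, 10):          # prefix sums give each digit's final slot range
        count[d] += count[d - 1]
    out = [0] * len(a)
    for x in reversed(a):           # reverse scan keeps equal digits in order (stability)
        count[(x // exp) % 10] -= 1
        out[count[(x // exp) % 10]] = x
    return out

def radix_sort(a):
    """Sort non-negative integers digit by digit, least significant digit first."""
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:        # one stable pass per digit of the largest element
        a = counting_sort_by_digit(a, exp)
        exp *= 10
    return a
```

For example, radix_sort([121, 432, 564, 23, 1, 45, 788]) returns [1, 23, 45, 121, 432, 564, 788].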

Working of Radix Sort Algorithm

First, we have to find the largest element (say max) in the given array. Let 'x' be the number of
digits in max; 'x' is calculated because we need to go through the significant places of all elements.
After that, go through the significant places one by one, using any stable sorting algorithm to sort
the digits at each significant place.

In the given array, the largest element is 736, which has 3 digits. So, the loop will run three times
(i.e., up to the hundreds place), meaning three passes are required to sort the array. First, sort the
elements on the basis of their units-place digits (i.e., x = 0). Here, we use the counting sort
algorithm to sort the elements.

Pass 1:
In the first pass, the list is sorted on the basis of the digits at 0's place.


After the first pass, the array elements are –

Pass 2:
In this pass, the list is sorted on the basis of the next significant digits (i.e., the digits at the tens place).

After the second pass, the array elements are -


Pass 3:
In this pass, the list is sorted on the basis of the next significant digits (i.e., the digits at the hundreds place).

After the third pass, the array elements are –

Radix Sort Complexity

Time Complexity

Case Time Complexity


Best Case Ω(n+k)
Average Case Θ(nk)
Worst Case O(nk)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted. The
best-case time complexity of Radix sort is Ω(n+k).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly
ascending and not properly descending. The average case time complexity of Radix sort is Θ(nk).

Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order.
That means suppose you have to sort the array elements in ascending order, but its elements are in
descending order. The worst-case time complexity of Radix sort is O(nk).


Radix sort is a non-comparative sorting algorithm. Its linear O(nk) running time can beat the
O(n log n) lower bound of comparison-based algorithms when the number of digits k is small.

Space Complexity
The space complexity of Radix sort is O(n + k), and the algorithm is stable.

8.10 Heap Sort as Priority Queue


Heap sort processes the elements by creating the min-heap or max-heap using the elements of the
given array. Min-heap or max-heap represents the ordering of array in which the root element
represents the minimum or maximum element of the array.

Heap sort basically performs two main operations -


• Build a heap H, using the elements of the array.
• Repeatedly delete the root element of the heap formed in the first phase.
Before knowing more about the heap sort, let's first see a brief description of Heap.

What is a heap?
A heap is a complete binary tree. A binary tree is a tree in which each node can have at most two
children, and a complete binary tree is a binary tree in which all levels except possibly the last are
completely filled, and the nodes of the last level are as far left as possible.
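Because a complete binary tree can be stored compactly in an array, the parent/child relations used by heap algorithms reduce to index arithmetic. A minimal sketch with 1-based indexing (matching the pseudocode in this section):

```python
def left(i):
    """Index of the left child of node i in a 1-based array heap."""
    return 2 * i

def right(i):
    """Index of the right child of node i in a 1-based array heap."""
    return 2 * i + 1

def parent(i):
    """Index of the parent of node i in a 1-based array heap."""
    return i // 2
```

So the root is at index 1, its children at 2 and 3, and so on; no pointers are needed.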

What is heap sort?


Heapsort is a popular and efficient sorting algorithm. The concept of heap sort is to eliminate the
elements one by one from the heap part of the list, and then insert them into the sorted part of the
list. Heapsort is an in-place sorting algorithm.

Algorithm

HeapSort(arr)
    BuildMaxHeap(arr)
    for i = length(arr) down to 2
        swap arr[1] with arr[i]
        heap_size[arr] = heap_size[arr] - 1
        MaxHeapify(arr, 1)
End

BuildMaxHeap(arr)
    heap_size[arr] = length(arr)
    for i = length(arr)/2 down to 1
        MaxHeapify(arr, i)
End

MaxHeapify(arr, i)
    L = left(i)
    R = right(i)
    if L <= heap_size[arr] and arr[L] > arr[i]
        largest = L
    else
        largest = i
    if R <= heap_size[arr] and arr[R] > arr[largest]
        largest = R
    if largest != i
        swap arr[i] with arr[largest]
        MaxHeapify(arr, largest)
End
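The same procedure can be written as a runnable sketch; this version uses 0-based indexing, so the children of node i sit at 2i+1 and 2i+2:

```python
def max_heapify(a, heap_size, i):
    """Sift a[i] down until the subtree rooted at i satisfies the max-heap property."""
    largest = i
    l, r = 2 * i + 1, 2 * i + 2
    if l < heap_size and a[l] > a[largest]:
        largest = l
    if r < heap_size and a[r] > a[largest]:
        largest = r
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, heap_size, largest)

def heap_sort(a):
    """Sort list a in place: build a max-heap, then repeatedly move the root to the end."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build the max-heap bottom-up
        max_heapify(a, n, i)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # move the current maximum behind the heap
        max_heapify(a, end, 0)            # restore the heap on the shrunken prefix
    return a
```

Running heap_sort([11, 9, 76, 22, 89, 81, 54, 14]) returns [9, 11, 14, 22, 54, 76, 81, 89].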

Working of Heap Sort Algorithm

In heap sort, two phases are involved in sorting the elements:
• The first phase creates a heap by adjusting the elements of the array.
• After the heap is created, the root element is removed repeatedly by shifting it to the end
of the array, and the heap structure is then restored for the remaining elements.

First, we have to construct a heap from the given array and convert it into max heap.

After converting the given heap into max heap, the array elements are -


Next, we have to delete the root element (89) from the max heap. To delete this node, we have to
swap it with the last node, i.e. (11). After deleting the root element, we again have to heapify it to
convert it into max heap.

After swapping the array element 89 with 11, and converting the heap into max-heap, the elements
of array are -

In the next step, again, we have to delete the root element (81) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (54). After deleting the root element, we again have
to heapify it to convert it into max heap.

After swapping the array element 81 with 54 and converting the heap into max-heap, the elements of
array are -


In the next step, we have to delete the root element (76) from the max heap again. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again have
to heapify it to convert it into max heap.

After swapping the array element 76 with 9 and converting the heap into max-heap, the elements of
array are -

In the next step, again we have to delete the root element (54) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (14). After deleting the root element, we again have
to heapify it to convert it into max heap.

After swapping the array element 54 with 14 and converting the heap into max-heap, the elements of
array are -


In the next step, again we have to delete the root element (22) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (11). After deleting the root element, we again have
to heapify it to convert it into max heap.

After swapping the array element 22 with 11 and converting the heap into max-heap, the elements of
array are -

In the next step, again we have to delete the root element (14) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again have
to heapify it to convert it into max heap.

After swapping the array element 14 with 9 and converting the heap into max-heap, the elements of
array are -

In the next step, again we have to delete the root element (11) from the max heap. To delete this
node, we have to swap it with the last node, i.e. (9). After deleting the root element, we again have
to heapify it to convert it into max heap.


After swapping the array element 11 with 9, the elements of array are -

Now, heap has only one element left. After deleting it, heap will be empty.

After completion of sorting, the array elements are {9, 11, 14, 22, 54, 76, 81, 89}.

Heap Sort Complexity

Time Complexity

Case Time Complexity


Best Case O(n log n)
Average Case O(n log n)
Worst Case O(n log n)

Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already sorted. The
best-case time complexity of heap sort is O(n log n).

Average Case Complexity - It occurs when the array elements are in jumbled order that is not properly
ascending and not properly descending. The average case time complexity of heap sort is O(n log n).

Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order.
That means suppose you have to sort the array elements in ascending order, but its elements are in
descending order. The worst-case time complexity of heap sort is O(n log n).

The time complexity of heap sort is O(n log n) in all three cases (best case, average case, and worst
case). This is because a complete binary tree with n elements has height O(log n), and each of the n
root deletions takes at most O(log n) time to restore the heap.


Space Complexity
The space complexity of Heap sort is O(1), and the algorithm is not stable.

8.11 Binary Sort

8.12 Exchange Sort

