Quick Sort
Quick Sort is a sorting algorithm that uses the divide-and-conquer
strategy. It selects a pivot element, partitions the array so that values
smaller than the pivot end up on one side and larger values on the
other, and then repeats these steps on each part until the array is sorted.
It is well suited to sorting large data sets.
It was invented by the British computer scientist C.A.R. Hoare in 1960.
The quick sort method can be summarized in three steps:
1. Choose one element in the list to be the pivot.
2. Partition the array using a partitioning algorithm, such that all the
elements smaller than the pivot element are on the left-hand side of
the pivot element and all the elements greater than the pivot
element are on the right-hand side of the pivot element.
3. Repeat this process recursively for both subarrays until each
subarray contains at most one element.
Note:
The algorithm will work correctly no matter which element you
choose as the pivot.
The main idea is to find the “right” position for the pivot element P.
After each “pass”, the pivot element, P, should be “in place”.
Eventually, the elements are sorted since each pass puts at least
one element (i.e., P) into its final position.
How to select the pivot element?
There are many different ways of selecting pivots:
1. Always pick the first element as the pivot.
2. Always pick the last element as the pivot.
3. Pick a random element as the pivot.
4. Pick the middle element as the pivot.
Time Complexity
Quick Sort is known for its strong average-case performance, which
makes it a popular choice for sorting large datasets.
Average Case: O(n*log(n))
Worst Case: O(n^2)
Best Case: O(n*log(n))
The average-case time complexity of Quick Sort is O(n*log(n)). It is
asymptotically faster than O(n^2) algorithms such as Bubble Sort, and in
practice it often outperforms Merge Sort as well, thanks to its smaller
constant factors and in-place partitioning.
However, the worst-case time complexity is O(n^2), which occurs when the
pivot choice consistently results in unbalanced partitions. To mitigate
this, randomized pivot selection is commonly used.
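As a minimal sketch of this mitigation (the function names here are illustrative and not part of the listing below), the pivot can be chosen uniformly at random and swapped to the end, so that an ordinary "last element as pivot" partition can be reused unchanged:

```python
import random

def lomuto_partition(a, left, right):
    """Standard last-element partition: returns the pivot's final index."""
    pivot = a[right]
    i = left - 1                      # boundary of the "less than pivot" region
    for j in range(left, right):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[right] = a[right], a[i + 1]   # put the pivot in place
    return i + 1

def randomized_quicksort(a, left, right):
    """Quick sort with a uniformly random pivot choice."""
    if left < right:
        # Pick a random index and move that element to the end.
        r = random.randint(left, right)
        a[r], a[right] = a[right], a[r]
        pos = lomuto_partition(a, left, right)
        randomized_quicksort(a, left, pos - 1)
        randomized_quicksort(a, pos + 1, right)

data = [9, 4, 6, 1, 5, 3, 7, 2, 8]
randomized_quicksort(data, 0, len(data) - 1)
print(data)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Randomization does not change the worst-case bound, but it makes the degenerate O(n^2) behavior vanishingly unlikely for any fixed input.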
Implementation of Quick Sort in Python
def quicksort(a, left, right):
    if left < right:
        pos = partition(a, left, right)   # pivot lands at index pos
        quicksort(a, left, pos - 1)       # sort the elements smaller than the pivot
        quicksort(a, pos + 1, right)      # sort the elements greater than the pivot

def partition(a, left, right):
    pivot = a[right]                  # choose the last element as the pivot
    i = left - 1                      # boundary of the "less than pivot" region
    for j in range(left, right):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[right] = a[right], a[i + 1]   # place the pivot in its final position
    return i + 1

a = [9, 4, 6, 1, 5, 3, 7, 2, 8]
print("Original List : ", a)
quicksort(a, 0, len(a) - 1)
print("Sorted List : ", a)
Advantages of Quick Sort
Fast and efficient on average
Easy to implement
Efficient for large datasets
Quick sort is an in-place algorithm: it sorts by swapping elements
within the array itself, so it needs no extra memory beyond the
recursion stack.
Disadvantages of Quick Sort
It is not stable: equal elements may not keep their relative order.
Sensitive to the choice of the pivot.
A deeply recursive implementation can overflow the call stack.
Often slower than simpler algorithms on small datasets.
Vulnerable to its O(n^2) worst case.
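The stack overflow risk noted above can be avoided by replacing recursion with an explicit stack of pending ranges. A minimal sketch, assuming the standard last-element partition (written out again here so the block is self-contained):

```python
def lomuto_partition(a, left, right):
    """Standard last-element partition: returns the pivot's final index."""
    pivot = a[right]
    i = left - 1
    for j in range(left, right):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[right] = a[right], a[i + 1]
    return i + 1

def quicksort_iterative(a):
    """Quick sort driven by an explicit stack of (left, right) ranges."""
    stack = [(0, len(a) - 1)]
    while stack:
        left, right = stack.pop()
        if left < right:
            pos = lomuto_partition(a, left, right)
            stack.append((left, pos - 1))   # pending: the smaller-side range
            stack.append((pos + 1, right))  # pending: the larger-side range

data = [5, 1, 4, 2, 3]
quicksort_iterative(data)
print(data)  # [1, 2, 3, 4, 5]
```

The explicit stack grows at most as deep as the recursion would have, but it lives on the heap, so it is not limited by the interpreter's recursion limit.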
Merge Sort
Merge Sort is one of the most popular sorting algorithms and is based
on the divide-and-conquer principle.
Divide and Conquer Approach
Divide and conquer is a strategy or technique used to solve problems by
breaking them down into smaller, more manageable subproblems. The
idea is to divide the problem into smaller parts that are easier to solve,
then solve each part separately, and finally combine the solutions of the
smaller parts to find the solution to the original problem.
This strategy is used in a wide range of problems, such as sorting
algorithms (e.g. merge sort, quick sort), recursive algorithms (e.g. the
Fibonacci sequence, the Tower of Hanoi), and more complex problems,
such as the closest pair of points problem, the travelling salesman
problem and many more.
Many divide-and-conquer algorithms, including merge sort, run in
O(n log n) time, which makes them quite efficient, although the exact
complexity depends on the problem.
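The divide, solve, and combine pattern can be illustrated with a problem simpler than sorting. The sketch below (an illustrative example, not taken from the text) finds the maximum of a list by splitting it in half, solving each half recursively, and combining the two answers:

```python
def max_dc(a, lo, hi):
    """Maximum of a[lo..hi] by divide and conquer."""
    if lo == hi:                         # base case: a single element
        return a[lo]
    mid = (lo + hi) // 2                 # divide
    left_max = max_dc(a, lo, mid)        # conquer the left half
    right_max = max_dc(a, mid + 1, hi)   # conquer the right half
    return left_max if left_max > right_max else right_max  # combine

nums = [3, 7, 4, 1, 9, 2]
print(max_dc(nums, 0, len(nums) - 1))  # 9
```

Merge sort follows exactly this shape: the "combine" step is the merge of two sorted halves.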
Working Principle of Merge Sort Algorithm
Merge sort works as follows:
1. It divides the array into two equal halves and this process is
repeated with each half until each half is of size 1.
2. The array of size 1 is trivially sorted.
3. Pairs of sorted subarrays are then merged into larger sorted
arrays, and this continues until all the elements are combined and
the whole array is sorted.
Merge Sort Algorithm
The merge sort function first checks the base case: an array of length 1
(or 0) is already sorted, so nothing needs to be done. Otherwise, it
divides the array into two halves, calls merge sort recursively on each
half, and finally merges the two sorted halves back together in sorted
order.
Implementation of Merge Sort in Python
def mergeSort(nlist):
    if len(nlist) > 1:
        mid = len(nlist) // 2
        left = nlist[:mid]           # copy of the first half
        right = nlist[mid:]          # copy of the second half
        mergeSort(left)
        mergeSort(right)
        # Merge the two sorted halves back into nlist.
        i = j = k = 0
        while i < len(left) and j < len(right):
            if left[i] < right[j]:
                nlist[k] = left[i]
                i = i + 1
            else:
                nlist[k] = right[j]
                j = j + 1
            k = k + 1
        while i < len(left):         # copy any leftovers from the left half
            nlist[k] = left[i]
            i = i + 1
            k = k + 1
        while j < len(right):        # copy any leftovers from the right half
            nlist[k] = right[j]
            j = j + 1
            k = k + 1

nlist = [3, 7, 4, 1, 5, 9, 2, 6, 8]
print("Original List: ", nlist)
mergeSort(nlist)
print("Sorted List : ", nlist)
Advantages of Merge Sort
Merge sort works on inputs of any size.
It handles large data sets well thanks to its O(n log n) time
complexity.
Merge sort is suitable for non-contiguous data such as linked
lists; a linked-list implementation of merge sort is particularly
efficient.
It is consistently fast: even in the worst case, its runtime is
O(n log n).
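The linked-list advantage comes from the merge step: two sorted lists can be merged by relinking nodes, without copying any elements. A minimal sketch using a simple `Node` class (illustrative, not part of the listing above):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def merge_sorted_lists(a, b):
    """Merge two sorted singly linked lists by relinking nodes."""
    dummy = tail = Node(None)      # placeholder head simplifies relinking
    while a and b:
        if a.value <= b.value:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a or b             # append whichever list is not exhausted
    return dummy.next

# Build 1 -> 3 -> 5 and 2 -> 4, then merge them.
a = Node(1, Node(3, Node(5)))
b = Node(2, Node(4))
merged = merge_sorted_lists(a, b)
out = []
while merged:
    out.append(merged.value)
    merged = merged.next
print(out)  # [1, 2, 3, 4, 5]
```

Because only `next` pointers change, the merge needs O(1) extra space, in contrast with the array version, which copies each half.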
Disadvantages of Merge Sort
Merge sort does not take advantage of an array that is already
sorted: its best-case time complexity is the same as its worst case.
For small datasets, it is slower than other sorting techniques.
It requires additional memory proportional to the size of the dataset.