Quick Sort Algorithm in Data Structure
Quick Sort is also based on the concept of Divide and Conquer, just like merge sort. But in quick sort, all the heavy lifting (major work) is done while dividing the array into subarrays, while in the case of merge sort, all the real work happens during merging of the subarrays. In the case of quick sort, the combine step does absolutely nothing.
It picks an element as pivot and partitions the given array around the picked pivot. There are many different versions of quickSort that pick pivot in different ways.
- Always pick the first element as pivot.
- Always pick the last element as pivot (implemented below).
- Pick a random element as pivot.
- Pick the median as pivot.
Quick sort partitions an array and then calls itself recursively twice to sort the two resulting subarrays. This algorithm is quite efficient for large-sized data sets, as its average-case complexity is O(n*log n) and its worst-case complexity is O(n²), where n is the number of items.
Quick Sort Description
Quicksort, like merge sort, applies the divide-and-conquer paradigm. Here is the three-step divide-and-conquer process for sorting a typical subarray A[p..r]:
Divide: Partition (rearrange) the array A[p..r] into two (possibly empty) subarrays A[p..q-1] and A[q+1..r] such that each element of A[p..q-1] is less than or equal to A[q], which is, in turn, less than or equal to each element of A[q+1..r]. Compute the index q as part of this partitioning procedure.
Conquer: Sort the two subarrays A[p..q-1] and A[q+1..r] by recursive calls to quicksort.
Combine: Because the subarrays are already sorted, no work is needed to combine them: the entire array A[p..r] is now sorted.
The following procedure implements quicksort:
Quicksort(A, p, r)
    if p < r
        q = Partition(A, p, r)
        Quicksort(A, p, q - 1)
        Quicksort(A, q + 1, r)
Partitioning the array
The key to the algorithm is the PARTITION procedure, which rearranges the subarray A[p..r] in place.
Partition(A, p, r)
    x = A[r]
    i = p - 1
    for j = p to r - 1
        if A[j] <= x
            i = i + 1
            exchange A[i] with A[j]
    exchange A[i + 1] with A[r]
    return i + 1
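Translating the two procedures above into a runnable language, here is a minimal sketch in C (the function names quicksort, partition and swap, and the zero-based indexing, are choices made for this sketch rather than something prescribed by the original text):

/* Swap two array elements; helper for the "exchange" steps in the pseudocode. */
void swap(int *a, int *b) {
    int tmp = *a;
    *a = *b;
    *b = tmp;
}

/* Rearrange A[p..r] around the pivot A[r]; return the pivot's final index. */
int partition(int A[], int p, int r) {
    int x = A[r];               /* pivot: last element of the subarray */
    int i = p - 1;              /* boundary of the "<= pivot" region */
    for (int j = p; j <= r - 1; j++) {
        if (A[j] <= x) {
            i = i + 1;
            swap(&A[i], &A[j]);
        }
    }
    swap(&A[i + 1], &A[r]);     /* place the pivot between the two regions */
    return i + 1;
}

/* Sort A[p..r] in place. */
void quicksort(int A[], int p, int r) {
    if (p < r) {
        int q = partition(A, p, r);   /* divide */
        quicksort(A, p, q - 1);       /* conquer the left part */
        quicksort(A, q + 1, r);       /* conquer the right part */
    }
}

For an array of n elements, the initial call is quicksort(A, 0, n - 1).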
Below is an example of the quick sort algorithm steps, where we always pick the last element as pivot.
[Figure: Steps 1 through 13 illustrating the partition passes with the last element as pivot (images not available)]
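Since the original step-by-step images are not available, here is a hand trace of the first partition pass on an assumed sample array [10, 80, 30, 90, 40, 50, 70] (this input is chosen for illustration, not taken from the original figures). The pivot is the last element, 70, and i starts at -1:

- j = 0: 10 <= 70, so i becomes 0; swapping A[0] with itself leaves [10, 80, 30, 90, 40, 50, 70]
- j = 1: 80 > 70, nothing happens
- j = 2: 30 <= 70, so i becomes 1; swap A[1] and A[2] to get [10, 30, 80, 90, 40, 50, 70]
- j = 3: 90 > 70, nothing happens
- j = 4: 40 <= 70, so i becomes 2; swap A[2] and A[4] to get [10, 30, 40, 90, 80, 50, 70]
- j = 5: 50 <= 70, so i becomes 3; swap A[3] and A[5] to get [10, 30, 40, 50, 80, 90, 70]
- Finally, swap A[i + 1] = A[4] with the pivot A[6] to get [10, 30, 40, 50, 70, 90, 80]; Partition returns index 4.

Every element to the left of 70 is now less than or equal to it, and every element to the right is greater. Quick sort then recurses on [10, 30, 40, 50] and [90, 80].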
Complexity Analysis of Quick Sort
Consider an array for which partitioning leads to completely unbalanced subarrays: one side of the pivot receives no elements at all, while all the remaining elements, being greater than the pivot, end up on the other side.
If we keep getting such unbalanced subarrays at every level of recursion, the running time is the worst case, which is O(n²).
Whereas if partitioning leads to almost equal subarrays, the running time is the best, with time complexity O(n*log n).
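These two cases can be summarized with standard recurrences (not spelled out in the original text): in the fully unbalanced case each partition removes only the pivot, while in the balanced case each partition splits the remaining elements roughly in half.

    T(n) = T(n - 1) + O(n)    which solves to O(n²)        (worst case, unbalanced)
    T(n) = 2*T(n/2) + O(n)    which solves to O(n*log n)   (best case, balanced)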
Worst Case Time Complexity [Big-O]: O(n²)
Best Case Time Complexity [Big-omega]: O(n*log n)
Average Time Complexity [Big-theta]: O(n*log n)
Space Complexity: O(log n) auxiliary space on average for the recursion stack, O(n) in the worst case
As we know by now, if the subarrays produced after partitioning are unbalanced, quick sort will take more time to finish. If someone knows that you always pick the last index as pivot, they can intentionally provide you with an array that results in worst-case running time for quick sort.
To avoid this, you can pick a random pivot element instead. It doesn't change the algorithm: all you need to do is pick a random element from the array, swap it with the element at the last index, make it the pivot, and carry on with quick sort, as sketched below.
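A minimal sketch of that idea in C, reusing the partition and swap functions from the earlier sketch (the names randomized_partition and randomized_quicksort, and the use of rand(), are assumptions made for this illustration):

#include <stdlib.h>   /* for rand() */

/* Move a randomly chosen element of A[p..r] to the last position,
   then partition around it exactly as before. */
int randomized_partition(int A[], int p, int r) {
    int k = p + rand() % (r - p + 1);   /* random index in [p, r] */
    swap(&A[k], &A[r]);                 /* make the random element the pivot */
    return partition(A, p, r);
}

/* Same recursion as quicksort, but with a random pivot at every step. */
void randomized_quicksort(int A[], int p, int r) {
    if (p < r) {
        int q = randomized_partition(A, p, r);
        randomized_quicksort(A, p, q - 1);
        randomized_quicksort(A, q + 1, r);
    }
}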
- The additional space required by quick sort is very small: only O(log n) on average for the recursion stack (O(n) in the worst case).
- Quick sort is not a stable sorting technique, so it might change the relative order of two equal elements in the list while sorting.