Design & Analysis of Algorithms
Time complexities of Quick sort and Merge sort
D Bheekya
Quick sort
• Best/good case
○ Occurs when each partition divides the list into two
equal-size sublists.
○ T(n) = 2T(n/2) + cn = O(n log n)
[Figure: best-case recursion tree — the array of size n splits into halves (n/2, n/4, …, 2) over about log n levels, and each level does cn total work, giving total time O(n log n).]
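The partition-based scheme analyzed above can be sketched as follows (a minimal Lomuto-style quicksort in Python; the function names are illustrative, not the lecture's own code). In the best case each call to `partition` splits the range into two near-equal halves, matching T(n) = 2T(n/2) + cn.

```python
def partition(a, lo, hi):
    """Lomuto partition: place the pivot a[hi] into its final position."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)   # cn work at this level
        quicksort(a, lo, p - 1)    # best case: each sublist has ~n/2 elements
        quicksort(a, p + 1, hi)
    return a
```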
Time complexity analysis
● Worst/bad case
○ T(n) = T(n-1) + cn = O(n²)
[Figure: worst-case trace on the already-sorted input 1 2 3 4 5 6 7 8 9 — each partition removes only the pivot, leaving sublists of sizes 8, 7, …, 1, so the total work is O(n²).]
[Figure: worst-case recursion tree — the sublists have sizes n, n-1, n-2, …, 2, 1, costing cn, c(n-1), c(n-2), …, 2c, 1c per level.]
Happens only if
• the input is already sorted, or
• the input is reversely sorted.
Total time: cn + c(n-1) + … + c = O(n²)
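One way to see the quadratic blow-up directly is to count partition comparisons on sorted input (a small experiment sketch assuming a last-element pivot; `count_comparisons` is an illustrative helper, not from the slides):

```python
def count_comparisons(a):
    """Quicksort `a` in place, returning the number of element comparisons."""
    comparisons = 0

    def sort(lo, hi):
        nonlocal comparisons
        if lo >= hi:
            return
        pivot, i = a[hi], lo
        for j in range(lo, hi):
            comparisons += 1          # one comparison per non-pivot element
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        sort(lo, i - 1)
        sort(i + 1, hi)

    sort(0, len(a) - 1)
    return comparisons

# Sorted input of size 100: every partition is maximally unbalanced,
# so the comparison count is 99 + 98 + ... + 1 = n(n-1)/2.
print(count_comparisons(list(range(100))))   # 4950
```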
Worst-Case Analysis:
• The pivot is the smallest element every time. Then i = 0, and if
we ignore T(0) = 1, which is insignificant, the recurrence is
T(N) = T(N − 1) + cN, N > 1
• We telescope, using the above equation repeatedly:
T(N − 1) = T(N − 2) + c(N − 1)
T(N − 2) = T(N − 3) + c(N − 2)
...
T(2) = T(1) + c·2
Adding up all these equations yields
T(N) = T(1) + c(2 + 3 + … + N) = T(1) + c(N(N + 1)/2 − 1) = O(N²), as claimed earlier.
• To see that this is the worst possible case, note that the total cost
of all the partitions in recursive calls at depth d must be at most
N. Since the recursion depth is at most N, this gives an O(N^2)
worst-case bound for quicksort.
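The telescoped sum can be checked numerically (a quick sketch; the values chosen for `C` and `T1` are arbitrary assumptions made only for the check):

```python
C, T1 = 3, 1  # arbitrary constants: c and T(1)

def T(n):
    """Evaluate the worst-case recurrence T(N) = T(N-1) + cN iteratively."""
    t = T1
    for k in range(2, n + 1):
        t += C * k
    return t

def closed_form(n):
    """T(1) + c(N(N+1)/2 - 1), the telescoped sum."""
    return T1 + C * (n * (n + 1) // 2 - 1)

# The recurrence and the closed form agree for every N tested.
assert all(T(n) == closed_form(n) for n in range(1, 200))
```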
Merge sort Analysis
If the time for the merging operation is proportional to n, then the
computing time for merge sort is described by the recurrence relation
T(n) = a               if n = 1, 'a' a constant
T(n) = 2T(n/2) + cn    if n > 1, 'c' a constant
where
• T(n) is the running time for an array of n elements,
• merge sort divides the array in half and calls itself on the two
halves; after returning, it merges both halves using a temporary array,
• each recursive call takes T(n/2) and merging takes cn.
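The divide-and-merge structure described above can be sketched as follows (a minimal Python version, not the lecture's own code). The two recursive calls are the 2T(n/2) terms; the merge loop over the temporary lists is the cn term.

```python
def merge_sort(a):
    if len(a) <= 1:               # base case: T(1) = a, a constant
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # T(n/2)
    right = merge_sort(a[mid:])   # T(n/2)
    # Merge the two sorted halves into a temporary list: cn work.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])       # append whichever half has leftovers
    merged.extend(right[j:])
    return merged
```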
Summary
• Quick-Sort
– Most of the work done in partitioning
– Best case takes O(n log(n)) time
– Average case takes O(n log(n)) time
– Worst case takes O(n²) time
• Merge-sort
All cases take O(n log(n)) time