
Data Structures & Algorithms

Sorting
Sorting in
Data Structure & Algorithms

1) Bubble sort
2) Insertion sorts: Straight & Binary insertion sort
3) Selection sorts: Straight selection sort, Heap sort, Merge sort
4) Quick sort
5) Shell sort
6) Exchange sorts
7) Distribution sorts: Bucket sort, Radix sort
Sorting
• A Sorting Algorithm is used to rearrange a given array or list of elements
according to a comparison operator on the elements.
• The comparison operator is used to decide the new order of elements in the
respective data structure.
• Sorting is the process of arranging the elements of an array in either ascending or descending order.
Sorting Terminology
Types of Sorting :
a) Internal Sorting
b) External Sorting

a) Internal Sorting:
• When all the data to be sorted fits in the main (internal) memory, the sorting is called internal sorting.
• In internal sorting, the input size is therefore limited by the size of main memory.
➢ Example: heap sort, bubble sort, selection sort, quick sort, shell sort, insertion sort.
Sorting Terminology
Types of Sorting :
a) Internal Sorting
b) External Sorting

b) External Sorting:
• When all data that needs to be sorted cannot be placed in memory at a time,
the sorting is called external sorting.
• External sorting is used for massive amounts of data.
➢ Example: Merge sort, Tag sort, Polyphase sort, Four tape sort, External radix sort, External merge sort, etc.
Sorting Terminology
What is stable sorting?
• A sort is stable when elements with equal keys keep their original relative order in the sorted output.
➢ Example: merge sort, insertion sort, bubble sort.

What is unstable sorting?
• A sort is unstable when elements with equal keys may appear in a different relative order in the sorted output.
➢ Example: quick sort, heap sort, shell sort.
Sorting Terminology
Stable sorting:

Unstable sorting:
Sorting
Stability of different sorting algorithms
Sorting Terminology
Complexity of Sorting Algorithms
The efficiency of any sorting algorithm is determined by the time complexity
and space complexity of the algorithm.

Time Complexity: Time complexity refers to the time taken by an algorithm to complete its execution with respect to the size of the input.
It can be represented in different forms:
1) Big-O notation (O): Big-O notation represents the upper bound of the running time
of an algorithm. Thus, it gives the worst-case complexity of an algorithm.
2) Omega notation (Ω): Omega notation represents the lower bound of the running
time of an algorithm. Thus, it provides the best case complexity of an algorithm.
3) Theta notation (Θ): Theta notation encloses the function from above and below.
Since it represents the upper and the lower bound of the running time of an
algorithm, it is used for analyzing the average-case complexity of an algorithm.
Sorting Terminology
Complexity of Sorting Algorithms
The efficiency of any sorting algorithm is determined by the time complexity
and space complexity of the algorithm.

Space Complexity: Space complexity refers to the total amount of memory used by the algorithm for a complete execution. It includes both the auxiliary memory and the input.
• The auxiliary memory is the additional space occupied by the algorithm apart
from the input data.
Sorting
Complexity analysis of different sorting algorithms.
Sorting in
Data Structure & Algorithms

1) Bubble sort
2) Insertion sorts: Straight & Binary insertion sort
3) Selection sorts: Straight selection sort, Heap sort, Merge sort
4) Quick sort
5) Shell sort
6) Exchange sorts
7) Distribution sorts: Bucket sort, Radix sort
Bubble Sorting
Bubble Sort
• Suppose we are given an array of integers and are asked to sort it using the bubble sort algorithm; it is not difficult to generate the resultant array, which is just the sorted form of the given array.
• In fact, whichever algorithm you follow, the result would be the same.
Bubble Sorting
Bubble Sort
• Bubble sort sorts an array using (n-1) passes, where n is the array's length.
• In each pass, the largest element of the current unsorted part reaches its final position, so the unsorted part of the array reduces by 1 and the sorted part increases by 1.

• At each pass, we will iterate through the unsorted part of the array and compare
every adjacent pair.
• We move ahead if the adjacent pair is sorted; otherwise, we make it sorted by
swapping their positions.
• And doing this at every pass ensures that the largest element of the unsorted part
of the array reaches its final position at the end.
Bubble Sorting
Bubble Sort

1st Pass: At first pass, our whole array comes under the unsorted part. We will
start by comparing each adjacent pair.

Since these two are already sorted, we move ahead without making
any changes.
Bubble Sorting
Bubble Sort

1st Pass:

Now since 9 is less than 11, we swap their positions to make them sorted.

Again, we swap the positions of 11 and 2.


Bubble Sorting
Bubble Sort

1st Pass:

We move ahead without changing anything since they are already sorted.

Here, we make a swap since 17 is greater than 4.


Bubble Sorting
Bubble Sort

1st Pass: And this is where our first pass finishes.

Let us take an overview of what we have at the end of the first pass.
Bubble Sorting
Bubble Sort

2nd Pass: We again start from the beginning, with a reduced unsorted part of
length 5.
Hence the number of comparisons would be just 4.

No changes to make.
Bubble Sorting
Bubble Sort

2nd Pass:

Yes, here we make a swap, since 9>2.


Bubble Sorting
Bubble Sort

2nd Pass:

Since 9 < 11, we move further.


Bubble Sorting
Bubble Sort

2nd Pass:

And since 11 is greater than 4, we make a swap again.

And that would be it for the second pass.


Bubble Sorting
Bubble Sort

2nd Pass:
Let’s see how close we have reached to the sorted array.
Bubble Sorting
Bubble Sort

3rd Pass: We’ll again start from the beginning, and this time our unsorted part
has a length of 4; hence no. of comparisons would be 3.

Since 7 is greater than 2, we make a swap here.


Bubble Sorting
Bubble Sort

3rd Pass:

We move ahead without making any change.


Bubble Sorting
Bubble Sort

3rd Pass:

In this final comparison, we make a swap, since 9 > 4.

And that was our third pass.


Bubble Sorting
Bubble Sort

4th Pass: We just have the unsorted part of length 3, and that would cause
just 2 comparisons.

No changes here.
Bubble Sorting
Bubble Sort

4th Pass:

We swap their positions. And that is all in the 4th pass.

The resultant array after the 4th pass is:


Bubble Sorting
Bubble Sort

5th Pass: We have only one comparison to make here.

And since these are already sorted, we finish our procedure here.

And see the final results:


Bubble Sorting
Bubble Sort

Time Complexity of Bubble Sort:


1. If you count the number of comparisons we made, there were 5+4+3+2+1, that is, a total of 15 comparisons. And every comparison could result in a swap, so 15 comparisons mean up to 15 possible swaps. Let us quickly generalize this sum. For length 6, we had 5+4+3+2+1 comparisons and possible swaps. Therefore, for an array of length n, we would have (n-1) + (n-2) + (n-3) + (n-4) + . . . + 1 comparisons and possible swaps.
Bubble Sorting
Bubble Sort

Time Complexity of Bubble Sort:


2. The sum 1 + 2 + . . . + (n-1) equals n(n-1)/2, and hence our runtime complexity becomes O(n²).
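Written out, the worst-case count of comparisons (and possible swaps) is:

$$\sum_{k=1}^{n-1} k = \frac{n(n-1)}{2} = \frac{n^2 - n}{2} \in O(n^2)$$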
Bubble Sorting
Bubble Sort

Time Complexity of Bubble Sort:


3. And as you may have observed, we never make a swap when the two elements of a pair are equal. Hence the algorithm is a stable algorithm.

4. It is not a recursive algorithm since we didn’t use recursion here.

5. This algorithm has no adaptive aspect since every pair will be compared, even if the array
given has already been sorted. So, no adaptiveness. Although it can be modified to make it
adaptive, it's not adaptive by default.
Bubble Sorting

Bubble Sort Algorithm:

1. Run a nested for loop to traverse the input array using two variables i and j, such
that 0 ≤ i < n-1 and 0 ≤ j < n-i-1

2. If arr[j] is greater than arr[j+1] then swap these adjacent elements, else move on

3. Print the sorted array


Bubble Sorting

Bubble Sort Algorithm:
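The original slide shows the program only as an image. Here is a minimal C sketch of the nested-loop bubble sort described in the steps above; the function name, the example array values, and the printing in main are illustrative assumptions, not the original program:

```c
#include <stdio.h>

// Bubble sort: after pass i, the largest element of the unsorted
// part has "bubbled up" to its final position at index n-1-i.
void bubble_sort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {      // adjacent pair out of order
                int temp = arr[j];          // swap arr[j] and arr[j+1]
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}

int main(void) {
    int arr[] = {11, 9, 2, 17, 4, 7};       // illustrative values
    int n = sizeof(arr) / sizeof(arr[0]);
    bubble_sort(arr, n);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);              // print the sorted array
    printf("\n");
    return 0;
}
```

Adding a flag that records whether any swap happened in a pass, and stopping early when it stays false, is the usual modification that makes bubble sort adaptive, as noted earlier.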


Sorting in
Data Structure & Algorithms

1) Bubble sort
2) Insertion sorts: Straight & Binary insertion sort
3) Selection sorts: Straight selection sort, Heap sort, Merge sort
4) Quick sort
5) Shell sort
6) Exchange sorts
7) Distribution sorts: Bucket sort, Radix sort
Insertion Sorting
Insertion sort is a simple sorting algorithm that works similar to the way you sort
playing cards in your hands.

➢The array is virtually split into a sorted and an unsorted part. Values from the unsorted
part are picked and placed at the correct position in the sorted part.
Insertion Sorting
Characteristics of Insertion Sort:

➢This is one of the simplest sorting algorithms, with a simple implementation


➢Basically, Insertion sort is efficient for small data values
➢Insertion sort is adaptive in nature, i.e. it is appropriate for data sets which are
already partially sorted.
Insertion Sorting
Working of Insertion Sort:

Example: arr[]: {12, 11, 13, 5, 6}

1st Pass: Initially, the first two elements of the array are compared in insertion sort.

➢Here, 12 is greater than 11; hence they are not in ascending order and 12 is not at its correct position.
➢Thus, swap 11 and 12.
➢So, for now 11 is stored in a sorted sub-array.
Insertion Sorting
Working of Insertion Sort :

Example: arr[]: {12, 11, 13, 5, 6}

2nd Pass: Now, move to the next two elements and compare them

➢Here, 13 is greater than 12, thus both elements seem to be in ascending order; hence, no swapping will occur.
➢12 is also stored in the sorted sub-array along with 11
Insertion Sorting
Working of Insertion Sort :

Example: arr[]: {12, 11, 13, 5, 6}

3rd Pass: Now, two elements are present in the sorted sub-array which are 11 and 12
Moving forward to the next two elements which are 13 and 5

Both 5 and 13 are not present at their correct place so swap them
Insertion Sorting
Working of Insertion Sort:

Example: arr[]: {12, 11, 13, 5, 6}

3rd Pass: After swapping, elements 12 and 5 are not sorted, thus swap again

Here, again 11 and 5 are not sorted, hence swap again

Here, 5 reaches its correct position.


Insertion Sorting
Working of Insertion Sort:

Example: arr[]: {12, 11, 13, 5, 6}

4th Pass: Now, the elements which are present in the sorted sub-array are 5, 11 and 12

Moving to the next two elements 13 and 6

Clearly, they are not sorted, thus perform swap between both
Insertion Sorting
Working of Insertion Sort:

Example: arr[]: {12, 11, 13, 5, 6}

4th Pass: Now, 6 is smaller than 12, hence, swap again

Here, the swap leaves 11 and 6 out of order; hence, swap again

Finally, the array is completely sorted.


Insertion Sorting
Working of Insertion Sort:
Insertion Sorting
Insertion Sort algorithm:

➢To sort an array of size N in ascending order:


➢Iterate from arr[1] to arr[N-1] over the array.
➢Compare the current element (key) to its predecessor.
➢If the key element is smaller than its predecessor, compare it to the elements
before. Move the greater elements one position up to make space for the element
being inserted.
Insertion Sorting
Insertion Sort Program:
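The program itself appears only as an image in the slides; below is a minimal C sketch matching the steps above. The function name and the printing in main are illustrative; the example array is the one from the walkthrough.

```c
#include <stdio.h>

// Insertion sort: grow a sorted prefix; insert each key into it.
void insertion_sort(int arr[], int n) {
    for (int i = 1; i < n; i++) {
        int key = arr[i];                // element to insert into the sorted prefix
        int j = i - 1;
        while (j >= 0 && arr[j] > key) { // shift greater elements one position right
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;                // place the key at its correct position
    }
}

int main(void) {
    int arr[] = {12, 11, 13, 5, 6};      // the example array from the walkthrough
    int n = sizeof(arr) / sizeof(arr[0]);
    insertion_sort(arr, n);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}
```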
Insertion Sorting
Analysis of Insertion Sort algorithm:
1) Time Complexity of Insertion Sort Algorithm:
For an array of length n, the total number of comparisons/possible shifts in the worst case would be 1 + 2 + 3 + . . . + (n-1), which is n(n-1)/2, and ultimately O(n²).

2) Insertion sort is a stable algorithm, since we start comparing from the back
of the sorted sub-array and never move past an element equal to the element being inserted.

3) Insertion sort is an adaptive algorithm. When the array is already sorted, each
of the (n-1) passes makes only a single comparison and no shifts.
Hence, we accomplish the job in O(n).
Sorting in
Data Structure & Algorithms

1) Bubble sort
2) Insertion sorts: Straight & Binary insertion sort
3) Selection sorts: Straight selection sort, Heap sort, Merge sort
4) Quick sort
5) Shell sort
6) Exchange sorts
7) Distribution sorts: Bucket sort, Radix sort
Selection Sorting
➢Selection sort builds a sorted sub-array at the front of the array: in each pass, it
finds the minimum element of the remaining unsorted part (tracking its position as
the min index) and swaps it with the first element of the unsorted part.
Selection Sorting
1st Pass: ➢At first pass, our whole array comes under the unsorted part. We will
start by assuming 0 as the min index.
➢Now, we’ll have to check among the remaining 4 elements if there is
still a lesser element than the first one.
Selection Sorting
1st Pass: ➢And when we compared the element at min index with the element at index
1, we found that 0 is less than 8 and hence we update our min index to 1.
Selection Sorting
1st Pass: ➢And now we keep checking with the updated min. Since 7 is not less than 0,
we move ahead.
Selection Sorting
1st Pass: ➢And now we compared the elements at indices 1 and 3, and 0 is still less
than 1, so we move ahead without making any changes.
Selection Sorting
1st Pass: ➢And now we compared the element at the min index with the last element.
Since there is nothing to change, we end our 1st pass here.
➢Now we simply swap the element at index 0 with the element at
the min index. And this gives us our first sorted sub-array of size 1.
➢And this is where our first pass finishes.
Selection Sorting
2nd Pass: ➢We now start from the beginning of the unsorted array, with a reduced
unsorted part of length 4.
➢Hence the number of comparisons would be just 3.
➢We assume the element at index 1 is the one at the min index and start
iterating to the right for finding the minimum element.
Selection Sorting
2nd Pass: ➢Since 7 is less than 8, we update our min index to 2. And move further.
Selection Sorting
2nd Pass: ➢Next, we compared the elements 7 and 1, and since 1 is still less than 7,
we update the min index to 3. Then, we move ahead to the next comparison.
Selection Sorting
2nd Pass: ➢And since 3 is greater than 1, we don’t make any changes here.
➢And since we are finished with the array, we stop our pass here itself, and
swap the element at index 1 with this element at min index.
➢And that would be it for the second pass.
Selection Sorting
3rd Pass: ➢We’ll again start from the beginning of the unsorted sub-array which is
from the index 2, and make the min index equal to 2 for now.
➢And this time our unsorted part has a length 3, hence no. of comparisons
would be 2.
Selection Sorting
3rd Pass: ➢Since 8 is greater than 7, we would make no change, but move ahead.

➢Comparing the elements at index min and 4, we found 3 to be smaller than 7 and hence an update is needed here. So, we update min to 4.
Selection Sorting
3rd Pass: ➢And since that was the last comparison of the third pass, we make a swap of
the indices 2 and min. And the result at the end would be:
Selection Sorting
4th Pass: ➢We now have the sorted sub-array of length 3, hence the new min would be
at index 3.
➢And for the unsorted part of length 2, we would make just a single
comparison.

➢And since 7 is less than 8, we update our min to 4.


➢And since that was the only comparison in this pass, we finish our
procedure here by swapping the elements at the indices min and 3.
➢And the final results:
Selection Sorting
After the 4th Pass: ➢ Since a sub-array with a single element is always sorted,
➢ the one remaining element of the unsorted part is already in place, so the whole array is sorted.

NOTE: We select the minimum element at each pass and give it its final
position. This is why the Selection Sort algorithm got its name.
Selection Sorting
Analysis of Selection Sort Algorithm:

1) Time Complexity of Selection Sort: for an array of length n, we would have (n-1) +
(n-2) + (n-3) + (n-4) + . . . . . + 1 comparisons.

Summing from 1 to n-1, we get n(n-1)/2, and hence the time complexity of the algorithm
would be O(n²).
Selection Sorting
Analysis of Selection Sort Algorithm:

2) Selection sort algorithm is not a stable algorithm. Since the smallest element is
swapped with the first element of the unsorted part at each pass, it can change the
relative order of equal elements. Hence, unstable. Refer to the example below.
Selection Sorting
Analysis of Selection Sort Algorithm:

3) Selection sort is not a recursive algorithm

4) Selection sort compares every element with the current minimum regardless of
whether the array is already sorted; hence, selection sort is not an adaptive
algorithm by default.
Selection Sorting
Selection Sort Algorithm Steps:

1) Initialize the minimum index (min_idx) to the first position of the unsorted part (initially 0).

2) Traverse the unsorted part of the array to find the minimum element, updating min_idx whenever a smaller element is found.
3) After the traversal, swap the element at min_idx with the first element of the unsorted part.
4) Move the boundary of the unsorted part one position to the right.
5) Repeat until the array is sorted.
Selection Sorting
Selection Sort Algorithm Program:
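The program on the original slide is an image; here is a minimal C sketch of the steps above. The identifiers and main are illustrative, and the example values are reconstructed from the walkthrough.

```c
#include <stdio.h>

// Selection sort: repeatedly select the minimum of the unsorted
// part and swap it into its final position.
void selection_sort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min_idx = i;                 // assume the first unsorted element is the minimum
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[min_idx])
                min_idx = j;             // remember the index of the smaller element
        }
        int temp = arr[i];               // swap the minimum into position i
        arr[i] = arr[min_idx];
        arr[min_idx] = temp;
    }
}

int main(void) {
    int arr[] = {8, 0, 7, 1, 3};         // values reconstructed from the walkthrough
    int n = sizeof(arr) / sizeof(arr[0]);
    selection_sort(arr, n);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}
```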
Sorting in
Data Structure & Algorithms

1) Bubble sort
2) Insertion sorts: Straight & Binary insertion sort
3) Selection sorts: Straight selection sort
4) Merge sort
5) Quick sort
6) Heap sort
7) Shell sort
8) Exchange sorts
9) Distribution sorts: Bucket sort, Radix sort
Merge Sort

The Merge Sort algorithm is a sorting algorithm that is based on the Divide and
Conquer paradigm.

➢In this algorithm, the array is initially divided into two equal halves and then they are
combined in a sorted manner.

➢Why do we call it the merge sort algorithm?


In this algorithm, we divide the arrays into subarrays and subarrays into more
subarrays until the size of each subarray becomes 1. Since arrays with a single
element are always considered sorted, this is where we merge. Merging two sorted
subarrays creates another sorted subarray.
Merge Sort
Merging Procedure:

Suppose we have two sorted arrays, A and B, of sizes 5 and 4, respectively.

1) We apply merging on them. The first job would be to create another array C whose
size is the sum of the two input arrays' sizes.

➢ Here the sizes of A and B are 5 and 4, respectively. So, the size of array C would be 9.
Merge Sort
2) We maintain three index variables i, j, and k.
➢ ‘i’ looks after the first element in array A,
➢ ‘j’ looks after the first element in array B,
➢ and ‘k’ looks after the position in array C to place the next element in.
Merge Sort
3) Initially, all i, j, and k are equal to 0.

4) Now, we compare the elements at index i of array A and index j of array B and see
which one is smaller. Fill in the smaller element at index k of array C and increment k by
1. Also, increment the index variable of the array we fetched the smaller element from.

5) Here, A[i] is greater than B[j]. Hence we fill C[k] with B[j] and increase k and j by 1.
Merge Sort
6) We continue doing step 4 until one of the arrays, A or B, is exhausted.
Merge Sort
➢ Here, array B inserted all its elements in the merged array C.
➢ Since we are only left with the elements of array A, we simply put them into the
merged array as they are.
➢ This will result in our final merged array C.
Merge Sort
➢ Programming the merging of two sorted arrays
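The slides show this program as images; below is a minimal C sketch of merging two sorted arrays A and B into C, following steps 1–6 above. The function name, the sample values, and main are illustrative assumptions.

```c
#include <stdio.h>

// Merge two sorted arrays a[] (size n) and b[] (size m) into c[] (size n + m).
void merge(const int a[], int n, const int b[], int m, int c[]) {
    int i = 0, j = 0, k = 0;            // i -> a, j -> b, k -> c
    while (i < n && j < m) {
        if (a[i] <= b[j])
            c[k++] = a[i++];            // the smaller element goes into c
        else
            c[k++] = b[j++];
    }
    while (i < n) c[k++] = a[i++];      // copy whatever remains of a
    while (j < m) c[k++] = b[j++];      // copy whatever remains of b
}

int main(void) {
    int a[] = {3, 8, 12, 17, 21};       // illustrative sorted arrays of sizes 5 and 4
    int b[] = {2, 9, 14, 20};
    int c[9];
    merge(a, 5, b, 4, c);
    for (int k = 0; k < 9; k++)
        printf("%d ", c[k]);
    printf("\n");
    return 0;
}
```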
Merge Sort
➢ Suppose there is an array A of 5 elements and contains two sorted subarrays of
length 2 and 3 in itself.
Merge Sort
➢ To merge both the sorted subarrays and produce a sorted array of length 5, we will
create an auxiliary array B of size 5.
➢ Now the process would be more or less the same, and the only change we would need
to make is to consider the first index of the first subarray as low and the last index of
the second subarray as high.
➢ And mark the index prior to the first index of the second subarray as mid.
Merge Sort
➢ Previously we had index variables i, j, and k initialized with 0 of their respective
arrays.
➢ But here, i gets initialized with low, j gets initialized with mid+1, and k gets
initialized with low only.
➢ i runs from low to mid, j runs from mid+1 to high, and until and unless they both get
all their elements merged, we continue filling elements in array B.
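A minimal C sketch of this merge of two adjacent sorted subarrays using low, mid, and high with an auxiliary array B (the function name is illustrative):

```c
// Merge the sorted subarrays A[low..mid] and A[mid+1..high]
// using an auxiliary array B, then copy the result back into A.
void merge_subarrays(int A[], int B[], int low, int mid, int high) {
    int i = low, j = mid + 1, k = low;
    while (i <= mid && j <= high) {
        if (A[i] <= A[j])
            B[k++] = A[i++];
        else
            B[k++] = A[j++];
    }
    while (i <= mid)  B[k++] = A[i++];   // leftover elements of the first subarray
    while (j <= high) B[k++] = A[j++];   // leftover elements of the second subarray
    for (k = low; k <= high; k++)
        A[k] = B[k];                     // copy the merged range back into A
}
```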
Merge Sort
Whenever you receive an unsorted array, you break the array into fragments till the
size of each subarray becomes 1.
Merge Sort
➢ We divide the array until all subarrays have just length 1.
➢ Since any array/subarray of length 1 is always sorted, we just need to merge all these
singly-sized subarrays into a single entity.
Merge Sort
➢ To achieve this divided merging and sorting, we create a recursive function merge
sort and pass our array and the low and high index into it.

➢ This function divides the array into two parts: one from low to mid and another
from mid+1 to high.

➢ Then, it recursively calls itself passing these divided subarrays. And the resultant
subarrays are sorted.

➢ In the next step, it just merges them.
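A minimal C sketch of the recursive merge sort function described above, reusing the merge_subarrays() helper from the earlier sketch (the names are illustrative):

```c
void merge_subarrays(int A[], int B[], int low, int mid, int high); // from the earlier sketch

// Recursively sort A[low..high]: split, sort both halves, then merge them.
void merge_sort(int A[], int B[], int low, int high) {
    if (low >= high)
        return;                            // a subarray of length 0 or 1 is already sorted
    int mid = low + (high - low) / 2;
    merge_sort(A, B, low, mid);            // sort the left half
    merge_sort(A, B, mid + 1, high);       // sort the right half
    merge_subarrays(A, B, low, mid, high); // merge the two sorted halves
}
```

Calling merge_sort(A, B, 0, n - 1), where B is an auxiliary array of the same length as A, sorts the entire array.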


Thank You
