
Different Types of Sorting Techniques Used in Data Structures

By:
Vishal Kumar Arora
AP, CSE Department,
Shaheed Bhagat Singh State Technical Campus,
Ferozepur.
Sorting: Definition
 Sorting: an operation that segregates items into groups according to a specified criterion.

 A = { 3 1 6 2 1 3 4 5 9 0 }
 A = { 0 1 1 2 3 3 4 5 6 9 }
Sorting
 Sorting = ordering.
 Sorted = ordered according to a particular criterion.
 Generally, collections of data are presented
in a sorted manner.
 Examples of Sorting:
 Words in a dictionary are sorted (and
case distinctions are ignored).
 Files in a directory are often listed in
sorted order.
 The index of a book is sorted (and case
distinctions are ignored).
Sorting: Cont’d
 Many banks provide statements
that list checks in increasing order
(by check number).
 In a newspaper, the calendar of events
in a schedule is generally sorted by date.
 Musical compact disks in a record store
are generally sorted by recording artist.
 Why?
 Imagine finding the phone number of
your friend in your mobile phone, but the
phone book is not sorted.
Review of Complexity
 Most of the primary sorting algorithms differ in their space and time complexity.
 Time complexity is defined to be the time the computer takes to run a program (or algorithm, in our case).
 Space complexity is defined to be the amount of memory the computer needs to run a program.
Complexity (cont.)
 Complexity, in general, measures an algorithm's efficiency in terms of internal factors, such as the time needed to run the algorithm.
 External factors (not related to complexity):
  Size of the input of the algorithm
  Speed of the computer
  Quality of the compiler
O(n), Ω(n), & Θ(n)
 An algorithm or function T(n) is O(f(n)) whenever T(n)'s rate of growth is less than or equal to f(n)'s rate.
 An algorithm or function T(n) is Ω(f(n)) whenever T(n)'s rate of growth is greater than or equal to f(n)'s rate.
 An algorithm or function T(n) is Θ(f(n)) if and only if the rate of growth of T(n) is equal to f(n)'s.
Types of Sorting Algorithms
 There are many, many different types of sorting algorithms, but the primary ones are:
  Bubble Sort
  Selection Sort
  Insertion Sort
  Merge Sort
  Quick Sort
  Shell Sort
  Radix Sort
  Swap Sort
  Heap Sort
Bubble Sort: Idea
 Idea: bubble in water.
 Bubble in water moves upward. Why?
 How?
 When a bubble moves upward, the water
from above will move downward to fill in
the space left by the bubble.
Bubble Sort Example
9, 6, 2, 12, 11, 9, 3, 7
Bubblesort compares the numbers in pairs from left to right, exchanging when necessary. Here the first number is compared to the second and, as it is larger, they are exchanged.

6, 9, 2, 12, 11, 9, 3, 7
Now the next pair of numbers is compared. Again the 9 is the larger and so this pair is also exchanged.

6, 2, 9, 12, 11, 9, 3, 7
In the third comparison, the 9 is not larger than the 12, so no exchange is made. We move on to compare the next pair without any change to the list.

6, 2, 9, 12, 11, 9, 3, 7
The 12 is larger than the 11, so they are exchanged.

6, 2, 9, 11, 12, 9, 3, 7
The 12 is greater than the 9, so they are exchanged.

6, 2, 9, 11, 9, 12, 3, 7
The 12 is greater than the 3, so they are exchanged.

6, 2, 9, 11, 9, 3, 12, 7
The 12 is greater than the 7, so they are exchanged.

6, 2, 9, 11, 9, 3, 7, 12
The end of the list has been reached, so this is the end of the first pass. The 12 at the end of the list must be the largest number in the list and so is now in the correct position. We now start a new pass from left to right.
Bubble Sort Example
First Pass
6, 2, 9, 11, 9, 3, 7, 12
Second Pass
2, 6, 9, 9, 3, 7, 11, 12
Notice that this time we do not have to compare the last two numbers as we know the 12 is in position. This pass therefore only requires 6 comparisons.
Bubble Sort Example
Second Pass
2, 6, 9, 9, 3, 7, 11, 12
Third Pass
2, 6, 9, 3, 7, 9, 11, 12
This time the 11 and 12 are in position. This pass therefore only requires 5 comparisons.
Bubble Sort Example
Third Pass
2, 6, 9, 3, 7, 9, 11, 12
Fourth Pass
2, 6, 3, 7, 9, 9, 11, 12
Each pass requires fewer comparisons. This time only 4 are needed.
Bubble Sort Example
Fourth Pass
2, 6, 3, 7, 9, 9, 11, 12
Fifth Pass
2, 3, 6, 7, 9, 9, 11, 12
The list is now sorted but the algorithm does not know this until it completes a pass with no exchanges.
Bubble Sort Example
Fifth Pass
2, 3, 6, 7, 9, 9, 11, 12
Sixth Pass
2, 3, 6, 7, 9, 9, 11, 12
In this pass no exchanges are made, so the algorithm knows the list is sorted. It can therefore save time by not doing the final pass. With other lists this check could save much more work.
Bubble Sort Example
Quiz Time
1. Which number is definitely in its correct position at the
end of the first pass?
Answer: The last number must be the largest.
2. How does the number of comparisons required change as
the pass number increases?
Answer: Each pass requires one fewer comparison than the last.
3. How does the algorithm know when the list is sorted?
Answer: When a pass with no exchanges occurs.
4. What is the maximum number of comparisons required
for a list of 10 numbers?
Answer: 9 comparisons, then 8, 7, 6, 5, 4, 3, 2, 1 so total 45
Bubble Sort: Example
 Pass 1: 40 2 1 43 3 65 0 -1 58 3 42 4
 Pass 2: 2 1 40 3 43 0 -1 58 3 42 4 65
 Pass 3: 1 2 3 40 0 -1 43 3 42 4 58 65
 Pass 4: 1 2 3 0 -1 40 3 42 4 43 58 65

 Notice that at least one element will be in the correct position each iteration.
Bubble Sort: Example
 Pass 5: 1 2 0 -1 3 3 40 4 42 43 58 65
 Pass 6: 1 0 -1 2 3 3 4 40 42 43 58 65
 Pass 7: 0 -1 1 2 3 3 4 40 42 43 58 65
 Pass 8: -1 0 1 2 3 3 4 40 42 43 58 65
Bubble Sort: Analysis
 Running time:
  Worst case: O(N²)
  Best case: O(N)
 Variant:
  bi-directional bubble sort
  original bubble sort: only works in one direction
  bi-directional bubble sort: works back and forth.
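The passes described above, including the stop-after-a-pass-with-no-exchanges check that gives the O(N) best case, can be sketched in C. This is an illustration, not code from the slides; the function and variable names are my own.

```c
#include <stdbool.h>

/* Bubble sort with the early-exit check: stop as soon as a full
 * pass makes no exchanges. */
void bubble_sort(int a[], int n) {
    for (int pass = 0; pass < n - 1; pass++) {
        bool exchanged = false;
        /* After each pass the largest remaining item has bubbled to
         * the end, so each pass compares one fewer pair. */
        for (int i = 0; i < n - 1 - pass; i++) {
            if (a[i] > a[i + 1]) {
                int tmp = a[i];
                a[i] = a[i + 1];
                a[i + 1] = tmp;
                exchanged = true;
            }
        }
        if (!exchanged) break;   /* the list is already sorted */
    }
}
```

On the worked example 9, 6, 2, 12, 11, 9, 3, 7 this performs exactly the six passes shown above.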
Selection Sort: Idea
1. We have two groups of items:
  sorted group, and
  unsorted group
2. Initially, all items are in the unsorted group. The sorted group is empty.
  We assume that items in the unsorted group are unsorted.
  We have to keep items in the sorted group sorted.
Selection Sort: Cont’d
1. Select the “best” (e.g. smallest) item from the unsorted group, then put the “best” item at the end of the sorted group.
2. Repeat the process until the unsorted group becomes empty.
Selection Sort

Legend:  Comparison   Data Movement   Sorted

5 1 3 4 6 2    Compare the items of the unsorted part; the largest is 6.
5 1 3 4 2 6    Swap the largest (6) with the last unsorted item (2); 6 is now sorted.
5 1 3 4 2 6    Compare the remaining items; the largest is 5.
2 1 3 4 5 6    Swap the largest (5) with the last unsorted item (2); 5 is now sorted.
2 1 3 4 5 6    The 4 and then the 3 are already the largest of what remains, so they stay in place.
2 1 3 4 5 6    Compare the final pair; the largest is 2, so it is swapped with the 1.
1 2 3 4 5 6    DONE!
Selection Sort: Example
40 2 1 43 3 65 0 -1 58 3 42 4

40 2 1 43 3 4 0 -1 58 3 42 65

40 2 1 43 3 4 0 -1 42 3 58 65

40 2 1 3 3 4 0 -1 42 43 58 65
Selection Sort: Example

40 2 1 3 3 4 0 -1 42 43 58 65

-1 2 1 3 3 4 0 40 42 43 58 65

-1 2 1 3 3 0 4 40 42 43 58 65

-1 2 1 0 3 3 4 40 42 43 58 65
Selection Sort: Example

-1 2 1 0 3 3 4 40 42 43 58 65

-1 0 1 2 3 3 4 40 42 43 58 65

-1 0 1 2 3 3 4 40 42 43 58 65

-1 0 1 2 3 3 4 40 42 43 58 65

-1 0 1 2 3 3 4 40 42 43 58 65
Selection Sort: Analysis
 Running time:
  Worst case: O(N²)
  Best case: O(N²)
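The pass structure traced above (find the largest item in the unsorted part, swap it to the end) can be sketched in C. This is my illustration, not code from the slides; it explains the O(N²) best case too, since every pass scans the whole unsorted part regardless of the input order.

```c
/* Selection sort: on each pass, find the largest item in the
 * unsorted part and swap it with the last unsorted item. */
void selection_sort(int a[], int n) {
    for (int end = n - 1; end > 0; end--) {
        int largest = 0;
        for (int i = 1; i <= end; i++)       /* comparisons */
            if (a[i] > a[largest]) largest = i;
        int tmp = a[largest];                /* one data movement per pass */
        a[largest] = a[end];
        a[end] = tmp;
    }
}
```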
Insertion Sort: Idea
 Idea: sorting cards.
 8 | 5 9 2 6 3
 5 8 | 9 2 6 3
 5 8 9 | 2 6 3
 2 5 8 9 | 6 3
 2 5 6 8 9 | 3
 2 3 5 6 8 9 |
Insertion Sort: Idea
1. We have two groups of items:
  sorted group, and
  unsorted group
2. Initially, all items are in the unsorted group and the sorted group is empty.
  We assume that items in the unsorted group are unsorted.
  We have to keep items in the sorted group sorted.
3. Pick any item from the unsorted group, then insert the item at the right position in the sorted group to maintain the sorted property.
4. Repeat the process until the unsorted group becomes empty.
Insertion Sort: Example

40 2 1 43 3 65 0 -1 58 3 42 4

2 40 1 43 3 65 0 -1 58 3 42 4

1 2 40 43 3 65 0 -1 58 3 42 4
Insertion Sort: Example

1 2 40 43 3 65 0 -1 58 3 42 4

1 2 3 40 43 65 0 -1 58 3 42 4

1 2 3 40 43 65 0 -1 58 3 42 4
Insertion Sort: Example
 1 2 3 40 43 65 0 -1 58 3 42 4
 0 1 2 3 40 43 65 -1 58 3 42 4
 -1 0 1 2 3 40 43 65 58 3 42 4
Insertion Sort: Example
 -1 0 1 2 3 40 43 58 65 3 42 4
 -1 0 1 2 3 3 40 43 58 65 42 4
 -1 0 1 2 3 3 40 42 43 58 65 4
 -1 0 1 2 3 3 4 40 42 43 58 65
Insertion Sort: Analysis
 Running time analysis:
  Worst case: O(N²)
  Best case: O(N)
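The card-sorting idea above, taking the next unsorted item and sliding it left into place, can be sketched in C. This is an illustration with names of my own choosing, not code from the slides; the O(N) best case comes from the inner while loop doing no work on an already sorted list.

```c
/* Insertion sort: grow a sorted prefix by inserting each new item
 * at the right position within it. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int item = a[i];               /* next item from the unsorted group */
        int j = i - 1;
        while (j >= 0 && a[j] > item) {
            a[j + 1] = a[j];           /* shift larger sorted items right */
            j--;
        }
        a[j + 1] = item;               /* insert at the right position */
    }
}
```

On the card example 8 | 5 9 2 6 3 this produces exactly the intermediate states shown earlier.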
A Lower Bound
 Bubble Sort, Selection Sort, and Insertion Sort all have a worst case of O(N²).
 It turns out that for any algorithm that exchanges only adjacent items, this is the best possible worst case: Ω(N²).
 In other words, this is a lower bound!
Mergesort
Mergesort (divide-and-conquer)
 Divide array into two halves.

A L G O R I T H M S

A L G O R I T H M S divide
Mergesort
Mergesort (divide-and-conquer)
 Divide array into two halves.
 Recursively sort each half.

A L G O R I T H M S

A L G O R I T H M S divide

A G L O R H I M S T sort
Mergesort
Mergesort (divide-and-conquer)
 Divide array into two halves.
 Recursively sort each half.

 Merge two halves to make sorted whole.

A L G O R I T H M S

A L G O R I T H M S divide

A G L O R H I M S T sort

A G H I L M O R S T merge
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A auxiliary array
Merging
Merge.

 Keep track of smallest element in each sorted half.


 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H I auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H I L auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H I L M auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H I L M O auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.

smallest smallest

A G L O R H I M S T

A G H I L M O R auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.
first half
exhausted smallest

A G L O R H I M S T

A G H I L M O R S auxiliary array
Merging
Merge.
 Keep track of smallest element in each sorted half.
 Insert smallest of two elements into auxiliary array.
 Repeat until done.
first half
exhausted smallest

A G L O R H I M S T

A G H I L M O R S T auxiliary array
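The divide / sort / merge steps illustrated above can be sketched in C. This is my sketch, not code from the slides; the merge step mirrors the frames exactly: take the smaller front element of the two sorted halves until one half is exhausted.

```c
#include <string.h>

/* Merge two sorted halves a[lo..mid] and a[mid+1..hi] through an
 * auxiliary array. */
static void merge(int a[], int aux[], int lo, int mid, int hi) {
    int i = lo, j = mid + 1;
    for (int k = lo; k <= hi; k++) {
        if (i > mid)          aux[k] = a[j++];  /* first half exhausted */
        else if (j > hi)      aux[k] = a[i++];  /* second half exhausted */
        else if (a[j] < a[i]) aux[k] = a[j++];  /* smallest of the two */
        else                  aux[k] = a[i++];
    }
    memcpy(a + lo, aux + lo, (hi - lo + 1) * sizeof(int));
}

static void merge_sort_range(int a[], int aux[], int lo, int hi) {
    if (hi <= lo) return;
    int mid = lo + (hi - lo) / 2;
    merge_sort_range(a, aux, lo, mid);      /* recursively sort each half */
    merge_sort_range(a, aux, mid + 1, hi);
    merge(a, aux, lo, mid, hi);             /* merge into a sorted whole */
}

void merge_sort(int a[], int n) {
    int aux[n];                             /* auxiliary array (C99 VLA) */
    merge_sort_range(a, aux, 0, n - 1);
}
```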
Notes on Quicksort
 Quicksort is more widely used than any other sort.
 Quicksort is well-studied, not difficult to implement, works well on a variety of data, and consumes fewer resources than other sorts in nearly all situations.
 Quicksort is O(n log n) time, and O(log n) additional space due to recursion.
Quicksort Algorithm
 Quicksort is a divide-and-conquer method for
sorting. It works by partitioning an array
into parts, then sorting each part
independently.
 The crux of the problem is how to partition
the array such that the following conditions
are true:
 There is some element, a[i], where a[i] is
in its final position.
 For all l < i, a[l] < a[i].
 For all i < r, a[i] < a[r].
Quicksort Algorithm (cont)
 As is typical with a recursive program, once you figure
out how to divide your problem into smaller
subproblems, the implementation is amazingly simple.
int partition(Item a[], int l, int r);

void quicksort(Item a[], int l, int r)
{
    int i;
    if (r <= l) return;
    i = partition(a, l, r);
    quicksort(a, l, i-1);
    quicksort(a, i+1, r);
}
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
 repeat until pointers cross

Q U I C K S O R T I S C O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
swap me  repeat until pointers cross

Q U I C K S O R T I S C O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
swap me  repeat until pointers cross

Q U I C K S O R T I S C O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
swap me  repeat until pointers cross

Q U I C K S O R T I S C O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
swap me  repeat until pointers cross swap me

Q U I C K S O R T I S C O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
 repeat until pointers cross

C U I C K S O R T I S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
choose partition element to be rightmost element

 scan from left for larger element

 scan from right for smaller element

 exchange

 repeat until pointers cross


swap me

C U I C K S O R T I S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
choose partition element to be rightmost element

 scan from left for larger element

 scan from right for smaller element

 exchange

 repeat until pointers cross


swap me

C U I C K S O R T I S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
choose partition element to be rightmost element

 scan from left for larger element

 scan from right for smaller element

 exchange

 repeat until pointers cross


swap me swap me

C U I C K S O R T I S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
 repeat until pointers cross

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
 repeat until pointers cross

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 exchange
 repeat until pointers cross

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross
swap me

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross
swap me

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross
swap me

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross
swap me

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
  choose partition element to be rightmost element
  scan from left for larger element
  scan from right for smaller element
  exchange and repeat until pointers cross
  when the pointers cross, swap with the partitioning element

C I I C K S O R T U S Q O O L

partition element unpartitioned left

partitioned right
Partitioning in Quicksort
 How do we partition the array efficiently?
 choose partition element to be rightmost element
 scan from left for larger element
 scan from right for smaller element
 Exchange and repeat until pointers cross
partition is
complete

C I I C K L O R T U S Q O O S

partition element unpartitioned left

partitioned right
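One possible implementation of the partition() prototype from the earlier slide, following the scans just described (Item is taken to be int purely for illustration; this is a sketch, not the slides' own code):

```c
typedef int Item;    /* assumed element type for this sketch */

static void swap_items(Item a[], int i, int j) {
    Item t = a[i]; a[i] = a[j]; a[j] = t;
}

int partition(Item a[], int l, int r) {
    Item v = a[r];        /* rightmost element is the partition element */
    int i = l - 1, j = r;
    for (;;) {
        while (a[++i] < v) ;                   /* scan from left for larger */
        while (v < a[--j]) if (j == l) break;  /* scan from right for smaller */
        if (i >= j) break;                     /* pointers crossed */
        swap_items(a, i, j);                   /* exchange and repeat */
    }
    swap_items(a, i, r);  /* swap with the partitioning element */
    return i;             /* a[i] is now in its final position */
}

void quicksort(Item a[], int l, int r) {
    if (r <= l) return;
    int i = partition(a, l, r);
    quicksort(a, l, i - 1);
    quicksort(a, i + 1, r);
}
```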
Quicksort Demo
 Quicksort illustrates the operation of the
basic algorithm. When the array is
partitioned, one element is in place on the
diagonal, the left subarray has its upper
corner at that element, and the right
subarray has its lower corner at that
element. The original file is divided into two
smaller parts that are sorted independently.
The left subarray is always sorted first, so the
sorted result emerges as a line of black dots
moving right and up the diagonal.
Why study Heapsort?
 It is a well-known, traditional sorting algorithm you will be expected to know
 Heapsort is always O(n log n)
 Quicksort is usually O(n log n) but in the worst case slows to O(n²)
 Quicksort is generally faster, but Heapsort is better in time-critical applications
What is a “heap”?
 Definitions of heap:
1. A large area of memory from which
the programmer can allocate blocks as
needed, and deallocate them (or allow
them to be garbage collected) when
no longer needed
2. A balanced, left-justified binary tree in
which no node has a value greater
than the value in its parent
 Heapsort uses the second definition
Balanced binary trees
 Recall:
 The depth of a node is its distance from the root
 The depth of a tree is the depth of the deepest node
 A binary tree of depth n is balanced if all the nodes at depths 0 through
n-2 have two children

n-2
n-1
n
Balanced Balanced Not balanced
Left-justified binary trees
 A balanced binary tree is left-justified if:
 all the leaves are at the same depth, or
 all the leaves at depth n+1 are to the left of
all the nodes at depth n

Left-justified Not left-justified


The heap property
 A node has the heap property if the
value in the node is as large as or
larger than the values in its children
12 12 12

8 3 8 12 8 14
Blue node has Blue node has Blue node does not
heap property heap property have heap property
 All leaf nodes automatically have the heap
property
 A binary tree is a heap if all nodes in it have the
heap property
siftUp
 Given a node that does not have the heap
property, you can give it the heap property
by exchanging its value with the value of the
larger child
12                    14
8  14                 8  12
Blue node does not    Blue node has
have heap property    heap property
 This is sometimes called sifting up
 Notice that the child may have lost the heap
property
Constructing a heap I
 A tree consisting of a single node is automatically a heap
 We construct a heap by adding nodes one at a time:
 Add the node just to the right of the rightmost node in the
deepest level
 If the deepest level is full, start a new level
 Examples:

Add a new Add a new


node here node here
Constructing a heap II
 Each time we add a node, we may destroy the heap property of its
parent node
 To fix this, we sift up
 But each time we sift up, the value of the topmost node in the sift may
increase, and this may destroy the heap property of its parent node
 We repeat the sifting up process, moving up in the tree, until either
 We reach nodes whose values don’t need to be swapped (because
the parent is still larger than both children), or
 We reach the root
Constructing a heap III
8 8 10 10

10 8 8 5

1 2 3

10 10 12

8 5 12 5 10 5

12 8 8
4
Other children are not affected
12 12 14

10 5 14 5 12 5

8 14 8 10 8 10

 The node containing 8 is not affected because its parent gets


larger, not smaller
 The node containing 5 is not affected because its parent gets
larger, not smaller
 The node containing 8 is still not affected because, although its
parent got smaller, its parent is still greater than it was originally
A sample heap
 Here’s a sample binary tree after it has been
heapified
25

22 17

19 22 14 15

18 14 21 3 9 11

 Notice that heapified does not mean sorted


 Heapifying does not change the shape of the
binary tree; this binary tree is balanced and
left-justified because it started out that way
Removing the root
 Notice that the largest number is now in the root
 Suppose we discard the root:
11

22 17

19 22 14 15

18 14 21 3 9 11

 How can we fix the binary tree so it is once


again balanced and left-justified?
 Solution: remove the rightmost leaf at the
deepest level and use it for the new root
The reHeap method I
 Our tree is balanced and left-justified, but no longer a heap
 However, only the root lacks the heap property
11

22 17

19 22 14 15

18 14 21 3 9

 We can siftUp() the root


 After doing this, one and only one of its
children may have lost the heap property
The reHeap method II
 Now the left child of the root (still the
number 11) lacks the heap property
22

11 17

19 22 14 15

18 14 21 3 9

 We can siftUp() this node


 After doing this, one and only one of its
children may have lost the heap property
The reHeap method III
 Now the right child of the left child of the root
(still the number 11) lacks the heap property:
22

22 17

19 11 14 15

18 14 21 3 9

 We can siftUp() this node


 After doing this, one and only one of its children
may have lost the heap property —but it doesn’t,
because it’s a leaf
The reHeap method IV
 Our tree is once again a heap, because
every node in it has the heap property
22

22 17

19 21 14 15

18 14 11 3 9

 Once again, the largest (or a largest) value is in the root


 We can repeat this process until the tree becomes empty
 This produces a sequence of values in order largest to
smallest
Sorting
 What do heaps have to do with sorting an
array?
 Here’s the neat part:
 Because the binary tree is balanced and left
justified, it can be represented as an array
 All our operations on binary trees can be
represented as operations on arrays
 To sort:
heapify the array;
while the array isn’t empty {
remove and replace the root;
reheap the new root node;
}
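A sketch of this loop in C, using the array representation of a heap (children of index i at 2*i+1 and 2*i+2). One assumption to flag: this sketch builds the heap by sifting nodes down from the middle outward, a common alternative to the add-one-node-and-sift-up construction described earlier; the resulting sort is the same.

```c
/* Restore the heap property at index i of an array-based heap of
 * size n (the "reheap" operation). */
static void sift_down(int a[], int n, int i) {
    for (;;) {
        int largest = i;
        int left = 2 * i + 1, right = 2 * i + 2;
        if (left  < n && a[left]  > a[largest]) largest = left;
        if (right < n && a[right] > a[largest]) largest = right;
        if (largest == i) return;   /* heap property restored */
        int t = a[i]; a[i] = a[largest]; a[largest] = t;
        i = largest;                /* continue down the affected child */
    }
}

void heap_sort(int a[], int n) {
    /* heapify: fix every non-leaf node, bottom up */
    for (int i = n / 2 - 1; i >= 0; i--)
        sift_down(a, n, i);
    /* repeatedly swap the root (largest) with the last element,
     * shrink the heap, and reheap the new root */
    for (int last = n - 1; last > 0; last--) {
        int t = a[0]; a[0] = a[last]; a[last] = t;
        sift_down(a, last, 0);
    }
}
```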
Mapping into an array
25
22 17
19 22 14 15
18 14 21 3 9 11

0  1  2  3  4  5  6  7  8  9  10 11 12
25 22 17 19 22 14 15 18 14 21 3  9  11
 Notice:
 The left child of index i is at index 2*i+1
 The right child of index i is at index 2*i+2
 Example: the children of node 3 (19) are 7 (18) and
8 (14)
Removing and replacing the
root
 The “root” is the first element in the array
 The “rightmost node at the deepest level” is the last
element
 Swap them...
0 1 2 3 4 5 6 7 8 9 10 11 12
25 22 17 19 22 14 15 18 14 21 3 9 11

0 1 2 3 4 5 6 7 8 9 10 11 12
11 22 17 19 22 14 15 18 14 21 3 9 25
 ...And pretend that the last element in the array no longer exists; that is, the “last index” is now 11 (which contains 9)
Reheap and repeat
 Reheap the root node (index 0, containing 11)...
0  1  2  3  4  5  6  7  8  9  10 11 12
11 22 17 19 22 14 15 18 14 21 3  9  25

0  1  2  3  4  5  6  7  8  9  10 11 12
22 22 17 19 21 14 15 18 14 11 3  9  25

0  1  2  3  4  5  6  7  8  9  10 11 12
9  22 17 19 21 14 15 18 14 11 3  22 25
 ...And again, remove and replace the root node
 Remember, though, that the “last” array index is changed
 Repeat until the last becomes first, and the array is sorted!
Analysis I
 Here’s how the algorithm starts:
heapify the array;
 Heapifying the array: we add each of
n nodes
 Each node has to be sifted up, possibly
as far as the root
 Since the binary tree is perfectly balanced,
sifting up a single node takes O(log n) time
 Since we do this n times, heapifying
takes n*O(log n) time, that is, O(n log n)
time
Analysis II
 Here’s the rest of the algorithm:
while the array isn’t empty {
remove and replace the root;
reheap the new root node;
}
 We do the while loop n times (actually, n-1
times), because we remove one of the n
nodes each time
 Removing and replacing the root takes
O(1) time
 Therefore, the total time is n times
however long it takes the reheap method
Analysis III
 To reheap the root node, we have to follow
one path from the root to a leaf node (and
we might stop before we reach a leaf)
 The binary tree is perfectly balanced
 Therefore, this path is O(log n) long
 And we only do O(1) operations at each node
 Therefore, reheaping takes O(log n) time
 Since we reheap inside a while loop that we
do n times, the total time for the while loop
is n*O(log n), or O(n log n)
Analysis IV
 Here’s the algorithm again:
heapify the array;
while the array isn’t empty {
remove and replace the root;
reheap the new root node;
}
 We have seen that heapifying takes O(n log
n) time
 The while loop takes O(n log n) time
 The total time is therefore O(n log n) + O(n
log n)
 This is the same as O(n log n) time
Shell Sort: Idea
Donald Shell (1959): Exchange items that are far apart!

Original:
40 2 1 43 3 65 0 -1 58 3 42 4

5-sort: sort items that are 5 elements apart:
40 2 1 43 3 65 0 -1 58 3 42 4
Shell Sort: Example
Original:
40 2 1 43 3 65 0 -1 58 3 42 4

After 5-sort:
40 0 -1 43 3 42 2 1 58 3 65 4

After 3-sort:
2 0 -1 3 1 4 40 3 42 43 65 58

After 1-sort:
-1 0 1 2 3 3 4 40 42 43 58 65
Shell Sort: Gap Values
 Gap: the distance between items being
sorted.
 As we progress, the gap decreases.
Shell Sort is also called Diminishing
Gap Sort.
 Shell proposed starting gap of N/2,
halving at each step.
 There are many ways of choosing the
next gap.
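Shell's original gap sequence (start at N/2, halve at each step) can be sketched in C as a gap-insertion sort. This is an illustration with my own names, not the slides' code; other gap sequences plug in by changing only the outer loop.

```c
/* Shell sort with Shell's proposed gaps: N/2, halving each step.
 * The last iteration (gap == 1) is a plain insertion sort, but by
 * then the items are nearly in order. */
void shell_sort(int a[], int n) {
    for (int gap = n / 2; gap > 0; gap /= 2) {
        /* gap-insertion sort: insert each item among the items
         * that are `gap` positions apart */
        for (int i = gap; i < n; i++) {
            int item = a[i];
            int j = i - gap;
            while (j >= 0 && a[j] > item) {
                a[j + gap] = a[j];
                j -= gap;
            }
            a[j + gap] = item;
        }
    }
}
```

Note that with n = 12 this starts with a 6-sort rather than the 5-sort of the worked example; the example simply chose a different starting gap.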
Shell Sort: Analysis
N       Insertion Sort   Shellsort (Shell's gaps)   Odd Gaps Only   Dividing by 2.2
1000    122              11                         11              9
2000    483              26                         21              23
4000    1936             61                         59              54
8000    7950             153                        141             114
16000   32560            358                        322             269
32000   131911           869                        752             575
64000   520000           2091                       1705            1249

O(N^(3/2))? O(N^(5/4))? O(N^(7/6))?

 So we have 3 nested loops, but Shell Sort is still better than Insertion Sort! Why?
Generic Sort
 So far we have methods to sort integers. What
about Strings? Employees? Cookies?
 A new method for each class? No!
 In order to be sorted, objects should be comparable
(less than, equal, greater than).
 Solution:
 use an interface that has a method to compare two
objects.
 Remember: A class that implements an interface
inherits the interface (method definitions) =
interface inheritance, not implementation
inheritance.
Other kinds of sort
 Heap sort. We will discuss this after trees.
 Postman sort / Radix Sort.
 etc.