Analysis of Algorithms
Lecture 4
Instructor: Dr. Husnain Ashfaq
Lecture Contents
Algorithm Efficiency
Measurement of Efficiency
Time & Space Complexity
Time / Space Tradeoff
Tips for Execution-Friendly Development
Execution Time Functions
Execution Time Comparison for Different Time Functions
Efficiency
“Algorithm efficiency is the quality of an algorithm that describes the computational resources required to run it”
Computational resources include:
Execution time
Space occupancy
Memory requirements
[Chart: T(n) vs n]
Algorithm Efficiency
The fewer the resources utilized by an algorithm, the more efficient it is
Time & space often go in opposite directions
The priority of time and/or space can dictate the choice of algorithm
In today’s world, time is mostly given higher priority than space

n    | 2 | 4  | 7  | 11 | 16 | 29  | 60
T(n) | 1 | 10 | 21 | 48 | 82 | 178 | 290

[Chart: T(n) vs n for the table above]
Measuring Algorithm Efficiency
Two different ways:
1. Measure & compare execution time
2. Measure & compare running time (asymptotic analysis)
Note: Running time mostly depends upon input size (n) and is independent of the machine it runs on
Measuring Algorithm Efficiency … Measuring Execution Time
Dependent upon hardware configuration
Dependent upon I/O operations
Dependent upon memory in the system
Dependent upon other applications installed on the system
May differ for the same machine at different times
Does not really help to predict the effect on execution time when input size is significantly increased / decreased
May differ greatly on parallel infrastructure
Involves function call overhead
Measuring Algorithm Efficiency … Measuring Execution Time
Multiple threads trying to access a common resource can increase execution time significantly
If the program is running on a server with multiple disks, some particular RAID configuration might work best for it
Choice of language can increase / decrease the execution time, e.g. C is generally faster than Java
Time & Space Complexity
Time Complexity
Time required to execute an algorithm
Space Complexity
Total memory taken by an algorithm during its execution
Time & Space Tradeoffs
Time & Space Tradeoffs … Example
Application data may be stored in arrays, linked lists, trees, graphs, etc.
For banking / financial transactions, time may be compromised (a bit) but accuracy is a must
For audio/video stream based problems, accuracy requirements may be compromised, preferring solutions with low execution time
We want Google to respond to our queries promptly, while a few irrelevant links may be ignored
Types of Measurement
Worst case: Big-O notation (O)
Average case: Theta notation (Θ)
Best case: Omega notation (Ω)
Best, Average & Worst Cases
For the same algorithm, not all inputs take the same time to execute
There are input values for which execution time is least (best cases).
Examples:
In case of sorting, the data is already sorted
You are looking for a key (linear search) and the first element is your required key
You are looking for a key (binary search) and the middle value is your required key
Best, Average & Worst Cases
There are input values for which execution time is maximum (worst cases).
Examples:
We want to sort data in ascending order while it is already in descending order
You are looking for a key (linear search) and it is not present in the array
You are looking for a key (binary search) and it is not present in the array
Best, Average & Worst Cases
The average case applies when no prediction is possible about the data
A data value can exist anywhere in the available list of data
It is more difficult to measure in comparison to the best case or worst case
Example:
Taking n random values and trying to sort them
Asymptotic Analysis
Independent of hardware, platform & software
Expresses complexity of an algorithm in terms of a known function related to input size
Analysis describes the growth of running time with reference to input size
e.g. when n grows, then T(n) will grow on the order of n log n
Notation for this expression is: T(n) = O(f(n))
Asymptotic Analysis … Constant Growth O(c)
No growth at all
The runtime does not grow at all as a function of n (constant)
Basically, it is any operation that does not depend on the value of n to do its job
Has the slowest growth pattern (none!)
Examples:
1. Accessing an element of an array
2. Accessing the maximum value of a MAX-HEAP
3. Accessing the header node of a linked list
4. Accessing the root node of a tree
5. Hashing
Asymptotic Analysis … Logarithmic Growth O(log n)
Logarithmic Growth
The runtime growth is proportional to the base-2 logarithm (log) of n
Examples:
1. Binary Search
2. Max/Min value from a complete binary tree
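Binary search is the canonical O(log n) example; a minimal C++ sketch (our implementation, not from the slides) shows why: each iteration halves the remaining range, so at most about log2(n) iterations run.

```cpp
#include <vector>

// Iterative binary search on a sorted vector.
// Each iteration halves the remaining range -- O(log n).
int binary_search_index(const std::vector<int>& a, int key) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // avoids integer overflow
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;  // key not present
}
```

For n = 1,000,000 elements this needs at most about 20 comparisons, versus up to 1,000,000 for a linear scan.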
Asymptotic Analysis … Linear Growth O(n)
Linear Growth
Runtime grows proportional to the value of n
Examples:
1. Linear Search
2. Max/Min value from an array
3. Sum of values from an array
4. Linked list traversal
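Two of the examples above as a C++ sketch (our function names): both visit every element exactly once, so the work grows in direct proportion to n.

```cpp
#include <vector>

// Linear search: up to n comparisons -- O(n).
int linear_search(const std::vector<int>& a, int key) {
    for (int i = 0; i < static_cast<int>(a.size()); ++i)
        if (a[i] == key) return i;
    return -1;  // key not present
}

// Sum of values: exactly n additions -- O(n).
long long sum_of_values(const std::vector<int>& a) {
    long long s = 0;
    for (int x : a) s += x;  // one addition per element
    return s;
}
```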
Asymptotic Analysis … O(n log n)
(n log n) Growth
Comparison-based sorting algorithms built on the divide and conquer approach run in O(n log n); this is also the lower bound for any sort that compares elements
Examples:
1. Merge Sort
2. Quick Sort (average case)
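Merge sort illustrates the divide and conquer pattern behind the n log n bound: the input is split into halves (about log n levels of recursion), and each level does a linear amount of merging work. A compact C++ sketch (ours, not from the slides):

```cpp
#include <vector>

// Merge two sorted halves a[lo..mid] and a[mid+1..hi] in O(hi-lo) time.
static void merge(std::vector<int>& a, int lo, int mid, int hi) {
    std::vector<int> tmp;
    tmp.reserve(hi - lo + 1);
    int i = lo, j = mid + 1;
    while (i <= mid && j <= hi)
        tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= mid) tmp.push_back(a[i++]);
    while (j <= hi)  tmp.push_back(a[j++]);
    for (int k = 0; k < static_cast<int>(tmp.size()); ++k)
        a[lo + k] = tmp[k];
}

// log n levels of splitting, O(n) merging per level -- O(n log n).
void merge_sort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;  // 0 or 1 elements: already sorted
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, lo, mid);
    merge_sort(a, mid + 1, hi);
    merge(a, lo, mid, hi);
}
```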
Asymptotic Analysis … O(n^2)
(n^2) Growth
Running time grows rapidly
Slow sorting algorithms
Examples:
1. Bubble Sort
2. Insertion Sort
3. Selection Sort
4. Quick Sort (worst case)
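Bubble sort shows where the n^2 comes from: a sketch in C++ (ours) with two nested loops that perform roughly n·(n−1)/2 comparisons in total.

```cpp
#include <utility>
#include <vector>

// Bubble sort: nested loops give roughly n*(n-1)/2
// comparisons -- O(n^2).
void bubble_sort(std::vector<int>& a) {
    int n = static_cast<int>(a.size());
    for (int pass = 0; pass < n - 1; ++pass)
        for (int i = 0; i < n - 1 - pass; ++i)
            if (a[i] > a[i + 1])
                std::swap(a[i], a[i + 1]);  // bubble larger value right
}
```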
Asymptotic Analysis … Polynomial Growth O(n^k)
(n^k) Growth
Running time grows rapidly
Suitable for small n
Examples:
1. Matrix multiplication
2. Maximum matching for bipartite graphs
3. Multiplying n-digit numbers by the simple algorithm
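The first example, schoolbook matrix multiplication, is a clear O(n^3) case: three nested loops over n give n^3 multiply-add steps. A minimal C++ sketch (ours):

```cpp
#include <vector>

using Matrix = std::vector<std::vector<long long>>;

// Schoolbook n x n matrix multiplication: three nested loops
// over n give n^3 multiply-add steps -- O(n^3).
Matrix multiply(const Matrix& A, const Matrix& B) {
    int n = static_cast<int>(A.size());
    Matrix C(n, std::vector<long long>(n, 0));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            for (int k = 0; k < n; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}
```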
Asymptotic Analysis … Exponential Growth O(2^n)
(2^n) Growth
Running time grows extremely rapidly
Suitable only for very small n
Examples:
1. Exact solution for the travelling salesman problem
2. Brute force search problems
Asymptotic Analysis … Graph for n (1-15)
[Chart: log(n) vs n]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n vs n log(n)]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n log(n) vs n^2 vs n^3]
Asymptotic Analysis … Graph for n (1-15)
[Chart: n^3 vs 2^n]
Asymptotic Analysis … Graph for n (1-100)
[Chart: log(n) vs n vs n log(n) vs n^2 vs n^3 vs 2^n]
Execution Time Functions (Running Time)
Functions:
Constant
log n
n
n log n
n^2
n^3
2^n
n^n
Running time functions are listed above in ascending order of growth
Execution Time Functions …

Execution Time Function | Example Problems
n^2      | Bubble Sort, Insertion Sort, Selection Sort
n log(n) | Merge Sort, Quick Sort, Heap Sort, Huffman Encoding
n + k    | Bucket Sort
n·k      | Radix Sort
n^3      | Matrix Multiplication
log n    | Binary Search (sorted data)
n        | Linear Search
How to Calculate Time Complexity (in Big O Notation)
Examples
int x = 15 - (35 / 2);   // a single straight-line statement
Independent of n, the time to execute this statement is constant. If you vary the value of the input size (n), this will not affect this single statement.
When we have constant time, its complexity is big O of 1, or order of 1: O(1)
x = 15 - (35 / 2);   // O(1)
cout << x;           // O(1)
y = 5 * 9;           // O(1)
cout << y;           // O(1)
Total = O(1) + O(1) + O(1) + O(1) = 4·O(1)
Apply rule 2: ignore constants; we are left with O(1), meaning that executing all these statements takes O(1) time in total.
Order of 1 is also termed constant time.
for (i = 0; i < n; i++) {
    cout << i;   // O(1)
}
We need to find how many times the cout statement will be executed.
Let us assume n = 5: the body runs 5 times; in general, it runs n times.
Total time = n · O(1) = k·n (here k is a constant)
Apply rule 2: ignore the constant; we are left with n, so the complexity is O(n).
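The counting argument can be checked directly: a C++ sketch (our helper, not from the slides) that runs the loop from the slide and counts how many times its body executes.

```cpp
#include <iostream>

// Count how many times the loop body runs for a given n:
// exactly n times, so total cost is n * O(1) = O(n).
int loop_body_count(int n) {
    int count = 0;
    for (int i = 0; i < n; ++i) {
        std::cout << i << " ";  // the O(1) body from the slide
        ++count;
    }
    std::cout << "\n";
    return count;
}
```

Calling `loop_body_count(5)` prints the five indices and returns 5, matching the n = 5 assumption above.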
cout << "Enter the value of n";   // O(1)
cin >> n;                         // O(1)
for (i = 0; i < n; i++) {
    cout << i;                    // O(1), executed n times: k·n
}
Total = O(1) + O(1) + k·n = 2·O(1) + k·n
Apply the rules: what will the answer be?
Nested for loop
y = 4 * 5;                 // O(1)
for (i = 1; i <= n; i++) {
    cout << i;             // executes n times
}
for (x = 1; x <= n; x++) {
    for (y = 1; y <= n; y++)
        cout << x * y;     // executes n * n times
}
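Counting the body executions of this fragment confirms the analysis: the single loop contributes n, the nested pair contributes n·n, and the n^2 term dominates, so the whole fragment is O(n^2). A C++ sketch with our own counter variables:

```cpp
#include <iostream>

// Count body executions: single loop runs n times (O(n)),
// nested pair runs n * n times (O(n^2)); n^2 dominates.
int nested_body_count(int n) {
    int single = 0, nested = 0;
    for (int i = 1; i <= n; ++i)
        ++single;                    // O(n)
    for (int x = 1; x <= n; ++x)
        for (int y = 1; y <= n; ++y)
            ++nested;                // O(n^2)
    std::cout << single << " + " << nested << "\n";
    return single + nested;
}
```

For n = 4 this counts 4 + 16 = 20 body executions; doubling n roughly quadruples the dominant term.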
If-Else Statements
if (x > 10) {
    cout << "large";
} else {
    cout << "small";
}
For conditional statements, the complexity is calculated from the branch with the higher complexity. Here both branches are O(1), so the whole statement is O(1).
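The same conditional as a small C++ function (ours): the cost is the test plus whichever branch actually runs, and since both branches here do constant work, the whole statement is O(1) regardless of which branch is taken.

```cpp
// Conditional cost = test + the more expensive branch.
// Both branches do constant work, so this is O(1).
const char* classify(int x) {
    if (x > 10) return "large";  // O(1) branch
    else        return "small";  // O(1) branch
}
```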
Practice Problems
Suppose n = 5