GTU – Computer Engineering Department
DATA STRUCTURES AND ALGORITHMS
Asymptotic Notations
The Process of Algorithm Development
▪ Design
– divide & conquer, greedy, dynamic programming
▪ Validation
– check whether it is correct
▪ Analysis
– determine the properties of the algorithm
▪ Implementation
▪ Testing
– check whether it works for all possible cases
Analysis of Algorithm
▪ Analysis investigates
– What are the properties of the algorithm?
• in terms of time and space
Properties of an Algorithm
▪ Used memory space
– Number of bits
– Number of elements
▪ Running time
– Wall-clock time in seconds
– Number of operations
– Number of executions of the most important operation
• basic operation
▪ They are investigated as a function T(n) of a
parameter n indicating the problem's size
Properties of an Algorithm
▪ The properties can be calculated
– Empirically → after implementation
– Theoretically → before implementation
▪ If there are many algorithm ideas, it is better
to evaluate them without implementing each one
– Exact calculation is not easy
Table Method
▪ The table method is used to calculate the complexity of an algorithm
▪ Example :
Adding elements of an array
Statement                 steps/exec   freq    total
sum = 0                       1          1       1
for i = 1 to n                2         n+1     2n+2
sum = sum + a[i]              2          n      2n
report sum                    1          1       1
Total                         6                 4n+4
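A minimal C version of the routine being counted (the array, its values, and the function name are illustrative), with the step counts from the table as comments:

#include <stdio.h>

int sum_array(const int a[], int n)
{
    int sum = 0;                      /* 1 step,  executed once      */
    for (int i = 0; i < n; i++)       /* 2 steps, executed n+1 times */
        sum = sum + a[i];             /* 2 steps, executed n times   */
    return sum;                       /* 1 step,  executed once      */
}

int main(void)
{
    int a[] = {3, 1, 4, 1, 5};
    printf("%d\n", sum_array(a, 5));  /* prints 14 */
    return 0;
}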
Table Method
▪ Example :
Matrix addition
Assume a, b, and c are m×n matrices

Statement                     steps/exec    freq      total
for i = 1 to m                    1          m+1       m+1
for j = 1 to n                    1         m(n+1)    mn+m
c[i,j] = a[i,j] + b[i,j]          1          mn        mn
Total                                                 2mn+2m+1
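A C sketch of the same computation (C99 variable-length array parameters; names are illustrative), annotated with the frequencies from the table:

void matrix_add(int m, int n, int a[m][n], int b[m][n], int c[m][n])
{
    for (int i = 0; i < m; i++)           /* executed m+1 times    */
        for (int j = 0; j < n; j++)       /* executed m(n+1) times */
            c[i][j] = a[i][j] + b[i][j];  /* executed mn times     */
}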
Analysis of Algorithm
▪ Analysis investigates
– What are the properties of the algorithm?
• in terms of time and space
– How good is the algorithm ?
• whether satisfies our needs or not
– How it compares with others?
• not always exact
– Is it the best that can be done?
• difficult !
Comparison of Algorithms
▪ Assume the running times of two algorithms are
calculated:
For input size N
Running time of Algorithm A = TA(N) = 1000 N
Running time of Algorithm B = TB(N) = N²
Which one is faster ?
Comparison of Algorithms
If the unit of running time of algorithms A and B is µsec
N          TA            TB
10         10^-2 sec     10^-4 sec
100        10^-1 sec     10^-2 sec
1000       1 sec         1 sec
10000      10 sec        100 sec
100000     100 sec       10000 sec
So which algorithm is faster ?
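The table can be reproduced by evaluating both formulas directly; a small C sketch (times are in µsec, converted to seconds as assumed above):

#include <stdio.h>

int main(void)
{
    double sizes[] = {10, 100, 1000, 10000, 100000};
    printf("%10s %14s %14s\n", "N", "TA (sec)", "TB (sec)");
    for (int i = 0; i < 5; i++) {
        double n = sizes[i];
        /* TA(N) = 1000*N usec, TB(N) = N*N usec */
        printf("%10.0f %14g %14g\n", n, 1000.0 * n / 1e6, n * n / 1e6);
    }
    return 0;
}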
Comparison of Algorithms
[Figure: running time T vs. input size N; the curves TA(N) = 1000N and TB(N) = N² cross at N = 1000]
If N < 1000, TA(N) > TB(N); otherwise TB(N) > TA(N).
Compare their relative growth
Properties of an Algorithm
▪ Is it always possible to calculate the exact value
in seconds?
NO !
Run-time is affected by
• compiler
• O/S → usually affects the constants & lower order terms
• computer
• algorithm
→ use asymptotic notation
Compare their relative growth
Comparison of Algorithms
▪ Is it always possible to have definite results?
NO !
The running times of algorithms can change
because of the platform, the properties of the
computer, etc.
We use asymptotic notations (O, Ω, θ, o)
• compare relative growth
– No constants
– No lower order terms
• compare the algorithms themselves, not particular implementations
Asymptotic Notations
▪ Used to express running time and memory
space used.
For input size N
Running time of Alg. A = TA(N) = 1000 N = O(N)
Running time of Alg. B = TB(N) = 7N² + N = O(N²)
Big Oh Notation (O)
Provides an “upper bound” for the function f
▪ Definition :
T(N) = O (f(N)) if there are positive constants c
and n0 such that
T(N) ≤ cf(N) when N ≥ n0
– T(N) grows no faster than f(N)
– growth rate of T(N) is less than or equal to growth rate
of f(N) for large N
– f(N) is an upper bound on T(N)
• not fully correct !
Big Oh Notation (O)
▪ Analysis of Algorithm A
TA(N) = 1000 N = O(N)
1000 N ≤ cN for all N ≥ n0
e.g., c = 2000 and n0 = 1 works for all N ≥ 1
so TA(N) = 1000 N = O(N) holds
Examples
▪ 7n + 5 = O(n)
for c = 8 and n0 = 5:
7n + 5 ≤ 8n for all n ≥ 5 = n0
▪ 7n + 5 = O(n²)
for c = 7 and n0 = 2:
7n + 5 ≤ 7n² for all n ≥ n0
▪ 7n² + 3n = O(n) ?
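A quick check of the last question: 7n² + 3n ≠ O(n), because for any constant c we have 7n² + 3n > c·n as soon as n > c/7, so no choice of c and n0 can satisfy the definition.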
Advantages of O Notation
▪ While comparing two algorithms based on
their running times
▪ Constants can be ignored.
– Units are not important
O(7n²) = O(n²)
▪ Lower order terms are ignored
– Compare relative growth only
O(n³ + 7n² + 3) = O(n³)
Big Oh Notation (O)
Running Times of Algorithm A and B
TA(N) = 1000 N = O(N)
TB(N) = N² = O(N²)
A is asymptotically faster than B !
Omega Notation (Ω)
▪ Definition :
T(N) = Ω (f(N)) if there are positive
constants c and n0 such that T(N) ≥ c f(N)
when N≥ n0
– T(N) grows no slower than f(N)
– growth rate of T(N) is greater than or equal to
growth rate of f(N) for large N
– f(N) is a lower bound on T(N)
• not fully correct !
Omega Notation
Example:
▪ n^(1/2) = Ω(lg n)
for c = 1 and n0 = 16:
c · lg n ≤ n^(1/2) for all n ≥ 16
(at n = 16 both sides equal 4, and n^(1/2) grows faster than lg n beyond that point)
Omega Notation
▪ Theorem:
f(n) = O(g(n)) <=> g(n) = Ω(f(n))
Proof:
f(n) ≤ c1 · g(n) for n ≥ n0 <=> g(n) ≥ c2 · f(n) for n ≥ n0
Divide the left-hand inequality by c1:
(1/c1) · f(n) ≤ g(n), so choosing c2 = 1/c1 (positive, since c1 > 0) satisfies the Ω definition.
The same argument in reverse gives the other direction.
Omega Notation
▪ 7n² + 3n + 5 = O(n⁴)
▪ 7n² + 3n + 5 = O(n³)
▪ 7n² + 3n + 5 = O(n²)
▪ 7n² + 3n + 5 = Ω(n²)
▪ 7n² + 3n + 5 = Ω(n)
▪ 7n² + 3n + 5 = Ω(1)
n² and 7n² + 3n + 5 grow at the same rate:
7n² + 3n + 5 = O(n²) = Ω(n²) = θ(n²)
Theta Notation (θ)
▪ Definition :
T(N) = θ (h(N)) if and only if
T(N) = O(h(N)) and T(N) = Ω(h(N))
– T(N) grows as fast as h(N)
– growth rate of T(N) and h(N) are equal for
large N
– h(N) is a tight bound on T(N)
• not fully correct !
Theta Notation
▪ 7n² + 3n + 5 = O(n⁴)
▪ 7n² + 3n + 5 = O(n³)
▪ 7n² + 3n + 5 = O(n²)
▪ 7n² + 3n + 5 = θ(n²) → best (gives the most information)
▪ 7n² + 3n + 5 = Ω(n²)
▪ 7n² + 3n + 5 = Ω(n)
▪ 7n² + 3n + 5 = Ω(1)
Little o Notation (o)
▪ Definition :
T(N) = o(p(N)) if
T(N) = O(p(N)) and T(N)≠θ(p(N))
– p(N) grows strictly faster than T(N)
– growth rate of T(N) is less than the growth rate
of p(N) for large N
– p(N) is an upperbound on T(N) (but not tight)
• not fully correct !
Little o Notation (o)
▪ Example :
T(N) = 3N²
T(N) = o(N⁴)
T(N) = o(N³)
T(N) ≠ o(N²) ← note!
T(N) = θ(N²)
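This also follows from the limit test introduced later in these slides: lim 3N²/N³ = lim 3/N = 0, so 3N² = o(N³), while lim 3N²/N² = 3 ≠ 0, so 3N² ≠ o(N²).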
Some Rules
▪ RULE 1:
if T1(N) = O(f(N)) and T2(N) = O(g(N)) then
a) T1(N) + T2(N) = max (O(f(N)), O(g(N)))
b) T1(N) * T2(N) = O(f(N) * g(N))
Can you prove these?
Is it true for θ notation ?
What about Ω notation?
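For example, if T1(N) = O(N²) and T2(N) = O(N), then T1(N) + T2(N) = O(N²) and T1(N) * T2(N) = O(N³).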
Some Rules
▪ RULE 2:
if T(N) is a polynomial of degree k
T(N) = a_k·N^k + a_(k-1)·N^(k-1) + … + a_1·N + a_0   (with a_k > 0)
then
T(N) = θ(N^k)
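For example, T(N) = 3N³ + 5N² + 7 = θ(N³).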
Some Rules
▪ RULE 3:
log^k N = o(N) for any constant k
logarithm grows very slowly !
Some Common Functions
▪ c = o(log N) => c = O(log N) but
c ≠ Ω(log N)
▪ log N = o(log² N)
▪ log² N = o(N)
▪ N = o(N log N)
▪ N = o(N²)
▪ N² = o(N³)
▪ N³ = o(2^N)
Example
▪ T(N) = 4N²
• T(N) = O(2N²)
correct but bad style
T(N) = O(N²)
drop the constants
• T(N) = O(N² + N)
correct but bad style
T(N) = O(N²)
ignore low order terms
Another Way to Compute Growth Rates
lim (N→∞) f(N)/g(N):
= 0          →  f(N) = o(g(N))
= c ≠ 0      →  f(N) = θ(g(N))
= ∞          →  g(N) = o(f(N))
oscillates   →  the test gives no relation
Example
▪ f(N) = 7N²    g(N) = N² + N
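lim (N→∞) 7N² / (N² + N) = lim 7 / (1 + 1/N) = 7 ≠ 0, so 7N² = θ(N² + N): the two functions grow at the same rate.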
Example
▪ f(N) = N log N    g(N) = N^1.5
compare log N with N^0.5
compare log² N with N
log² N = o(N)
so N log N = o(N^1.5)
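Equivalently, lim (N→∞) (N log N) / N^1.5 = lim (log N) / N^0.5 = 0, since any positive power of N eventually dominates log N.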
General Rules of Analysis
▪ RULE 1 : For Loops
The running time of a for loop is at most the running time of the
statements in the for loop times the number of iterations
Example :
int i, a = 0;
for (i = 0; i < n; i++)
{
    printf("%d\n", i);
    a = a + i;              /* constant work per iteration → T(n) = θ(n) */
}
return i;
General Rules of Analysis
▪ RULE 2 : Nested Loops
Analyze nested loops inside out
Example :
int k = 0;
for (int i = 1; i <= q; i++)        /* outer loop: θ(q) iterations         */
{
    for (int j = 1; j <= r; j++)    /* inner loop: θ(r) work per iteration */
        k++;
}
T(n) = θ(r·q)
General Rules of Analysis
▪ RULE 3 : Consecutive Statements
Add the running times
for …
    …;                  θ(N)
for …
    for …               θ(N²)
        …;
Total: θ(N) + θ(N²) = θ(N²)
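A concrete C sketch of this pattern (function and array names are illustrative): an initialization loop of cost θ(n) followed by a nested-loop pass of cost θ(n²), so the total is θ(n²):

void example(int a[], int n)
{
    for (int i = 0; i < n; i++)           /* θ(n)   */
        a[i] = 0;

    for (int i = 0; i < n; i++)           /* θ(n^2) */
        for (int j = 0; j < n; j++)
            a[i] += i + j;
}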
General Rules of Analysis
▪ RULE 4 : If / Else
if (condition)        → T3(n)
    S1;               → T1(n)
else
    S2;               → T2(n)
(overall running time: T(n))
Running time is never more than the running time of the test plus
larger of the running times of S1 and S2
(may overestimate but never underestimates)
T(n) ≤ T3(n) + max (T1(n), T2(n))
General Rules of Analysis
Tworst(N) = max{ T(I) : |I| = N }        → usually used
Tav(N) = Σ over |I| = N of T(I) · Pr(I)
Tbest(N) = min{ T(I) : |I| = N }
Tworst(N) ≥ Tav(N) ≥ Tbest(N)
T(N) = O(Tworst(N)) = Ω(Tbest(N))
General Rules of Analysis
▪ RULE 4 : If / Else
if (condition)        → T3(n)
    S1;               → T1(n)
else
    S2;               → T2(n)
(overall running time: T(n))
Tw (n) = T3(n) + max (T1(n), T2(n))
Tb (n) = T3(n) + min (T1(n), T2(n))
Tav (n) = p(T)T1(n) + p(F)T2(n) + T3(n)
p(T) → p (condition = True)
p(F) → p (condition = False)
General Rules of Analysis
▪ Example :
if (condition)        → T3(n) = θ(n)
    S1;               → T1(n) = θ(n²)
else
    S2;               → T2(n) = θ(n)
Tw(n) = T3(n) + max(T1(n), T2(n)) = θ(n²)
Tb(n) = T3(n) + min(T1(n), T2(n)) = θ(n)
if p(T) = p(F) = ½:
Tav(n) = p(T)·T1(n) + p(F)·T2(n) + T3(n) = θ(n²)
T(n) = O(n²)
     = Ω(n)
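A concrete C sketch matching this shape (the condition test is a θ(n) scan, one branch does θ(n²) work, the other θ(n); all names are illustrative):

int contains_zero(const int a[], int n)    /* condition test: T3(n) = θ(n) */
{
    for (int i = 0; i < n; i++)
        if (a[i] == 0)
            return 1;
    return 0;
}

long process(const int a[], int n)
{
    long s = 0;
    if (contains_zero(a, n)) {
        for (int i = 0; i < n; i++)        /* S1: T1(n) = θ(n^2) */
            for (int j = 0; j < n; j++)
                s += (long)a[i] * a[j];
    } else {
        for (int i = 0; i < n; i++)        /* S2: T2(n) = θ(n)   */
            s += a[i];
    }
    return s;
}

As above, the worst case is θ(n²) and the best case θ(n), so T(n) = O(n²) and T(n) = Ω(n).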
RECURSIVE CALLS
Example:
Algorithm for computing factorial
int factorial(int n)
{
    if (n <= 1)
        return 1;                     /* 1 step                              */
    else
        return n * factorial(n - 1);  /* 1 multiplication                    */
                                      /* + 1 subtraction                     */
                                      /* + cost of evaluating factorial(n-1) */
}
T(n) = cost of evaluation of factorial of n
T(n) = 2 + T(n-1)
T(1) =1
RECURSIVE CALLS
T(n) = 2 + T(n-1)
T(n) = 2 + 2 + T(n-2)
T(n) = 2 + 2 + 2 + T(n-3)
.
.
.
T(n) = 2k + T(n-k)          take k = n-1 =>
T(n) = 2(n-1) + T(n-(n-1))
T(n) = 2(n-1) + T(1)
T(n) = 2(n-1) + 1 = 2n - 1
T(n) = θ(n)