Complexity Classes
Big-O (Asymptotic) Notation
Some big-O complexity classes, in order of magnitude from smallest to largest:
O(1)        Constant
O(log n)    Logarithmic
O(n)        Linear
O(n log n)  Log-linear
O(n^x)      {e.g., O(n^2), O(n^3), etc.} Polynomial
O(a^n)      {e.g., O(1.6^n), O(2^n), etc.} Exponential
O(n!)       Factorial
Examples of algorithms and their big-O complexity
Complexity   Examples of Algorithms
O(1)         Push, Pop, Enqueue (if there is a tail reference), Dequeue, accessing an array element, hash table lookup
O(log n)     Binary search
O(n)         Linear search
O(n log n)   Heap sort, Quick sort (average), Merge sort
O(n^2)       Selection sort, Insertion sort, Bubble sort
O(n^3)       Matrix multiplication
O(2^n)       Towers of Hanoi
For large values of the input n, the constants and the terms with lower degrees of n are ignored.
Rules for using big-O
1. Multiplicative Constants Rule: ignoring constant factors.
   • O(c·f(n)) = O(f(n)), where c is a constant.
   • Example: O(20 n^3) = O(n^3)
2. Addition Rule: ignoring smaller terms.
   • If O(f(n)) < O(h(n)), then O(f(n) + h(n)) = O(h(n)).
   • Examples:
     • O(n^2 log n + n^3) = O(n^3)
     • O(2000 n^3 + 2 n! + n^800 + 10n + 27 n log n + 5) = O(n!)
3. Multiplication Rule: O(f(n) · h(n)) = O(f(n)) · O(h(n)).
   • Example: O((n^3 + 2n^2 + 3n log n + 7)(8n^2 + 5n + 2)) = O(n^5)
Bounds
■ Upper bound (big O): <=
■ Lower bound (big Omega, Ω): >=
■ Tight asymptotic bound (big Theta, Θ): ==
■ o – strict (exclusive) upper bound: <
■ ω – strict (exclusive) lower bound: >
■ Upper bound: T(n) = O(n^2)
■ Lower bound: T(n) = Ω(n)
■ Tight bound: T(n) = n(n-1)/2 = Θ(n^2)
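A quick way to see where the tight bound T(n) = n(n-1)/2 comes from is to count the comparisons a selection-sort-style double loop actually performs. A minimal sketch (the class and method names are my own):

```java
// Counts the comparisons made by a selection-sort-style double loop.
// The inner loop runs (n-1) + (n-2) + ... + 1 = n(n-1)/2 times in total.
public class ComparisonCount {
    static long comparisons(int n) {
        long count = 0;
        for (int i = 0; i < n - 1; i++)
            for (int j = i + 1; j < n; j++)
                count++;                     // one comparison per inner iteration
        return count;
    }

    public static void main(String[] args) {
        System.out.println(comparisons(10)); // prints 45 = 10*9/2
    }
}
```

Since n(n-1)/2 grows like n^2/2, dropping the constant factor gives Θ(n^2).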
Bounds
■ n^2 = O(n^3)
■ n^2 = O(n^2)
■ n^2 = Θ(n^2)
■ n^2 = Ω(n)
■ n^2 = ω(n)
■ n^2 ≠ ω(n^2)
*Tighter bounds are more useful.
T(n) = Θ(f(n)) <=> T(n) = O(f(n)) and T(n) = Ω(f(n))
Asymptotic Notations (repair-cost analogy)
• Omega (Ω-notation): minimum repairing cost >= 5,000
• Theta (Θ-notation): exact repairing cost = 10,000
• Big O (O-notation): maximum repairing cost <= 12,000
O-notation
For a function g(n), we define O(g(n)), big-O of g of n, as the set:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n >= n0, we have 0 <= f(n) <= c·g(n) }
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) implies f(n) = O(g(n)).
Θ(g(n)) ⊆ O(g(n)).
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g of n, as the set:
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that for all n >= n0, we have 0 <= c·g(n) <= f(n) }
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) implies f(n) = Ω(g(n)).
Θ(g(n)) ⊆ Ω(g(n)).
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g of n, as the set:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that for all n >= n0, we have 0 <= c1·g(n) <= f(n) <= c2·g(n) }
Intuitively: the set of all functions that have the same rate of growth as g(n).
g(n) is an asymptotically tight bound for f(n).
Relations Between Θ, O, and Ω
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
■ I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
■ In practice, asymptotically tight bounds
are obtained from asymptotic upper and
lower bounds.
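To make the sandwich c1·g(n) <= f(n) <= c2·g(n) concrete, the following sketch checks that f(n) = n(n-1)/2 is Θ(n^2): it stays between (1/4)n^2 and (1/2)n^2 for every n >= 2. The witnesses c1 = 1/4, c2 = 1/2, n0 = 2 are my own choices; any valid constants would do.

```java
// Demonstrates f(n) = Theta(n^2) for f(n) = n(n-1)/2 by checking the
// two-sided bound c1*g(n) <= f(n) <= c2*g(n) with c1 = 0.25, c2 = 0.5.
public class ThetaCheck {
    static boolean sandwiched(int n) {
        double f = n * (n - 1) / 2.0;
        double g = (double) n * n;
        return 0.25 * g <= f && f <= 0.5 * g;
    }

    public static void main(String[] args) {
        boolean ok = true;
        for (int n = 2; n <= 1000; n++)   // holds for every n >= n0 = 2
            ok &= sandwiched(n);
        System.out.println(ok);           // prints true
    }
}
```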
o-notation
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : for any constant c > 0, there exists n0 > 0 such that for all n >= n0, we have 0 <= f(n) < c·g(n) }
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = 0
g(n) is an upper bound for f(n) that is not asymptotically tight.
ω-notation
For a given function g(n), the set little-omega:
ω(g(n)) = { f(n) : for any constant c > 0, there exists n0 > 0 such that for all n >= n0, we have 0 <= c·g(n) < f(n) }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = ∞
What happens when this limit takes an indeterminate form such as ∞/∞, and which rule can be used to evaluate it?
g(n) is a lower bound for f(n) that is not asymptotically tight.
Comparing functions
■ Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the following, assume that f(n) and g(n) are asymptotically positive.
Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)); the same holds for O, Ω, o, and ω.
Reflexivity: f(n) = Θ(f(n)); likewise for O and Ω.
Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); similarly, f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
How to check complexities?
O(n): The time complexity of a loop is O(n) if its loop variable is incremented/decremented by a constant amount. For example, the following loops have O(n) time complexity.
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
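The iteration count of such a loop can be made explicit: stepping from 1 to n by a constant c performs exactly ceil(n/c) iterations, which is O(n) for any fixed c. A small sketch (class and method names are my own):

```java
// Counts iterations of "for (i = 1; i <= n; i += c)".
// The count is ceil(n/c), i.e. linear in n for a fixed constant c.
public class LinearLoop {
    static int iterations(int n, int c) {
        int count = 0;
        for (int i = 1; i <= n; i += c)
            count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(10, 1)); // prints 10
        System.out.println(iterations(10, 3)); // prints 4 (i = 1, 4, 7, 10)
    }
}
```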
O(n^c): The time complexity of nested loops is proportional to the number of times the innermost statement is executed. For example, the following loops have O(n^2) time complexity.
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
For example, Selection sort and Insertion sort have O(n^2) time complexity.
O(log n): The time complexity of a loop is O(log n) if its loop variable is divided/multiplied by a constant amount.
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
For example, Binary search has O(log n) time complexity.
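The logarithmic growth is easy to verify by counting: a loop that multiplies its variable by c takes roughly log_c(n) + 1 iterations. A minimal sketch (names are my own):

```java
// Counts iterations of "for (i = 1; i <= n; i *= c)".
// The variable doubles (for c = 2) each pass, so the count is
// about log_c(n) + 1, i.e. the loop is O(log n).
public class LogLoop {
    static int iterations(int n, int c) {
        int count = 0;
        for (int i = 1; i <= n; i *= c)
            count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(1024, 2));  // prints 11 (i = 1, 2, 4, ..., 1024)
        System.out.println(iterations(1000, 10)); // prints 4  (i = 1, 10, 100, 1000)
    }
}
```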
How to determine complexity of code structures
Loops (for, while, and do-while): complexity is the number of iterations of the loop times the complexity of the loop body.
Examples:
// O(n)
for (int i = 0; i < n; i++)
    sum = sum - i;

// O(n^2)
for (int i = 0; i < n * n; i++)
    sum = sum + i;

// O(log n)
i = 1;
while (i < n) {
    sum = sum + i;
    i = i * 2;
}
How to determine complexity of code structures
Nested loops: complexity of the inner loop * complexity of the outer loop.
Examples:
// O(n^2)
sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum += i * j;

// O(n log n)
i = 1;
while (i <= n) {
    j = 1;
    while (j <= n) {
        // statements of constant complexity
        j = j * 2;
    }
    i = i + 1;
}
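The O(n log n) bound for the second example can be checked by counting: the outer loop runs n times, and the inner doubling loop runs floor(log2 n) + 1 times per outer pass. A sketch that counts the total work (names are my own):

```java
// Counts the total inner-loop iterations of the nested while loops above:
// n outer passes, each doing floor(log2 n) + 1 doubling steps,
// so the total is n * (floor(log2 n) + 1) = O(n log n).
public class NestedCount {
    static long innerSteps(int n) {
        long count = 0;
        int i = 1;
        while (i <= n) {
            int j = 1;
            while (j <= n) {
                count++;       // constant-complexity body
                j = j * 2;
            }
            i = i + 1;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(innerSteps(8)); // prints 32 = 8 * (3 + 1)
    }
}
```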
How to determine complexity of code structures
Sequence of statements: use the Addition Rule.
O(s1; s2; s3; …; sk) = O(s1) + O(s2) + O(s3) + … + O(sk)
                     = O(max(s1, s2, s3, …, sk))
Example:
// O(n^2): the n^2 loop dominates the n loop and the O(1) print
for (int j = 0; j < n * n; j++)
    sum = sum + j;
for (int k = 0; k < n; k++)
    sum = sum - k;
System.out.print("sum is now " + sum);
How to determine complexity of code structures
Switch: take the complexity of the most expensive case.
char key;
int[] X = new int[5];
int[][] Y = new int[10][10];
........
switch (key) {
    case 'a':                                 // O(n)
        for (int i = 0; i < X.length; i++)
            sum += X[i];
        break;
    case 'b':                                 // O(n^2)
        for (int i = 0; i < Y.length; i++)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block
Overall complexity: O(n^2)
How to determine complexity of code structures
If statement: take the complexity of the most expensive case.
char key;
int[][] A = new int[5][5];
int[][] B = new int[5][5];
int[][] C = new int[5][5];
........
if (key == '+') {                        // O(n^2)
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];
} // End of if block
else if (key == 'x')                     // O(n^3)
    C = matrixMult(A, B);
else                                     // O(1)
    System.out.println("Error! Enter '+' or 'x'!");
Overall complexity: O(n^3)
How to determine complexity of code structures
■ Sometimes if-else statements must be checked carefully:
O(if-else) = O(condition) + max(O(if), O(else))
int[] integers = new int[10];
........
if (hasPrimes(integers) == true)   // condition: O(n)
    integers[0] = 20;              // O(1)
else
    integers[0] = -20;             // O(1)

public boolean hasPrimes(int[] arr) {
    for (int i = 0; i < arr.length; i++)
        ..........
        ..........                 // O(n)
} // End of hasPrimes()
O(if-else) = O(condition) = O(n)
How to determine complexity of code structures
■ Note: sometimes a loop may make the if-else rule inapplicable. Consider the following loop:
while (n > 0) {
    if (n % 2 == 0) {
        System.out.println(n);
        n = n / 2;
    } else {
        System.out.println(n);
        System.out.println(n);
        n = n - 1;
    }
}
Here the branch taken changes n differently each iteration, so the loop must be analyzed as a whole: every odd n is decremented (becoming even) and every even n is halved, giving O(log n) iterations.
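A sketch that counts the iterations of the halve-or-decrement loop above supports the O(log n) claim; the class and method names are my own, and the printing is dropped so only the control flow remains.

```java
// Counts iterations of the halve-or-decrement loop: an odd n is
// decremented (becoming even) and an even n is halved, so the loop
// terminates after O(log n) iterations, not O(n).
public class HalveOrDecrement {
    static int iterations(long n) {
        int count = 0;
        while (n > 0) {
            if (n % 2 == 0)
                n = n / 2;   // even: halve
            else
                n = n - 1;   // odd: decrement; the result is even
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(1024)); // prints 11 (10 halvings, then one decrement)
    }
}
```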
How to determine complexity of code structures
Determine the number of operations (Ops) performed by each fragment:

for (int i = 1; i <= n; i++)
    for (int j = 1; j <= n; j++)

for (int i = 1; i <= n; i++)
    for (int j = 1; j <= n*n; j++)

for (int i = 1; i <= n; i++)
    for (int j = i; j <= i+3; j++)

for (int i = 1; i <= n; i++)
    for (int j = i; j <= n*n; j *= 2)

for (int i = 1; i <= n; i++)
    for (int j = i; j <= n; j += 5)
        for (int k = 1; k <= n/2; k++)
■ Consider an algorithm with an asymptotic complexity of Θ(n^2). If the algorithm takes 3 seconds to process an input of size 10^4, how much time will it take to process an input of size 10^5 on the same machine?
time = # of ops / speed
time1 = n1^2 / speed ………………(1)
time2 = n2^2 / speed ………………(2)
Dividing (2) by (1): time2 = time1 × (n2/n1)^2 = 3 × (10^5 / 10^4)^2 = 3 × 100 = 300 seconds.
COMMON FUNCTIONS
Logarithms
x = log_b a is the exponent for which a = b^x.
Natural log: ln a = log_e a
Binary log: lg a = log_2 a
lg^2 a = (lg a)^2
lg lg a = lg (lg a)
Identities:
log_c(ab) = log_c a + log_c b
log_b a^n = n log_b a
log_b a = log_c a / log_c b
log_b(1/a) = −log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
Review on Summations
■ Constant series: for integers a and b with a <= b,
  Σ_{i=a}^{b} 1 = b − a + 1
■ Linear series (arithmetic series): for n >= 0,
  Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2
■ Quadratic series: for n >= 0,
  Σ_{i=1}^{n} i^2 = 1^2 + 2^2 + … + n^2 = n(n+1)(2n+1)/6
Review on Summations
■ Cubic series: for n >= 0,
  Σ_{i=1}^{n} i^3 = 1^3 + 2^3 + … + n^3 = n^2(n+1)^2/4
■ Geometric series: for real x ≠ 1,
  Σ_{k=0}^{n} x^k = 1 + x + x^2 + … + x^n = (x^{n+1} − 1)/(x − 1)
  For |x| < 1,
  Σ_{k=0}^{∞} x^k = 1/(1 − x)
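The closed forms above are easy to sanity-check numerically by comparing a brute-force sum against the formula. A minimal sketch (class and method names are my own):

```java
// Numerically checks the closed forms for the linear, quadratic, and
// cubic series, e.g. that 1 + 2 + ... + n equals n(n+1)/2.
public class SumFormulas {
    static long linear(int n)    { long s = 0; for (int i = 1; i <= n; i++) s += i;                return s; }
    static long quadratic(int n) { long s = 0; for (int i = 1; i <= n; i++) s += (long) i * i;     return s; }
    static long cubic(int n)     { long s = 0; for (int i = 1; i <= n; i++) s += (long) i * i * i; return s; }

    public static void main(String[] args) {
        int n = 100;
        System.out.println(linear(n)    == (long) n * (n + 1) / 2);               // prints true
        System.out.println(quadratic(n) == (long) n * (n + 1) * (2 * n + 1) / 6); // prints true
        System.out.println(cubic(n)     == (long) n * n * (n + 1) * (n + 1) / 4); // prints true
    }
}
```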