CSC150 CH 1 Big O

The document discusses performance analysis in algorithm design, emphasizing the importance of evaluating trade-offs among different methods to determine the most efficient solution. It covers concepts such as measuring run-time, Big-Oh notation, and the rules for calculating time complexity, highlighting how to compare algorithms based on their growth rates. Additionally, it introduces Omega and Theta notations for lower and tight bounds on growth rates, respectively.

PERFORMANCE ANALYSIS

<I> CRITERIA
- In designing algorithms, we need methods to separate bad algorithms from good
ones. When several approaches are possible, we need to determine which is best for
our problem. For this reason, the analysis of algorithms and the comparison of
alternative methods become an important part of software engineering.
- Usually, a problem can be solved in many different ways.
- The choice of implementation involves a careful evaluation of the trade-offs
among the various possibilities.
<1> easy to understand, code, and debug (for later maintenance)
- readable, documented, modular
<2> programming effort
<3> efficiency (efficient use of computer resources)
- time (run as fast as possible)
- space (save storage)
<II> T(n) → EFFICIENCY (Time)
<A> Measuring The Run-Time (T(n)) of a Program
/* A function that calculates the sum of a list of floats */
Statement                         Steps  Frequency  Total Steps
float sum(float *list, int n)
{
    float temp = 0;                 1        1          1
    int i;
    for (i = 0; i < n; i++)         1       n+1        n+1
        temp += list[i];            1        n          n
    return temp;                    1        1          1
}
                                               Total:  2n+3
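For reference, here is the same function as a self-contained runnable program (a minimal sketch; the main() driver and sample data are illustrative additions, not part of the notes):

#include <stdio.h>

/* Sums a list of n floats in O(n) time (2n+3 steps, as counted above). */
float sum(float *list, int n)
{
    float temp = 0;            /* 1 step */
    int i;
    for (i = 0; i < n; i++)    /* n+1 loop tests */
        temp += list[i];       /* n additions */
    return temp;               /* 1 step */
}

int main(void)
{
    float data[] = {1.0f, 2.5f, 3.5f};      /* illustrative sample input */
    printf("sum = %f\n", sum(data, 3));     /* prints: sum = 7.000000 */
    return 0;
}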
* The running time T(n) depends on:
(1) The size of input/output
(2) Time complexity of the algorithm underlying the program
* T(n)
- Running time when the size of input is n
- We treat T(n) as the worst-case running time
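To see why the worst case is the usual measure, consider linear search (an illustrative example, not one from the notes): the best case finds the key after one comparison, but the worst case scans all n elements, so we report T(n) = n.

/* Linear search: best case 1 comparison (key at the front),
   worst case n comparisons (key at the end or absent).
   We therefore report the worst case: T(n) = n, i.e. O(n). */
int search(const int *list, int n, int key)
{
    int i;
    for (i = 0; i < n; i++)
        if (list[i] == key)
            return i;    /* found: return its index */
    return -1;           /* not found: n comparisons were made */
}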
<B> Big-Oh Notation
* A program’s running time T(n) is O(f(n)), where f(n) is the highest-order term of T(n) with its constant coefficient dropped.
<Example 1>
Consider T(n) = 2n + 4. We say that T(n) = O(n).
<Example 2>
Consider T(n) = (n+1)^2 = n^2 + 2n + 1. We say that T(n) = O(n^2).
<Example 3>
Consider T(n) = 120. We say that T(n) = O(1).
* Growth Rate
It shows how fast the running time grows when n increases.
* Common Big-Oh functions.
O(1)        Constant     Grows Slowest / Best
O(log n)    Logarithmic
O(n)        Linear
O(n log n)  n log n
O(n^2)      Quadratic
O(n^3)      Cubic
O(2^n)      Exponential  Grows Fastest / Worst
[Figure: A comparison of different Big-Oh functions — growth curves for O(1),
O(log n), O(n), O(n log n), O(n^2) and O(2^n), plotted for n = 1 to 10.]
It tells us how fast the running time grows when n increases.
<C> Comparing Programs using Big-Oh
Programs can be evaluated generally by comparing their Big-Oh functions.
- Generally, algorithms of the same Big-Oh can be considered equally good,
while a program of lower order is better: e.g. one with O(n^2) is better than one with O(n^3).
- Consider which of the following is better:
<1> T(n) = 100n
<2> T(n) = 10n^2
- For small inputs (n < 10), T(n) = 10n^2 is better.
- As n increases, T(n) = 100n is better.
→ Big-Oh is meaningful only when n is large enough (n >= n0),
where n0 = 11 here.
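A quick way to see the crossover (an illustrative sketch, not part of the notes): 100n = 10n^2 exactly when n = 10, so the two programs tie at n = 10 and the linear one wins from n0 = 11 onward. The small program below tabulates both costs around that point.

#include <stdio.h>

/* Tabulates T1(n) = 100n and T2(n) = 10n^2 around the crossover n = 10. */
int main(void)
{
    int n;
    for (n = 8; n <= 12; n++) {
        int t1 = 100 * n, t2 = 10 * n * n;
        printf("n=%2d  100n=%4d  10n^2=%4d  %s\n", n, t1, t2,
               t1 < t2 ? "100n better" : t1 > t2 ? "10n^2 better" : "equal");
    }
    return 0;
}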
<D> Calculating The Big-Oh For A Program
<i> Rules
Suppose we have two program segments P1 and P2, with time complexities
P1 → T1(n) = O(f(n))
P2 → T2(n) = O(g(n))
What is the time complexity of the whole program ?

The Rule of Sum
T1(n) + T2(n) → O( max(f(n), g(n)) )
Example: T(n) = T1(n) + T2(n), where T1(n) = 3n + 4, T2(n) = n^2 + 8
T(n) = O( max((3n+4), (n^2+8)) )
T(n) = O(n^2 + 8)
T(n) = O(n^2)
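As a sketch of the rule of sum in code (an illustrative fragment, not from the notes), consider two segments executed one after the other; the more expensive segment dominates:

/* Rule of sum: segment P1 followed by segment P2 costs
   O(max(f(n), g(n))). Here O(max(n, n^2)) = O(n^2). */
void sum_rule_example(int *a, const int *b, int n)
{
    int i, j;

    /* P1: one pass over the array -> T1(n) = O(n) */
    for (i = 0; i < n; i++)
        a[i] = 0;

    /* P2: nested passes -> T2(n) = O(n^2) */
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
            a[i] += b[j];
}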
The Rule of Product
T1(n) * T2(n) → O( f(n) * g(n) )
Example: T(n) = T1(n) * T2(n), where T1(n) = 3n + 4, T2(n) = n^2 + 8
T(n) = O( (3n+4) * (n^2+8) )
T(n) = O(3n^3 + 4n^2 + 24n + 32)
T(n) = O(n^3)
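The rule of product corresponds to nesting (again an illustrative fragment, not from the notes): a body costing O(g(n)) repeated O(f(n)) times costs O(f(n) * g(n)).

/* Rule of product: an O(n) inner loop nested inside an O(n) outer loop
   gives O(n * n) = O(n^2) steps in total. */
long product_rule_example(int n)
{
    int i, j;
    long count = 0;
    for (i = 0; i < n; i++)        /* outer loop runs n times         */
        for (j = 0; j < n; j++)    /* inner loop: n steps per outer   */
            count++;
    return count;                  /* equals n * n                    */
}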
<IV> CONCLUSION
Efficiency
- Different implementations of a problem have their own strengths and weaknesses.
One important consideration in any implementation is efficiency.
Programming Effort
- Whether additional work is worthwhile depends on the application
and the number of times the program will be used.
Big-Oh notation (Upper bound of the growth rate)
Given two functions f(n) and g(n),
define f(n) = O(g(n)) if f(n) grows no faster than g(n).
Formally, we have f(n) = O(g(n)) iff there exist two positive constants c and n0
such that |f(n)| ≤ c|g(n)| for all n ≥ n0.
Example : f(n) = 3n^3 + 2n^2
g(n) = n^3
n0 = 1, c = 5 → 3n^3 + 2n^2 ≤ 3n^3 + 2n^3 = 5n^3 for all n ≥ 1
→ 3n^3 + 2n^2 = O(n^3)
→ Big-Oh is meaningful only if n is large enough (n ≥ n0)
Omega notation (Lower bound of the growth rate)
Given two functions f(n) and g(n),
define f(n) = Ω(g(n)) if f(n) grows no slower than g(n).
Formally, we have f(n) = Ω(g(n)) iff there exist two positive constants c and n0
such that |f(n)| ≥ c|g(n)| for all n ≥ n0.
Example : f(n) = n^2 + 3n
g(n) = n^2
n0 = 1, c = 1 → n^2 + 3n ≥ 1(n^2)
→ n^2 + 3n = Ω(n^2)
Theta notation (Both lower and upper bound)
Given two functions f(n) and g(n),
define f(n) = Θ(g(n)) if f(n) and g(n) grow at the same rate.
That is: f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
Formally, we have f(n) = Θ(g(n)) iff there exist three positive constants c0, c1 and n0
such that c0|g(n)| ≤ |f(n)| ≤ c1|g(n)| for all n ≥ n0.
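For instance (an example added here in the style of the O and Ω ones above; it is not in the original notes), the earlier Big-Oh example is in fact a tight bound:
Example : f(n) = 3n^3 + 2n^2
g(n) = n^3
n0 = 1, c0 = 3, c1 = 5 → 3n^3 ≤ 3n^3 + 2n^2 ≤ 5n^3 for all n ≥ 1
→ 3n^3 + 2n^2 = Θ(n^3)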
