1 - Tools of Algorithm Analysis
Asymptotic Analysis
Chapter 4: Divide-and-Conquer
o Section 4.3: The substitution method for solving recurrences
o Section 4.4: The recursion-tree method for solving recurrences
o Section 4.5: The master method for solving recurrences
o Section 4.6: Proof of the continuous master theorem
(optional/advanced)
o Section 4.7: Akra-Bazzi recurrences (optional/advanced)
o Pages: 76–125
4. Amortized Analysis
Summary Table
Let f(n) and g(n) be functions mapping positive integers to positive real numbers.
1. Big-O:
f(n) = O(g(n)) if ∃ constants c > 0 and n₀ such that:
f(n) ≤ c·g(n) for all n ≥ n₀
2. Big-Ω:
f(n) = Ω(g(n)) if ∃ constants c > 0 and n₀ such that:
f(n) ≥ c·g(n) for all n ≥ n₀
3. Big-Θ:
f(n) = Θ(g(n)) if f(n) is both O(g(n)) and Ω(g(n))
Step 5: Properties
Problem:
Simplify this function and express it in Big-O notation:
f(n) = 3n³ + 5n log n + 20
Step 1: Identify the terms:
3n³ → cubic term
5n log n → linearithmic term
20 → constant term
Step 2: Understand Big-O principles:
n³ > n log n > 1
So, 3n³ dominates the others:
f(n) = 3n³ + 5n log n + 20 = O(n³)
✅ Final Answer: O(n³)
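To see the dominance numerically, here is a small illustrative script (not part of the original problem; it assumes log base 2 for concreteness) that computes the ratio f(n)/n³, which approaches the leading constant 3 as n grows:

```python
import math

def f(n):
    # f(n) = 3n^3 + 5 n log n + 20, taking log base 2
    return 3 * n**3 + 5 * n * math.log2(n) + 20

for n in (10, 100, 1000):
    # The ratio tends to 3, confirming the cubic term dominates: f(n) = O(n^3)
    print(n, f(n) / n**3)
```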
f(n) = Θ(n²) if ∃ c₁, c₂, n₀ such that: c₁n² ≤ f(n) ≤ c₂n² for all n ≥ n₀
Step 1: Show Upper Bound: f(n) = O(n²)
f(n) = 5n² + 2n + 1 ≤ 5n² + 2n² + n² = 8n²
✅ So choose:
c₂ = 8
n₀ = 1
⇒ f(n) ≤ 8n² for all n ≥ 1
Step 2: Show Lower Bound: f(n) = Ω(n²)
f(n) = 5n² + 2n + 1 ≥ 5n²
✅ So choose:
c₁ = 5
n₀ = 1
⇒ f(n) ≥ 5n² for all n ≥ 1
5n² ≤ f(n) ≤ 8n² ⇒ f(n) = Θ(n²)
✅ Final Answer: Θ(n²)
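The chosen constants can be sanity-checked numerically; a minimal sketch (assuming c₁ = 5, c₂ = 8, n₀ = 1 as above):

```python
def f(n):
    return 5 * n**2 + 2 * n + 1

# Verify 5n^2 <= f(n) <= 8n^2 for a range of n >= n0 = 1
for n in range(1, 1001):
    assert 5 * n**2 <= f(n) <= 8 * n**2
print("bounds hold for n = 1..1000")
```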
List:
O(n), O(1), O(n log n), O(n²), O(log n)
Step-by-step comparison:
✅ Final Answer:
O(1) < O(log n) < O(n) < O(n log n) < O(n²)
Statement:
If f(n) = O(n²) and g(n) = O(n²), then f(n) + g(n) = O(n²)
Step 2: Example:
Let:
f(n) = 5n
g(n) = 3n²
Then:
f(n) + g(n) = 5n + 3n² = O(n²)
(because 3n² dominates as n → ∞)
Compare log n with n^1.1:
log n / n^1.1 → 0 as n → ∞
Step 2: Interpretation:
✅ Conclusion:
log n = o(n^1.1) ⇒ n^1.1 grows faster
Recurrence relations are equations that define a function in terms of its value on
smaller inputs. They commonly appear when analyzing the time complexity of recursive
algorithms.
T(n) = 2T(n/2) + n
✅ Step-by-Step Explanation
T(n) = a · T(n/b) + f(n)
Where:
a = number of subproblems
n/b = size of each subproblem
f(n) = cost of dividing and combining
Method Description
1. Substitution Guess the answer, then prove it by induction
2. Recursion Tree Visualize the work done at each level of recursion
3. Master Theorem Plug into a general formula for divide-and-conquer recurrences
Example 1:
T(n) = T(n − 1) + n
Step 1: Guess:
Let’s guess:
T(n) = O(n²)
Step 2: Prove using induction
Base case:
T(1) = 1 → holds for any constant c ≥ 1
Inductive step:
Assume T(k) ≤ c · k²
Now show:
T(k + 1) = T(k) + (k + 1) ≤ c · k² + (k + 1)
We need:
c · k² + (k + 1) ≤ c · (k + 1)² = c · k² + 2ck + c
which holds whenever k + 1 ≤ 2ck + c, i.e., for any c ≥ 1.
✅ So, proven:
T(n) = O(n²)
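This recurrence has the closed form T(n) = n(n + 1)/2, so the O(n²) guess can also be checked directly; a minimal sketch assuming T(1) = 1:

```python
def T(n):
    # T(n) = T(n-1) + n with T(1) = 1, computed iteratively
    total = 1
    for k in range(2, n + 1):
        total += k
    return total

# T(n) = n(n+1)/2 <= n^2 for all n >= 1, matching the O(n^2) guess with c = 1
for n in range(1, 500):
    assert T(n) == n * (n + 1) // 2
    assert T(n) <= n**2
```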
Example 2:
T(n) = 2T(n/2) + n
Level 0: 1 · n = 2⁰ · (n/2⁰) = n
Level 1: 2 · (n/2) = 2¹ · (n/2¹) = n
Level 2: 4 · (n/4) = 2² · (n/2²) = n
…
Level log n: 2^(log n) · (n/2^(log n)) = n
T(n) = n + n + n + ⋯ + n = n · log n
T(n) = 2T(n/2) + n
Which means:
You divide the problem of size n into 2 subproblems, each of size n/2
You spend n work combining their results
2 · (n/2) = n
From each n/2 comes two subproblems of size n/4 → total of 4 subproblems:
4 · (n/4) = n
At level i:
2ⁱ · (n/2ⁱ) = n
n/2ⁱ = 1 ⇒ 2ⁱ = n ⇒ i = log₂ n
✅ Conclusion:
Total number of levels in the tree is log₂ n + 1, but we often write it as Θ(log n)
At each level, the cost is n
So the total cost:
T(n) = n + n + ⋯ + n (log₂ n levels) = n log₂ n
✅ Final Answer: T(n) = Θ(n log n)
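The level-by-level accounting can be replayed by evaluating the recurrence exactly; a small sketch assuming T(1) = 1 and n a power of two:

```python
def T(n):
    # T(n) = 2T(n/2) + n, with T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k the tree has k levels each costing n, plus n leaves of cost 1,
# so T(n) = n*k + n exactly — i.e., n log2(n) + n = Theta(n log n)
for k in range(1, 12):
    n = 2**k
    assert T(n) == n * k + n
```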
Visual Summary:
Level 0 (Root): 1 node of size n → work n
Level 1: 3 nodes of size n/2 → work (3/2) · n
Level 2: 9 nodes of size n/4 → work (3/2)² · n
General Pattern:
At level i:
Number of nodes: 3ⁱ
Size of each: n/2ⁱ
Work per node: n/2ⁱ
Total work at level:
3ⁱ · (n/2ⁱ) = n · (3/2)ⁱ
n/2ⁱ = 1 ⇒ i = log₂ n
T(n) = Σᵢ₌₀^(log₂ n) n · (3/2)ⁱ
Take n out:
T(n) = n · Σᵢ₌₀^(log₂ n) (3/2)ⁱ
This is a geometric series with ratio r = 3/2 > 1:
T(n) = n · ((3/2)^(log₂ n + 1) − 1) / (3/2 − 1)
(3/2)^(log₂ n) = n^(log₂(3/2)) = n^(log₂ 3 − 1) ≈ n^0.585
T(n) = n · n^0.585 = Θ(n^1.585)
Final Summary:
✅ Final Result:
T(n) = Θ(n^(log₂ 3)) ≈ Θ(n^1.585)
T(n) = a · T(n/b) + f(n)
Compare f(n) with n^(log_b a)
Example 3:
T(n) = 3T(n/2) + n
T(n) = a · T(n/b) + f(n)
a = 3, b = 2, f(n) = n
log_b a = log₂ 3 ≈ 1.58496 ≈ 1.58
n^(log₂ 3) = n^1.58
Now compare:
f(n) = n = n¹
n¹ grows slower than n^1.58
More formally:
If ε = 0.5, then 1.58 − 0.5 = 1.08 > 1, so n = O(n^(1.58−ε)), and so on.
T(n) = Θ(n^(log_b a))
✅ Final Result:
Since f(n) = n = O(n^(1.58−ε)), we apply Case 1:
T(n) = Θ(n^(log₂ 3)) ≈ Θ(n^1.58)
Intuition:
You're comparing how fast the "combine" step f(n) = n grows vs. the recursion
depth cost.
Because f(n) is slower than the critical function n^(log₂ 3), the total cost is
dominated by the recursion, not the combine step.
f(n) = n = O(n^(1.58−ε)) ⇒ Case 1
✅ Final Answer:
T(n) = Θ(n^(log₂ 3))
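Case 1 can be sanity-checked numerically: the ratio T(n)/n^(log₂ 3) settles near a constant as n grows. A sketch assuming T(1) = 1 and n a power of two:

```python
import math

def T(n):
    # T(n) = 3T(n/2) + n, with T(1) = 1
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n

alpha = math.log2(3)  # critical exponent, about 1.585
ratios = [T(2**k) / (2**k) ** alpha for k in range(5, 15)]
# For this base case the ratios climb toward 3, a constant —
# confirming T(n) = Theta(n^{log2 3})
print(ratios[0], ratios[-1])
```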
T(n) = 2T(n/2) + n
✅ Answer:
T(n) = Θ(n log n)
✅ Q2: Solve using master theorem
T(n) = 4T(n/2) + n²
a = 4, b = 2, log_b a = log₂ 4 = 2
f(n) = n² = Θ(n²), so Case 2 applies:
T(n) = Θ(n² log n)
T(n) = T(n − 1) + 1, T(1) = 1
Unfold:
T(n) = T(n − 1) + 1 = T(n − 2) + 2 = ⋯ = T(1) + (n − 1)
= 1 + (n − 1) = n
✅ Final Answer:
( ) = Θ( )
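The unfolding can be replayed in code; a trivial sketch assuming T(1) = 1:

```python
def T(n):
    # T(n) = T(n-1) + 1, with T(1) = 1
    return 1 if n == 1 else T(n - 1) + 1

# T(n) == n, so T(n) = Theta(n), exactly as the unfolding shows
for n in range(1, 100):
    assert T(n) == n
```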
We try to prove:
T(n) ≤ c · n²
Induction base:
T(1) = 1 ≤ c · 1² ⇒ c ≥ 1
Assume for k:
T(k) ≤ c · k²
T(k + 1) = T(k) + (k + 1) ≤ c · k² + (k + 1)
Want:
c · k² + (k + 1) ≤ c · (k + 1)²
✅ Proven:
T(n) = O(n²)
✅ Step-by-Step Explanation
for i in range(n):
for j in range(n):
print(i, j)
Space complexity measures the amount of memory used by the algorithm, including:
Input storage
Output storage
Temporary variables
Call stack (for recursion)
Type Description
Total Space Includes input and output
Auxiliary Space Memory used excluding input (i.e., scratch)
Case Description
Best The minimum time taken on any input
Worst The maximum time taken on any input
Average Expected time over all random inputs
Ignore constants: O(3n) = O(n)
Keep dominant term: O(n² + n) = O(n²)
Only worst-case unless stated otherwise
for i in range(n):
for j in range(i, n):
print(i, j)
Answer:
Outer loop: n
Inner loop: n − i iterations, which averages to about n/2
Total: Σᵢ₌₀ⁿ⁻¹ (n − i) = n(n + 1)/2 = O(n²)
✅ Final Answer: O(n²)
def sum(n):
if n == 0:
return 0
return n + sum(n - 1)
Time: O(n) — one recursive call per value from n down to 0
Space: O(n) — the call stack holds up to n frames
✅ Q3: True or False: A loop that runs from 1 to 100 has time complexity O(1)
for i in range(100):
print(i)
✅ Answer: True
1. f₁(n) = 1000n
2. f₂(n) = n log n
Let’s analyze:
✅ So:
f₁(n) = O(n)
f₂(n) = O(n log n) (grows faster)
Recurrence: T(n) = 2T(n/2) + n
Depth = log n, each level = n
Total time: O(n log n)
Uses auxiliary array in each merge step → O(n) space
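For reference, a standard merge sort sketch that realizes this recurrence — two recursive calls on halves plus O(n) merge work into an auxiliary list (a generic implementation, not one given in the original notes):

```python
def merge_sort(a):
    # Divide: two recursive calls on halves -> the 2T(n/2) term
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: O(n) merge into an auxiliary list -> the "+ n" term
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```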
Topic 4: Amortized Analysis
Objective:
Amortized analysis provides the average cost per operation over a sequence of
operations — even when individual operations can be expensive.
✅ Step-by-Step Explanation
Even though some operations are costly, the average cost per operation is still small.
You start with an array of size 1. Every time it fills up, you double its size and copy all
elements.
Total Cost:
≤ 2n − 1 (n appends plus at most n − 1 element copies across all doublings)
⇒ amortized cost per operation = O(1)
Method 2: Accounting Method
We overcharge cheap operations and save credits for expensive ones.
Let’s charge:
3 units per insert: 1 for the insert itself, 1 to copy it later, 1 to copy an already-copied element
✅ Result:
Every operation is charged 3 units, so the total charge 3n covers all actual costs ⇒ amortized O(1) per insert
Amortized cost = actual cost + ΔΦ
Where:
ΔΦ = Φ_after − Φ_before
Define potential:
Φ = stack size
✅ So: each PUSH has amortized cost 2, and each POP/MULTIPOP has amortized cost 0, since the decrease in Φ pays for the actual pops
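All three methods predict the same thing for the doubling array; a small simulation (an illustrative sketch, counting one hypothetical cost unit per element write) confirms the total stays linear:

```python
def total_append_cost(n):
    # Simulate n appends to a doubling array, counting 1 unit per element write
    capacity, size, cost = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            cost += size       # copy every existing element to the new array
            capacity *= 2
        cost += 1              # write the new element
        size += 1
    return cost

# Total cost stays below 3n, so the amortized cost per append is O(1)
for n in (10, 100, 1000):
    assert total_append_cost(n) < 3 * n
```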
✅ Q1: A dynamic array doubles its size when full. What is the amortized cost
of n appends?
Answer:
O(1)
✅ So:
O(1) per operation
Solution Sketch:
O(1) amortized
Answer:
❌ False
Answer:
Summary Table
Method Concept Result Type
Aggregate Total cost / number of ops Average
Accounting Prepay expensive operations Average
Potential Use energy function to track cost Flexible
1. Asymptotic Analysis
Key Concepts:
Ignore constants and lower-order terms
Focus on dominant term
Compare algorithms using asymptotic behavior
Example:
3n² + 4n + 7 = O(n²)
Form: T(n) = aT(n/b) + f(n)
Space Complexity:
Method Description
Aggregate Total cost over all operations ÷ count
Accounting Overcharge cheap ops to cover costly ones
Potential Use a function to track "energy" saved
Use Cases:
Dynamic arrays
Stacks with MULTIPOP
Incremental algorithms
Example:
Appending n elements to a dynamic array with doubling capacity →
✅ Total time O(n), so amortized time per insert = O(1)
✅ Q1: Simplify:
T(n) = T(n/2) + log n
Solution:
Using recursion tree or Master Theorem:
a = 1, b = 2, f(n) = log n
Compare with n^(log₂ 1) = n⁰ = 1
log n = Ω(1), but it is not polynomially larger than 1, so Case 3 does not apply
Answer: T(n) = O(log² n)
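The Θ(log² n) answer can be checked by summing the per-level costs directly; a sketch assuming T(1) = 0 and log base 2:

```python
import math

def T(n):
    # T(n) = T(n/2) + log2(n), with T(1) = 0
    total = 0.0
    while n > 1:
        total += math.log2(n)
        n //= 2
    return total

# For n = 2^k the levels cost k, k-1, ..., 1, summing to
# k(k+1)/2 = Theta(log^2 n)
k = 10
assert abs(T(2**k) - k * (k + 1) / 2) < 1e-9
```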
✅ Q2: Determine time complexity of this code:
Solution:
Answer: ( log )
T(n) = 3T(n/4) + n log n
a = 3, b = 4
f(n) = n log n, compare with n^(log₄ 3) ≈ n^0.792
Since:
n log n = Ω(n^(0.792+ε)) for ε = 0.1, and the regularity condition holds, Case 3 applies
Answer: T(n) = Θ(n log n)
✅ Q4: A dynamic array doubles its size when full. What’s the amortized cost
of insert?
Solution:
Cost of n insertions = O(n)
Amortized = O(1)
✅ Q5: Time and space complexity of this function:
def countDown(n):
if n == 0:
return
print(n)
countDown(n - 1)
Solution: Time: O(n), Space: O(n) due to the recursion call stack
Solution:
Dominant term = n²
Answer: O(n²)
✅ Q7: Use accounting method: Stack allows PUSH, POP, and MULTIPOP(k).
Amortized cost?
Solution: Charge 2 units per PUSH: one pays for the push, one is stored as credit to pay for that element’s eventual POP or MULTIPOP. Every pop is then prepaid, so each operation has amortized cost O(1).
def rec(n):
if n <= 1:
return
rec(n - 1)
rec(n - 1)
Solution: T(n) = 2T(n − 1) + O(1), so the number of calls doubles at each level.
Answer: O(2ⁿ)
Here is a carefully crafted set of True/False and Multiple Choice Questions (MCQs)
covering all four topics in Tools of Algorithm Analysis:
T/F 1:
❌ False
✅ Accessing an array element by index is O(1).
T/F 2:
If f(n) = O(n²), then f(n) = O(n³) is also valid.
✅ True
Any function that is O(n²) is also O(n³), because n³ grows faster.
T/F 3:
T(n) = aT(n/b) + f(n)
T/F 4:
In amortized analysis, the cost of the worst single operation determines the overall
complexity.
❌ False
✅ Amortized analysis averages over a sequence, not based on a single worst-case.
T/F 5:
✅ True
It matches Case 2 of the Master Theorem.
T/F 6:
If an algorithm has O(n log n) time complexity, then it must be faster than an O(n²) algorithm on every input.
❌ False
✅ For small n, the O(n²) algorithm may be faster. Asymptotic notation is about large-n behavior.
T/F 7:
In a recursion tree, the number of levels is always log₂ n for all divide-and-conquer algorithms.
❌ False
✅ Only when the subproblem size is divided by 2. If it's n/3, then it's log₃ n, etc.
T/F 8:
A single expensive operation can make the average amortized cost high.
❌ False
✅ If the expensive operation is rare and distributed across cheap ones, the amortized
cost can remain low.
MCQ 1: Which of the following is not a valid asymptotic notation?
A. Θ(n)
B. Ω(n)
C. O(n)
D. Δ(n)
✅ Answer: D
Δ( ) is not an asymptotic notation.
MCQ 2: Using the Master Theorem, what is the solution of T(n) = 4T(n/2) + n?
A. Θ(n log n)
B. Θ(n²)
C. Θ(n)
D. Θ(log n)
✅ Answer: B
a = 4, b = 2, so log_b a = 2. Since f(n) = n = O(n^(2−ε)), it's Case 1.
MCQ 3: What is the time complexity of binary search on a sorted array?
A. O(n)
B. O(log n)
C. O(n log n)
D. O(1)
✅ Answer: B
Binary search halves the array every step → log₂ n comparisons.
MCQ 4:
✅ Answer: A
Dynamic arrays resize occasionally, but appending is O(1) amortized.
MCQ 5:
Which of the following must be true about a function f(n) = O(n²)?
MCQ 6: Which of the following is a classic use case for amortized analysis?
A. Linear search
B. Counting sort
C. Stack with MULTIPOP
D. Quicksort
✅ Answer: C
MULTIPOP can be expensive occasionally but cheap on average.
MCQ 7:
✅ Answer: B
Case 2: T(n) = Θ(n^(log_b a) · log n). Here, a = 2, b = 2, so T(n) = Θ(n log n).
MCQ 8: What is the auxiliary space complexity of merge sort?
A. O(1)
B. O(log n)
C. O(n)
D. O(n log n)
✅ Answer: C
Merge sort requires a temporary array for merging → uses O(n) auxiliary space.
Here's a second set of advanced-level True/False and Multiple Choice Questions (MCQs) for
graduate-level learners. These questions go beyond memorization and require deeper
understanding, analysis, and abstraction.
T/F 1:
If an algorithm has amortized complexity O(1), then its worst-case time complexity for every individual operation must also be O(1).
❌ False
✅ Amortized complexity allows some operations to be expensive, as long as the average over all operations is O(1).
T/F 2:
✅ True
✅ Little-o: f(n) = o(g(n)) implies lim_(n→∞) f(n)/g(n) = 0, which means f(n) ∈ O(g(n)).
T/F 3:
❌ False
✅ Let’s solve: the number of steps until n becomes 1 is log log n:
T(n) = T(√n) + 1 ⇒ T(n) = Θ(log log n)
T/F 4:
✅ True
✅ If the potential function decreases, then ΔΦ < 0, resulting in negative amortized
cost.
T/F 5:
The Master Theorem cannot be applied to recurrences where the subproblem size is not a fixed fraction of n, e.g., T(n) = T(n − 1) + f(n).
✅ True
✅ The Master Theorem only works for divide-and-conquer forms T(n) = aT(n/b) + f(n), not linear recursions.
MCQ 1:
Which of the following recurrences cannot be solved directly by the Master Theorem?
A. T(n) = 4T(n/2) + n
B. T(n) = 2T(n/2) + n log n
C. T(n) = T(n − 1) + n
D. T(n) = 8T(n/2) + n³
✅ Answer: C
T(n) = T(n − 1) + n is not of the form required by the Master Theorem.
MCQ 2:
A. O(n log n)
B. O(n^(log₂ 3))
C. O(n²)
D. O(n)
✅ Answer: B
log₂ 3 ≈ 1.585, so T(n) = Θ(n^1.585)
MCQ 3:
Which of the following is not a valid technique for solving recurrence relations?
A. Master Theorem
B. Recursive Tree Expansion
C. Recursion Elimination
D. Potential Function
✅ Answer: D
Potential Function is used in amortized analysis, not for solving recurrences.
MCQ 4:
A. O(n)
B. O(n log n)
C. O(n log log n)
D. O(n^(log₂ 3))
✅ Answer: B
This recurrence solves to Θ(n log n) using a recursion tree or the Akra-Bazzi method.
MCQ 5:
Which best describes the amortized time of incrementing a binary counter from 0 to n?
A. O(n log n)
B. O(n)
C. O(n log log n)
D. O(log n)
✅ Answer: B
Bit i flips once every 2ⁱ increments → total work for n increments is O(n)
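The bit-flip count for the binary counter can be verified directly; a small illustrative sketch:

```python
def total_flips(n):
    # Count bit flips while incrementing a counter from 0 up to n
    flips, value = 0, 0
    for _ in range(n):
        bit = 0
        while (value >> bit) & 1:  # trailing 1-bits flip to 0
            flips += 1
            bit += 1
        flips += 1                 # one 0-bit flips to 1
        value += 1
    return flips

# Bit i flips every 2^i increments, so the total stays below 2n:
# amortized O(1) per increment
for n in (1, 10, 1000):
    assert total_flips(n) < 2 * n + 1
```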
MCQ 6:
A. O(n log n)
B. O(n (log n)²)
C. O(n²)
D. O(n √log n)
✅ Answer: B
Use a recursion tree: summing the per-level costs gives Θ(n (log n)²).
MCQ 7:
Which of the following is most appropriate for analyzing an amortized complexity using
energy stored in the data structure?
A. Recursive decomposition
B. Aggregated analysis
C. Potential method
D. Divide and conquer
✅ Answer: C
Potential method assigns energy via a potential function.
MCQ 8:
A. O(log n)
B. O(n)
C. O(n log n)
D. O(2ⁿ)
✅ Answer: B
Linear recurrence: one call per level → total work = O(n)