# Algorithm Examination Solutions
## Question 1: Quicksort and Algorithm Runtime Analysis
### (a) Why running times aren't based on system times
Running times of algorithms are not measured using a computer system's clock for three key reasons:
1. Hardware dependency - Different computers have varying processing speeds
2. Implementation variations - The same algorithm can be coded differently in different languages
3. Need for platform-independent analysis - Algorithms must be comparable theoretically, regardless of hardware
### (b) Quicksort Analysis
#### (i) Problems with first element as pivot
When taking the first element as pivot, the main problems are:
- If the array is already sorted or reverse sorted, it leads to worst-case performance
- Creates unbalanced partitions in nearly sorted arrays
- Results in O(n²) time complexity in these cases
#### (ii) Achieving balanced partitioning
Balanced partitioning can be achieved by:
- Using the median-of-three method (first, middle, last elements); see the sketch after this list
- Randomly selecting the pivot
- Using the middle element as pivot
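The median-of-three strategy mentioned above can be sketched as follows; this is a minimal illustration, and the helper name median_of_three is an assumption rather than part of the original answer:

```python
def median_of_three(arr, low, high):
    # Illustrative helper: return the index of the median of the
    # first, middle, and last elements of the sub-array
    mid = (low + high) // 2
    candidates = [(arr[low], low), (arr[mid], mid), (arr[high], high)]
    candidates.sort(key=lambda pair: pair[0])
    return candidates[1][1]
```

The chosen index can then be swapped with arr[high] so a last-element partition routine, such as the one in part (iii), can be reused unchanged.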
#### (iii) Partition Algorithm
```python
def partition(arr, low, high):
    # Choose the rightmost element as the pivot
    pivot = arr[high]
    # i tracks the boundary of the "smaller than pivot" region
    i = low - 1
    # Compare each element with the pivot
    for j in range(low, high):
        # If the current element is smaller than or equal to the pivot
        if arr[j] <= pivot:
            # Grow the smaller region and swap the element into it
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot in its correct sorted position
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1
```
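A minimal recursive driver built on the partition function above, assuming the list is sorted in place; the name quicksort and the example data are illustrative:

```python
def quicksort(arr, low, high):
    # Recursively sort the sub-arrays on either side of the pivot
    if low < high:
        p = partition(arr, low, high)   # pivot now sits in its final position
        quicksort(arr, low, p - 1)      # sort the elements smaller than the pivot
        quicksort(arr, p + 1, high)     # sort the elements larger than the pivot

# Example usage
data = [9, 3, 7, 1, 6]
quicksort(data, 0, len(data) - 1)
print(data)  # [1, 3, 6, 7, 9]
```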
#### (iv) Time Complexity Analysis
- Best case: O(n log n)
  - Occurs when the pivot divides the array into two equal halves
  - Each partition pass takes O(n)
  - There are log n levels of recursion
- Average case: O(n log n)
  - Occurs with random pivot selection
  - Partitioning is balanced most of the time
  - Performance is similar to the best case
- Worst case: O(n²)
  - Occurs with an already sorted or reverse-sorted array
  - Produces the most unbalanced partitioning
  - n levels of recursion with O(n) work at each level
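These cases follow from the standard quicksort recurrences: in the best case T(n) = 2T(n/2) + O(n), which solves to O(n log n); in the worst case T(n) = T(n − 1) + O(n), which solves to O(n²).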
## Question 2: Algorithms and Big O Notation
### (a) Importance of Algorithms in Software Development [3 marks]
1. Efficiency: Algorithms ensure optimal resource utilization
2. Scalability: Well-designed algorithms handle growing data efficiently
3. Performance: Good algorithms improve program execution speed
### (b) Reasons for Using Big O Notation [12 marks]
1. Hardware Independence
- Measures efficiency without hardware specifics
- Provides platform-neutral complexity analysis
2. Focus on Growth Rate
- Shows how performance scales with input size
- Ignores constant factors that vary by implementation
3. Worst-Case Analysis
- Provides upper bounds on resource usage
- Helps in capacity planning and resource allocation
4. Standardization
- Common language for comparing algorithms
- Simplifies algorithm selection decisions
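For example, implementations taking 5n + 3 and 200n + 50 steps are both O(n): Big O keeps only the dominant growth term, so machine- and implementation-dependent constants drop out and the two can be compared on equal footing.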
### (c) Recursive Sum of Even Numbers [10 marks]
```python
def sumEvenNumbers(n):
    # Base cases: nothing left to add
    if n <= 0:
        return 0
    if n == 1:
        return 0
    # If n is even, include it in the sum
    if n % 2 == 0:
        return n + sumEvenNumbers(n - 1)
    # If n is odd, skip it
    else:
        return sumEvenNumbers(n - 1)
```
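For example, sumEvenNumbers(10) returns 2 + 4 + 6 + 8 + 10 = 30, and sumEvenNumbers(7) returns 2 + 4 + 6 = 12.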
## Question 4: Stack and Queue Data Structures
### (a) Working Principles
Stack:
- LIFO (Last In First Out) structure
- Elements are added and removed from the same end
- Like a stack of plates - last plate placed is first to be removed
Queue:
- FIFO (First In First Out) structure
- Elements are added at one end and removed from the other
- Like a line of people - first person in line is first to be served
### (b) Operations
Stack Operations:
1. Push - Add element to top
2. Pop - Remove element from top
3. Peek/Top - View top element
4. isEmpty - Check if stack is empty
5. isFull - Check if stack is full
Queue Operations:
1. Enqueue - Add element to rear
2. Dequeue - Remove element from front
3. Front - View front element
4. Rear - View rear element
5. isEmpty/isFull - Check queue status
### (c) Implementation
#### (i) Enqueue Implementation
```python
def enqueue(queue, maxSize, rear, data):
    # Queue is full when rear has reached the last index
    if rear == maxSize - 1:
        return "Queue is full"
    # Advance rear and store the new element
    rear += 1
    queue[rear] = data
    return rear
```
#### (ii) Pop Implementation
```python
def pop(stack, top):
    # Stack is empty when top is -1
    if top == -1:
        return "Stack is empty"
    # Read the top element, then move top down
    data = stack[top]
    top -= 1
    return data, top
```
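A brief usage sketch for the two routines above, assuming a fixed-size list for the queue and a Python list for the stack; the variable names and values are illustrative:

```python
# Queue: rear starts at -1 to indicate an empty queue
queue = [None] * 3
rear = -1
rear = enqueue(queue, 3, rear, 10)   # queue = [10, None, None], rear = 0
rear = enqueue(queue, 3, rear, 20)   # queue = [10, 20, None],   rear = 1

# Stack: top is the index of the current top element
stack = [3, 7, 9]
top = 2
data, top = pop(stack, top)          # data = 9, top = 1
```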
## Question 6: Asymptotic Efficiency and Algorithm Design
### a) Basic Asymptotic Efficiency Classes [16 marks]
1. Constant - O(1)
- Execution time stays constant regardless of input size
- Example: Array access, basic arithmetic operations
2. Logarithmic - O(log n)
- Execution time increases logarithmically with input size
- Example: Binary search (sketched after this list)
3. Linear - O(n)
- Execution time increases linearly with input size
- Example: Linear search
4. Log-linear - O(n log n)
- Combination of linear and logarithmic growth
- Example: Merge sort, Quick sort (average case)
5. Quadratic - O(n²)
- Execution time increases with square of input size
- Example: Bubble sort, Selection sort
6. Cubic - O(n³)
- Execution time increases with cube of input size
- Example: Matrix multiplication (naive)
7. Exponential - O(2ⁿ)
- Execution time roughly doubles with each additional input element
- Example: Recursive Fibonacci
8. Factorial - O(n!)
- Execution time grows factorially with input size
- Example: Generating all permutations
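As a minimal sketch of the logarithmic class listed above, an iterative binary search on a sorted list halves the search range at every step:

```python
def binary_search(arr, target):
    # arr must already be sorted in ascending order
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid        # found: return the index
        elif arr[mid] < target:
            low = mid + 1     # discard the left half
        else:
            high = mid - 1    # discard the right half
    return -1                 # target not present
```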
### b) Omega Notation [4 marks]
Omega notation (Ω) represents an asymptotic lower bound on an algorithm's running time: it gives the minimum time an algorithm requires for sufficiently large inputs. Formally, for a given function g(n), Ω(g(n)) is the set of functions f(n) for which there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀.
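For instance, f(n) = 3n² + 2n is Ω(n²), since 3n² + 2n ≥ 1·n² for all n ≥ 1 (take c = 1 and n₀ = 1).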
### c) Algorithm Design Technique [5 marks]
An algorithm design technique is a general approach or strategy
used to solve computational problems systematically. It provides a
framework for developing efficient solutions to problems by
following established patterns and methodologies. These techniques
help in breaking down complex problems into manageable sub-problems and finding optimal solutions.
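Common examples of such techniques include divide and conquer, greedy methods, and dynamic programming. As a minimal illustration of divide and conquer (the function name and problem choice are assumptions made for this sketch), the maximum of a list can be found by splitting the range in half and combining the results:

```python
def find_max(arr, low, high):
    # Base case: a single element is its own maximum
    if low == high:
        return arr[low]
    # Divide: split the range in half
    mid = (low + high) // 2
    # Conquer: solve each half recursively, then combine the answers
    left_max = find_max(arr, low, mid)
    right_max = find_max(arr, mid + 1, high)
    return max(left_max, right_max)
```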
## Question 7: Algorithm Efficiency and Problem Types
### a) Types of Algorithm Efficiencies [10 marks]
1. Time Efficiency [5 marks]
- Measures how fast an algorithm runs
- Analyzed using big O notation
- Considers number of basic operations
- Depends on input size
- Critical for performance optimization
2. Space Efficiency [5 marks]
- Measures memory usage of algorithm
- Includes both auxiliary and input space
- Considers variable storage requirements
- Important for memory-constrained systems
- Trades off with time efficiency
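A small illustration of the time/space trade-off noted in the list above (a standard memoization sketch, not part of the original answer): storing previously computed Fibonacci values uses O(n) extra space but reduces the running time from exponential to linear.

```python
def fib(n, memo=None):
    # Memoization: trade O(n) extra space for linear running time
    if memo is None:
        memo = {}
    if n <= 1:
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```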
### b) Important Problem Types [10 marks]
1. Sorting Problems [2 marks]
- Arranging data in specific order
- Examples: Quick sort, Merge sort
2. Searching Problems [2 marks]
- Finding specific elements in data structures
- Examples: Binary search, Linear search
3. Graph Problems [2 marks]
- Network analysis and pathfinding
- Examples: Shortest path, Minimum spanning tree
4. Dynamic Programming Problems [2 marks]
- Problems with overlapping subproblems
- Examples: Knapsack problem, Fibonacci sequence
5. String Processing Problems [2 marks]
- Text manipulation and pattern matching
- Examples: String matching, Regular expressions
### c) Worst-case Efficiency [5 marks]
Worst-case efficiency represents the maximum time or space
required by an algorithm for any input of size n. It provides an upper
bound on resource usage and is crucial for:
- Guaranteeing performance limits
- Resource allocation planning
- Comparing algorithms reliably
- System design decisions
- Critical application requirements