
DAA-4

Sunday, April 27, 2025 1:36 PM

Unit 4: Greedy Algorithms (6 Hrs)


4.1 Basics
• Define greedy algorithm.
• What is an optimization problem?
• When does greedy strategy provide optimal solutions?
• What is Greedy Paradigm?

4.2 Greedy Problems & Algorithms


1. Fractional Knapsack Problem
• Explain the fractional knapsack problem with algorithm and time complexity.
2. Job Sequencing with Deadlines
• Write down the job sequencing with deadlines algorithm.
• Analyze its time complexity.
• Solve:
Jobs = {1,2,3,4,5},
Profits = {20,10,5,15,1},
Deadlines = {2,1,3,2,3}.
3. Minimum Spanning Tree (MST)
• Explain Kruskal’s algorithm for MST with example.
• Explain Prim’s algorithm for MST, trace it for a graph, and analyze its complexity.
4. Single Source Shortest Path
• Write Dijkstra’s algorithm for single source shortest path with example.
• What is the shortest path problem?

4.3 Huffman Coding


• What is Huffman coding?
• What is a prefix code? How does the Huffman algorithm generate prefix codes?
• Write Huffman coding algorithm and analyze its complexity.
• Solve:
Find Huffman codes for:
{a:30, b:20, c:25, d:15, e:35}
and
for string "CYBER CRIME".
• Solve for characters {p,q,r,s,t,u,v} with frequencies {40,20,15,12,8,3,2}.

Optimization Problem
• An optimization problem is a problem where you want to maximize or minimize a certain value.
• It has:
○ A set of inputs (constraints or rules).
○ A goal to find the best possible solution.
Example:
• Find the shortest path from one city to another. (Minimize distance)
• Fill a bag with items to maximize profit without exceeding weight. (Maximize value)

Optimal Solution
• An optimal solution is the best possible answer among all feasible solutions.
• It satisfies all constraints and optimizes (maximizes or minimizes) the goal.
Example:
• In the knapsack problem, an arrangement of items that gives the maximum total profit without exceeding the weight limit.
• In Dijkstra's algorithm, the shortest path found from source to each vertex is the optimal solution.

Greedy Algorithms
• A greedy algorithm makes decisions step-by-step, picking the best option at the current moment without
worrying about future consequences.
• It builds the solution part-by-part, always choosing the locally optimal choice.
• Once a decision is made, it is never reconsidered.

Key properties :
• Greedy Choice Property:
A global optimum can be achieved by choosing local optima at each step.
• Optimal Substructure:
An optimal solution to the overall problem includes optimal solutions to subproblems.
If both properties exist, greedy algorithms guarantee an optimal solution.

Fractional Knapsack Problem


Problem Statement:
You are given:
• A set of items, each with:
○ a weight and
○ a value.
• A knapsack with a maximum weight capacity.
You must maximize the total value you put in the knapsack.
You can break items into smaller pieces (i.e., take fractions of an item).

Key Idea:
• Take items with the highest value/weight ratio first.
• If the full item fits, take it.
• If not, take the fraction that fits.
This is where "greedy" comes in — always pick the best option at the moment.

Algorithm Steps:
1. Calculate value/weight ratio for each item.
2. Sort all items by value/weight ratio in descending order.
3. Initialize total value = 0, total weight = 0.
4. For each item:
○ If the item can fully fit:
 Add its full value.
 Increase weight.
○ Else:
 Take the fraction that fits.
 Add proportional value.
 Knapsack is now full → Stop.
5. Return the total value.

Pseudocode:
FractionalKnapsack(items, capacity):
    for each item, calculate value/weight ratio
    sort items by value/weight ratio (highest first)

    totalValue = 0
    remainingCapacity = capacity

    for each item in sorted items:
        if item's weight <= remainingCapacity:
            take full item
            totalValue += item's value
            remainingCapacity -= item's weight
        else:
            take fraction of item
            totalValue += item's value * (remainingCapacity / item's weight)
            break    // knapsack is full

    return totalValue
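The pseudocode above can be sketched as a runnable Python function (the function name and input format are illustrative, not from the notes):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; returns the maximum total value."""
    # Sort by value/weight ratio, highest first.
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    total_value = 0.0
    remaining = capacity
    for value, weight in items:
        if weight <= remaining:
            # Full item fits: take all of it.
            total_value += value
            remaining -= weight
        else:
            # Take only the fraction that fits, then stop: knapsack is full.
            total_value += value * (remaining / weight)
            break
    return total_value
```

Calling it with the example given below, `fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50)`, returns 240.0.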

Time Complexity:
• Sorting items: O(n log n)
• Taking items: O(n)
Overall:
O(n log n)

Small Example:
Suppose you have:

Item   Value   Weight
 1       60      10
 2      100      20
 3      120      30
Knapsack capacity = 50.
• Value/Weight:
○ Item 1: 60/10 = 6
○ Item 2: 100/20 = 5
○ Item 3: 120/30 = 4
Sort in descending order: Item 1, Item 2, Item 3.
Take:
• Full Item 1: Weight = 10, Value = 60
• Full Item 2: Weight = 20, Value = 100
• Remaining capacity = 50 - 30 = 20
• Take 20/30 fraction of Item 3:
○ Value = 120 × (20/30) = 80
Total value = 60 + 100 + 80 = 240.

Job Sequencing with Deadlines (Greedy Approach)


The Job Sequencing with Deadlines problem is a scheduling problem where you are given jobs with deadlines and
profits. Each job has a deadline by which it should be completed. The goal is to maximize the profit by scheduling
jobs in such a way that they are completed within their deadlines.
Problem Explanation:
• Given:
○ A set of jobs, each with a profit and deadline.
○ The jobs can only be done once and must be completed before or on their deadline.
• Objective:
○ Schedule jobs to maximize profit, while ensuring that the jobs are completed on time.

Steps to Solve:
1. Sort the jobs by profit in decreasing order (we prioritize higher-profit jobs).
2. Assign each job to the latest possible free time slot on or before its deadline.
3. If a slot is already taken, move to the next available slot.
Pseudocode:
JobSequencingWithDeadlines(jobs, n):
    sort jobs by profit in decreasing order
    create an array result[] to store the job sequence
    create a boolean array slots[] to track available time slots

    for each job in sorted jobs:
        for t = job.deadline down to 1:
            if slot t is available:
                assign job to slot t
                add job's profit to totalProfit
                mark slot t as occupied
                break

    return totalProfit and result (the job sequence)
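A minimal Python sketch of the pseudocode above (the function name and the `(name, profit, deadline)` tuple format are illustrative):

```python
def job_sequencing(jobs):
    """jobs: list of (name, profit, deadline); returns (total_profit, schedule)."""
    # Sort jobs by profit, highest first.
    jobs = sorted(jobs, key=lambda j: j[1], reverse=True)
    max_deadline = max(d for _, _, d in jobs)
    slots = [None] * (max_deadline + 1)   # slots[1..max_deadline]; index 0 unused
    total_profit = 0
    for name, profit, deadline in jobs:
        # Find the latest free slot at or before this job's deadline.
        for t in range(deadline, 0, -1):
            if slots[t] is None:
                slots[t] = name
                total_profit += profit
                break
    schedule = [s for s in slots[1:] if s is not None]
    return total_profit, schedule
```

On the example below, `job_sequencing([("J1", 20, 2), ("J2", 15, 1), ("J3", 10, 2), ("J4", 5, 3), ("J5", 1, 3)])` schedules J1, J2, and J4 for a total profit of 40.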

Example:
Given the following jobs with their respective profits and deadlines:

Job   Profit   Deadline
J1      20        2
J2      15        1
J3      10        2
J4       5        3
J5       1        3
Steps:
1. Sort jobs by profit:
Sorted jobs:
○ J1 (Profit: 20, Deadline: 2)
○ J2 (Profit: 15, Deadline: 1)
○ J3 (Profit: 10, Deadline: 2)
○ J4 (Profit: 5, Deadline: 3)
○ J5 (Profit: 1, Deadline: 3)
2. Schedule the jobs:
○ Start with J1: Deadline is 2, slot 2 is available, so schedule J1.
○ Next, J2: Deadline is 1, slot 1 is available, so schedule J2.
○ Now, J3: Deadline is 2, slot 2 is taken, but slot 1 is also taken. J3 can't be scheduled.
○ J4: Deadline is 3, slot 3 is available, so schedule J4.
○ J5: Deadline is 3, slot 3 is taken, so J5 can't be scheduled.
Result:
• Scheduled jobs: J1, J2, J4
• Total Profit = 20 + 15 + 5 = 40
Complexity:
• Time Complexity:
○ Sorting the jobs takes O(n log n).
○ Scheduling the jobs takes O(n × d), where d is the maximum deadline; in the worst case d = n, giving O(n²).
○ So, overall: O(n²)
• Space Complexity:
○ The space complexity is O(n) for the result and slots array.
Complexity Analysis (Kruskal's Algorithm):
• Time Complexity:
○ Sorting the edges takes O(E log E) where E is the number of edges.
○ Union-Find operations (with path compression and union by rank) take O(α(V)) where α is
the inverse Ackermann function and V is the number of vertices. This is almost constant
time for practical inputs.
○ Overall time complexity: O(E log E).
• Space Complexity:
○ The space complexity is O(V + E) because we store the edges, vertices, and disjoint-set data
structure.
Time Complexity (Prim's Algorithm with a Min-Heap):
• Time Complexity = O(E log V)
Why?
• Each edge causes at most one heap operation (insert/update).
• Heap operations (insert, decrease key, extract min) take O(log V) time.
• Total edges = E, so total time = E × log V = O(E log V).
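The heap-based approach described above can be sketched as follows (a minimal Prim's implementation using Python's `heapq`; the adjacency-list format and graph values are illustrative):

```python
import heapq

def prim_mst_weight(adj, start=0):
    """adj: {u: [(v, w), ...]} for an undirected graph; returns total MST weight."""
    visited = set()
    heap = [(0, start)]                  # (edge weight, vertex)
    total = 0
    while heap and len(visited) < len(adj):
        w, u = heapq.heappop(heap)       # extract-min: O(log V)
        if u in visited:
            continue                     # stale entry; skip
        visited.add(u)
        total += w
        for v, wv in adj[u]:             # each edge pushed at most once per endpoint
            if v not in visited:
                heapq.heappush(heap, (wv, v))
    return total
```

For the triangle graph `{0: [(1, 1), (2, 3)], 1: [(0, 1), (2, 2)], 2: [(0, 3), (1, 2)]}`, the MST uses edges of weight 1 and 2, so the function returns 3.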
Time Complexity (Dijkstra's/Prim's with Adjacency Matrix + Linear Search):
• For each vertex, you scan all other vertices to find the minimum distance vertex.
• Work per step: O(V)
• Total steps: V
Thus,
Total Time = O(V × V) = O(V²).
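The O(V²) per-step linear scan described above can be sketched as follows, shown here for Dijkstra's single-source shortest path (the matrix format with `None` for "no edge" is illustrative):

```python
def dijkstra_matrix(graph, source):
    """graph: V x V adjacency matrix, graph[u][v] = weight or None; returns distances."""
    V = len(graph)
    INF = float("inf")
    dist = [INF] * V
    dist[source] = 0
    done = [False] * V
    for _ in range(V):                              # V steps in total
        # Linear scan for the closest unfinished vertex: O(V) per step.
        u = min((v for v in range(V) if not done[v]),
                key=lambda v: dist[v], default=None)
        if u is None or dist[u] == INF:
            break                                   # remaining vertices unreachable
        done[u] = True
        for v in range(V):                          # relax all outgoing edges of u
            if graph[u][v] is not None and dist[u] + graph[u][v] < dist[v]:
                dist[v] = dist[u] + graph[u][v]
    return dist
```

For the matrix `[[None, 4, 1], [None, None, None], [None, 2, None]]` with source 0, the path 0 → 2 → 1 (cost 3) beats the direct edge 0 → 1 (cost 4), so the result is `[0, 3, 1]`.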
