
Greedy Algorithms: Concepts and Examples

The document discusses the greedy strategy in optimization problems, highlighting its characteristics such as local optimal choices, irrevocable decisions, and optimal substructure. It covers applications of the greedy method, including the fractional knapsack problem, job scheduling, and Huffman coding, while also noting that the greedy approach does not always guarantee an optimal solution. The document provides algorithms and examples to illustrate the greedy method's effectiveness in various scenarios.

Unit 3

Greedy strategy

- Dr. Varsha [Link]


Contents
• General approach,
• fractional knapsack problem using greedy
method,
• Job scheduling/sequencing using greedy
approach examples,
• Huffman coding,
• minimum spanning tree-Kruskal’s and Prim’s
algorithm
Greedy Approach
• The greedy method is one of the strategies like Divide and
conquer used to solve the problems.
• This method is used for solving optimization problems.
• An optimization problem is a problem that demands either
maximum or minimum results
• The greedy approach/technique is an algorithmic strategy
used in optimization problems.
• The Greedy Method is a problem-solving technique where we
build the solution step by step, choosing the option that looks
best at the current moment (locally optimal choice), hoping it
will lead to the overall best solution (globally optimal).
Characteristics of Greedy method
Greedy Choice Property – A locally optimal choice at each step leads to a globally optimal solution.
Optimal Substructure – An optimal solution of the problem contains optimal solutions to its sub-problems.
1. Local Optimal Choice

At each step, pick the choice that seems best right now.
Example: always take the largest coin first while making change.

2. Irrevocable Decisions

Once you make a choice, you don’t go back to change it later.


No backtracking.

3. Feasible Solution

Each choice should keep the solution valid (must satisfy problem constraints).
Example: when scheduling jobs, pick only jobs that can fit into the timeline.

4. Optimal Substructure Property

The global solution can be formed by combining solutions to smaller subproblems.


Example: shortest path in a graph → shortest path from A to C via B is built from shortest path A→B and B→C.

5. Greedy-Choice Property

There exists a way to reach an optimal solution by making greedy choices at each step.
If this property doesn’t hold, greedy method may fail.
Will greedy always give optimal solution?
No
• Example: here we have to find the largest-value path; taking the larger branch at each step does not always find it.
Will greedy always give optimal solution
• The greedy algorithm is not always the optimal solution for every
optimization problem, as shown in the example below.
• Consider the greedy approach to making change for the amount 20 with the coin
denominations [18, 1, 10].
• the algorithm starts by selecting the largest coin value that is less than
or equal to the target amount. In this case, the largest coin is 18,
• so the algorithm selects one 18 coin. After subtracting 18 from 20,
the remaining amount is 2.
• At this point, the greedy algorithm chooses the next largest coin less
than or equal to 2, which is 1.
• It then selects two 1 coins to make up the remaining amount. So, the
greedy approach results in using one 18 coin and two 1 coins.
• Although it uses three coins, a better solution would have been to
use two 10 coins, resulting in a total of only two coins (10 + 10 = 20).
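The failure above can be reproduced with a short sketch of the greedy change-making strategy (the class and method names here are illustrative, not from the slides):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class GreedyChange {
    // Repeatedly take the largest coin that still fits into the remaining amount.
    static List<Integer> makeChange(int amount, int[] coins) {
        int[] sorted = coins.clone();
        Arrays.sort(sorted);
        List<Integer> used = new ArrayList<>();
        for (int i = sorted.length - 1; i >= 0 && amount > 0; i--) {
            while (amount >= sorted[i]) {
                used.add(sorted[i]);
                amount -= sorted[i];
            }
        }
        return used;
    }

    public static void main(String[] args) {
        List<Integer> result = makeChange(20, new int[]{18, 1, 10});
        System.out.println(result);        // [18, 1, 1]
        System.out.println(result.size()); // 3, although 10 + 10 needs only 2 coins
    }
}
```

The sketch confirms the point of the example: the greedy choice of 18 first forces two extra 1-coins, while the optimal answer uses two 10-coins.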
Example
• The Greedy Algorithm works well in problems
where:
✅ Making the best choice locally leads to the
best global solution.
✅ There are sorting or priority-based
conditions (e.g., selecting the highest, lowest,
or shortest).
✅ The decisions are final (once taken, they are
not changed later).
Feasible-optimal ?
• Suppose there is a problem 'P'. I want to travel from A to B, shown as: P : A → B
• Constraint: reach within 6 hours at minimum cost.
• All modes of travel that satisfy the constraint (e.g., train, air) are feasible solutions.
• The train, which takes 6 hours at minimum cost, is the optimal solution.
Feasible-optimal ?
• If we say that we have to cover the journey at the minimum cost.
• This means that we have to complete the journey at the lowest
possible cost, so this problem is known as a minimization problem.
• Till now, we have two feasible solutions, i.e., one by train and
another one by air.
• Since travelling by train will lead to the minimum cost so it is an
optimal solution.
• An optimal solution is also the feasible solution, but providing the
best result so that solution is the optimal solution with the
minimum cost.
• There would be only one optimal solution.
General approach
Example of Greedy method
fractional knapsack problem using greedy
method
• Problem Statement: The weight
of N items and their corresponding
profits are given.
• We have to put these items in a
knapsack of weight W such that
the total profit obtained
is maximized.
• Fill in the knapsack with a subset
of items such that the selected
weight is less than or equal to the
capacity of the knapsack and the
profit of items is maximum.
• we can take fractions of items.
fractional knapsack problem using greedy
method
• There are basically three approaches to solve the
problem:
• The first approach is to select the item based on the
maximum profit.
• The second approach is to select the item based on
the minimum weight.
• The third approach is to calculate the ratio of
profit/weight.
• Among these we have to choose optimal one i.e last
one.
Example

Object   1    2    3
Profit   25   24   15
Weight   18   15   10

M = 20 (maximum knapsack capacity)

• 1. Greedy about profit: choose the object that gives the maximum profit first.
  Take object 1 fully (weight 18, x1 = 1); remaining capacity = 20 − 18 = 2.
  Take 2/15 of object 2 (x2 = 2/15); object 3 is left out (x3 = 0).
• Here the fraction xi = remaining bag capacity / object weight = 2/15.
• Calculate profit = Σ xi · profit
• = 1·25 + (2/15)·24 + 0·15
• = 25 + 48/15 = 25 + 3.2 = 28.2

• 2. Greedy about minimum weight: choose the lightest object first.
  Take object 3 fully (weight 10, x3 = 1); remaining capacity = 10.
  Take 10/15 of object 2 (x2 = 10/15); object 1 is left out (x1 = 0).
• Calculate profit = Σ xi · profit
• = 0·25 + (10/15)·24 + 1·15
• = 15 + 16 = 31

• 3. Greedy about the ratio profit/weight:

Object   1    2    3
Profit   25   24   15
Weight   18   15   10
P/W      1.4  1.6  1.5

  Take object 2 fully (weight 15, x2 = 1); remaining capacity = 5.
  Take 5/10 of object 3 (x3 = 5/10); object 1 is left out (x1 = 0).
• Calculate profit = Σ xi · profit
• = 0·25 + 1·24 + (5/10)·15
• = 24 + 7.5
• = 31.5
• Greedy Approach
• Calculate profit-to-Weight Ratio: Compute
p/w for each item.
• Sort Items: Sort items in decreasing order of
p/w.
• Pick Items Greedily:
– Add the highest p/w item fully if possible.
– If not, take the fraction that fits: fraction = remaining capacity / item weight.
• Continue until the knapsack is full.
Algorithm
Greedy Knapsack Algorithm:
GreedyFractionalKnapsack(items, n, capacity):
    // Each item has: profit[i], weight[i]

    // Step 1: Compute profit/weight ratio for each item
    for i = 1 to n:                                   // Time: O(n)
        ratio[i] = profit[i] / weight[i]

    // Step 2: Sort items in descending order of ratio
    sort items by ratio[i] in decreasing order        // O(n log n)

    totalProfit = 0          // maximum profit
    remainingCapacity = capacity

    // Step 3: Pick items one by one
    for i = 1 to n:                                   // O(n)
        if weight[i] <= remainingCapacity:
            // Take the whole item
            totalProfit = totalProfit + profit[i]
            remainingCapacity = remainingCapacity - weight[i]
        else:
            // Take the fraction of the item that fits
            fraction = remainingCapacity / weight[i]
            totalProfit = totalProfit + profit[i] * fraction
            remainingCapacity = 0
            break            // knapsack is full

    return totalProfit

Adding up the dominant terms:
O(n) + O(n log n) + O(n) = O(n log n)
Final Time Complexity: O(n log n)
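As a sketch only, the pseudocode above could be written in Java roughly as follows (the class and method names are assumptions, not from the slides). Run on the earlier three-object example (profits 25/24/15, weights 18/15/10, M = 20), it returns 31.5:

```java
import java.util.Arrays;
import java.util.Comparator;

public class FractionalKnapsack {
    // Returns the maximum profit using the greedy profit/weight strategy.
    static double maxProfit(int[] profit, int[] weight, int capacity) {
        int n = profit.length;
        Integer[] idx = new Integer[n];
        for (int i = 0; i < n; i++) idx[i] = i;
        // Sort item indices by profit/weight ratio, descending.
        Arrays.sort(idx, Comparator.comparingDouble(
                i -> -(double) profit[i] / weight[i]));
        double total = 0;
        int remaining = capacity;
        for (int i : idx) {
            if (weight[i] <= remaining) {   // take the whole item
                total += profit[i];
                remaining -= weight[i];
            } else {                        // take the fraction that fits
                total += profit[i] * (double) remaining / weight[i];
                break;                      // knapsack is full
            }
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(maxProfit(new int[]{25, 24, 15},
                                     new int[]{18, 15, 10}, 20)); // 31.5
    }
}
```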
Example
• Objects:    1   2   3   4   5   6   7
• Profit (P): 10  5   15  7   6   18  3
• Weight (w): 2   3   5   7   1   4   1
• M (capacity of the knapsack): 15
• n (number of items): 7
Step 1: Calculate P/W and sort in non-increasing order
• P/W: 5, 1.67, 3, 1, 6, 4.5, 3
• After sorting based on P/W ratio: objects 5, 1, 6, 3, 7, 2, 4
Step 2: Greedy Selection
• Knapsack capacity = 15
• Take objects 5, 1, 6, 3, 7 fully (total weight 13, profit 52); 2 units of capacity remain.
• Take 2/3 of object 2: total profit = 52 + (2/3)·5 ≈ 55.33
Huffman Coding
Huffman Coding
• Huffman Coding is a greedy algorithm used
for lossless data compression.
• It is used to reduce the number of bits needed
to store or transmit data.
• It was first developed by David Huffman.
• Data can be encoded efficiently using Huffman
Codes.
Huffman Coding
• It is a widely used and beneficial technique for
compressing data , in which there are
frequently occurring characters.
• It assigns variable-length codes to input
characters based on their frequency,
• It assigns shorter codes to more frequent
characters and longer codes to less frequent
characters.
• Suppose we have 15 characters in a data file.
• Normal Storage: 8 bits per character (ASCII) -
8 x 15 =120 bits are required to send this
string.
• Char ASCII Binary
A 65 01000001

B 66 01000010

C 67 01000011

D 68 01000100
• But we want to compress the file and save it
compactly.

• We can use----
• Fixed length coding
• Variable length coding
• (i) Fixed length Code: Each letter represented by an equal
number of bits. With a fixed length code, 2 bits per character:
• A---- 00
• B ---- 01
• C ---- 10
• D ---- 11
1. Data Encoding:
• With 4 characters (A, B, C, D), a fixed-length code needs 2 bits per
character; the code table itself costs 2 × 4 = 8 bits.
• For 15 characters, the encoded data uses:
15 characters × 2 bits = 30 bits
• To identify the characters themselves: 4 × 8 = 32 bits
• Total = 8 + 30 + 32 = 70 bits
• Problem: every character costs the same 2 bits no matter how often it
occurs, so a very frequent character gets no advantage.
• ii) Huffman coding (variable length Code ):
1. Count frequency of characters.
2. Build a binary tree where low-frequency
characters are deeper and high-frequency
characters are closer to the root.
3. Assign `0` to left edge and `1` to right edge.
4. Traverse tree to generate variable-length
codes
• Once the data is encoded, it has to be
decoded. Decoding is done using the same
tree.
• Huffman coding follows Greedy approach:

• The greedy method works by making the locally optimal choice


at each step, with the hope of reaching the global optimum.

In Huffman coding:

• At every step, we pick the two smallest frequency nodes


(locally optimal choice).
• Combine them into a new node whose frequency is the sum.
• Repeat until only one tree remains (global optimum).

• This greedy choice ensures that characters with higher


frequency get shorter codes.
• Steps for Huffman coding---
1. Count the frequency of each character.
2. Insert the characters and their frequencies into a priority queue Q
(min-heap), which keeps them ordered automatically.
3. Construct the Huffman Tree:
• Make each unique character a leaf node.
4. Extract the two nodes with minimum frequency (EXTRACT-MIN twice)
and create a new node z with frequency = the sum of those two
minimum frequencies.
5. Remove these two minimum frequencies from Q and add the sum back
into the list of frequencies (* denotes the internal nodes in the figure).
6. Insert node z into the tree.
7. Repeat steps 3 to 6 until only one tree remains.
8. For each non-leaf node, assign 0 to the left edge and 1 to the right
edge.
• Encode each char as
• A– 11
• B– 100
• C– 0
• D– 101
For sending the above string over a network, we need to send:
1. The compressed message bits (encoded string).
2. The codebook (mapping of characters to Huffman codes), so the
receiver can decode.
• Here we do not need to send the frequencies.
• The total size is given by the table below:
• 4 characters need to be identified → assume 8 bits each
→ 4 × 8 = 32 bits.
• The codes themselves (to tell the receiver how many bits per code)
→ 9 bits (as given).
• So, codebook size = 32 + 9 = 41 bits.
% Savings = (Savings ÷ Original size) × 100
= (51 ÷ 120) × 100
≈ 42.5 %
Decoding the code
• For decoding the code, we can take the code
and traverse through the tree to find the
character.
• Let 101 is to be decoded, we can traverse from
the root as in the figure below.
Huffman Coding Algorithm
Huffman_Coding(characters, frequencies):
Input:
characters[] → array of characters
frequencies[] → array of corresponding frequencies
Output:
Huffman tree and codes for characters

1. Create a priority queue Q (min-heap) of nodes


2. For each character c in characters:
create a node n with frequency = freq(c)
insert n into Q
Huffman Coding Algorithm
3. while size(Q) > 1:
       left  = extract_min(Q)   // least frequency node
       right = extract_min(Q)   // second least frequency node
       create a new node parent
       parent.freq  = left.freq + right.freq
       parent.left  = left
       parent.right = right
       insert parent into Q
Huffman Coding Algorithm
4. root = extract_min(Q) // final node is the root of
Huffman tree

5. Traverse the Huffman tree:


assign '0' to left edge, '1' to right edge
store the code for each character

6. return codes for all characters


Huffman Coding complexity analysis
• With a min-heap, each extract-min/insert costs O(log n); building the
tree performs n − 1 merges, so the total time is O(n log n).
Example

Char Frequency
A 5
B 9
C 12
D 13
E 16
F 45

Encoded String for "ABCD":
A -> 1100
B -> 1101
C -> 100
D -> 101
Encoded Output:
11001101100101
The Huffman Code is Prefix-Free
• Huffman codes are always prefix-free by design.
• Prefix-free means: no code is the starting part of
another code.
• This property ensures that an encoded message can
be uniquely decoded without ambiguity.
The Huffman Code is Prefix-Free
• Justification---
• In Huffman coding, every symbol is stored at a
leaf node of the tree.
• When we assign codes by moving left (0) or
right (1), we only stop at the leaves.
• Since all symbols are at leaves, no symbol’s
code can ever be the beginning of another’s
code.
Job scheduling/sequencing using
greedy approach
Job Sequencing Problem
• The Job Sequencing Problem is a classic
optimization problem in Greedy Algorithm
• where we aim to schedule a set of jobs in such
a way that we maximize the total profit
• while ensuring that each job is completed
within its deadline.
• Problem Statement
• Given N jobs, where each job has:
• Job ID – Unique identifier for each job.
• Deadline – The latest time by which the job
must be completed.
• Profit – The profit earned if the job is
completed within its deadline.
• Each job takes exactly 1 unit of time to
complete.
• The objective is to schedule the jobs in a way
that maximizes the total profit.
Constraints
• Each job takes exactly 1 unit of time to complete.
• Each job has a fixed deadline, meaning it must be
completed before or on its deadline.
• Each job has a profit, which is earned only if the
job is completed on time.
• Jobs cannot be preempted, meaning once a job
starts, it must complete before another job can
begin.
• A single job can be scheduled per time slot.
• A single processor (uniprocessor) executes the jobs.
Algorithm

• Input:
• An array of N jobs, each with a Job ID, Deadline, and Profit.
• Output:
The sequence of scheduled jobs.
The maximum profit obtained.
• Steps:
1. Sort all jobs in descending order of profit.
2. Find the maximum deadline among all jobs.
3. Create an array of time slots (size = maximum deadline).
4. Select the job with the highest profit and place it in the latest
available slot before its deadline.
5. If a free slot is found, schedule the job.
6. If no slot is available, skip the job.
7. Compute the total profit from the scheduled jobs.
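The steps above can be sketched in Java as follows (a simplified illustration; the `jobs[i] = {deadline, profit}` array layout and the method name are assumptions). On Example 1 below it returns 280:

```java
import java.util.Arrays;

public class JobSequencing {
    // jobs[i] = {deadline, profit}. Returns the maximum total profit.
    static int maxProfit(int[][] jobs) {
        // Step 1: sort by profit, descending.
        int[][] sorted = jobs.clone();
        Arrays.sort(sorted, (a, b) -> b[1] - a[1]);
        // Step 2: find the maximum deadline.
        int maxDeadline = 0;
        for (int[] job : sorted) maxDeadline = Math.max(maxDeadline, job[0]);
        // Step 3: one slot per time unit, all initially free.
        boolean[] taken = new boolean[maxDeadline];
        int total = 0;
        // Step 4: place each job in the latest free slot before its deadline.
        for (int[] job : sorted) {
            for (int slot = job[0] - 1; slot >= 0; slot--) {
                if (!taken[slot]) {
                    taken[slot] = true;
                    total += job[1];
                    break;
                }
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // Example 1 from the slides: deadlines 2,2,1,3,4 and profits 20,60,40,100,80.
        System.out.println(maxProfit(new int[][]{
            {2, 20}, {2, 60}, {1, 40}, {3, 100}, {4, 80}})); // 280
    }
}
```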
Examples
• Consider the following tasks with their
deadlines and profits. Schedule the tasks in
such a way that they produce maximum profit
after being executed −
S. No.     1   2   3   4   5
Jobs       J1  J2  J3  J4  J5
Deadlines  2   2   1   3   4
Profits    20  60  40  100 80
• Step 1 : Arrange the jobs in descending order of their
profits.

S. No.     1    2   3   4   5
Jobs       J4   J5  J2  J3  J1
Deadlines  3    4   2   1   2
Profits    100  80  60  40  20

• Step 2: Find the maximum deadline value, dm, from the given deadlines.
– max deadline = 4
– So we create a time slot array of size 4.
• Step 3: Schedule Jobs
• Choose the job with the highest profit, J4 (deadline 3).
• Place it in the latest free slot before its deadline; Total Profit = 100.

Time slots:  0   1   2   3   4
               J3  J2  J4  J5

Job considered  Slot assigned   Solution        Profit
J4              [2,3] slot 3    J4              100
J5              [3,4] slot 4    J4, J5          100+80=180
J2              [1,2] slot 2    J2, J4, J5      180+60=240
J3              [0,1] slot 1    J3, J2, J4, J5  240+40=280
J1              Rejected

Final Job Order: J3 → J2 → J4 → J5
Total Profit: 40 + 60 + 100 + 80 = 280
Example 2
S. No.     1   2   3   4   5
Jobs       J1  J2  J3  J4  J5
Deadlines  2   2   1   3   3
Profits    20  15  10  5   1
After sorting (the jobs are already in descending order of profit): Max Deadline = 3
Time slots:  0   1   2   3
               J2  J1  J4

Job considered  Slot assigned  Solution    Profit
J1              [1,2]          J1          20
J2              [0,1]          J2, J1      20+15=35
J3              Rejected
J4              [2,3]          J2, J1, J4  35+5=40
J5              Rejected

Total Profit = 40 and Job sequence: J2 → J1 → J4
Example 3
Jobs       J1  J2  J3  J4  J5  J6  J7
Deadlines  3   4   4   2   3   1   2
Profits    35  30  25  20  15  12  5
After sorting (the jobs are already in descending order of profit): Max Deadline = 4
Time slots:  0   1   2   3   4
               J4  J3  J1  J2

Job considered  Slot assigned  Solution        Profit
J1              [2,3]          J1              35
J2              [3,4]          J1, J2          35+30=65
J3              [1,2]          J3, J1, J2      65+25=90
J4              [0,1]          J4, J3, J1, J2  90+20=110
J5              Rejected
J6              Rejected
J7              Rejected

Total Profit = 110 and Job sequence: J4 → J3 → J1 → J2

Time Complexity
• Step 1: Sort jobs by profit in descending order – O(N log N)
• Step 2: Find the maximum deadline – O(N)
• Step 3: Create an array of time slots – O(N)
• Step 4: Place each job in the latest available slot before its
deadline – O(N²), since each job may scan all slots
• Final Time Complexity: O(N²) in the worst case,
O(N log N) in the best case
Program for worst case scenario:

import java.util.Arrays;

class Job {
    int id, profit, deadline;

    Job(int id, int profit, int deadline) {
        this.id = id;
        this.profit = profit;
        this.deadline = deadline;
    }
}

public class JobSequencingWorstCase {
    // Function to sort jobs by profit in descending order
    static void sortJobsByProfit(Job[] jobs) {
        Arrays.sort(jobs, (a, b) -> b.profit - a.profit);
    }

    // Function to find the maximum deadline among all jobs
    static int getMaxDeadline(Job[] jobs) {
        int maxDeadline = 0;
        for (Job job : jobs)
            maxDeadline = Math.max(maxDeadline, job.deadline);
        return maxDeadline;
    }

    // Function to perform Job Sequencing (worst case)
    static void jobSequencing(Job[] jobs, int n) {
        // Step 1: Sort jobs by profit (descending)
        sortJobsByProfit(jobs);

        // Step 2: Find max deadline and initialize all slots as empty
        int maxDeadline = getMaxDeadline(jobs);
        int[] slots = new int[maxDeadline];
        Arrays.fill(slots, -1);

        int totalProfit = 0;
        int[] jobSlot = new int[n]; // store the allocated slot for each job

        // Step 3: Assign jobs (worst case - scan all slots)
        for (int i = 0; i < n; i++) {
            for (int j = jobs[i].deadline - 1; j >= 0; j--) { // scan from deadline down
                if (slots[j] == -1) { // if slot is empty
                    slots[j] = jobs[i].id;
                    totalProfit += jobs[i].profit;
                    jobSlot[jobs[i].id - 1] = j + 1; // store allocated slot (1-based)
                    break;
                }
            }
        }

        // Output scheduled jobs with allocated slots
        System.out.println("Job ID | Allocated Slot");
        for (int i = 0; i < n; i++) {
            if (jobSlot[i] != 0)
                System.out.println("J" + (i + 1) + "     | Slot " + jobSlot[i]);
        }
        System.out.println("Maximum Profit: " + totalProfit);
    }

    // Driver code (worst case scenario)
    public static void main(String[] args) {
        int n = 7; // number of jobs (increase n to test the worst case)
        // Worst case: all jobs have the same large deadline (n)
        Job[] jobs = {
            new Job(1, 50, 7),
            new Job(2, 40, 7),
            new Job(3, 30, 7),
            new Job(4, 20, 7),
            new Job(5, 10, 7),
            new Job(6, 5, 7),
            new Job(7, 1, 7)
        };
        jobSequencing(jobs, n);
    }
}
output
• Job ID | Allocated Slot
• J1 | Slot 7
• J2 | Slot 6
• J3 | Slot 5
• J4 | Slot 4
• J5 | Slot 3
• J6 | Slot 2
• J7 | Slot 1
• Maximum Profit: 156
minimum spanning tree-
Kruskal’s and Prim’s algorithm
Minimum Spanning Tree (MST) Using
Greedy Approach
• Minimum Spanning Tree (MST) is a subset of
edges in a connected, weighted, and undirected
graph that:
– Connects all vertices without forming cycles.
– Has minimum total edge weight among all spanning
trees.
• Why is it called Greedy Algorithm?
• It is called greedy because it chooses the optimal
solution present at the moment and not the
optimal solution as a whole.
How to find cost of Minimum Spanning Tree?

• The cost of a spanning tree is the sum of the


weights of its edges.
• A Minimum Spanning Tree (MST) is the
spanning tree with the lowest cost among all
possible spanning trees.
• There can be multiple MSTs for a graph.
• All vertices are connected without cycles.
• For n vertices, the MST has n-1 edges.
• Algorithms to Find MST:
• Prim’s Algorithm – Greedy approach, builds MST
one edge at a time.
• Kruskal’s Algorithm – Sorts edges by weight and
picks the smallest while avoiding cycles.
Applications of MST:
• Network Design (Internet, roads, electrical
circuits).
• Clustering in Machine Learning.
• Approximation Algorithms for NP-Hard Problems.
• MST helps in efficiently connecting nodes with
minimum cost.
Prim’s Algorithm
Prim’s Algorithm

What is Prim’s Algorithm?


• Prim’s Algorithm is a Greedy Algorithm used to
find the Minimum Spanning Tree (MST) of a
connected, weighted, undirected graph.
How Prim’s Algorithm Works?
• Start with any vertex as the initial MST.
• Select the smallest edge that connects a new
vertex to the MST.
• Repeat until all vertices are included in the
MST.
Prim’s Algorithm
Properties of Prim’s Algorithm
• Always maintains a connected MST while adding edges.
• Uses Priority Queue (Min Heap) for efficiency.
Why Prim’s is a Greedy Algorithm?
• At each step, it chooses the smallest edge that expands the
MST.
• The local optimal choice leads to a globally optimal MST.
• Prim’s Algorithm is used for network design, clustering, and
efficient connectivity.
Example of prim's algorithm
• Step 1 - First, we have to choose a vertex from
the above graph. Let's choose B.
• Step 2 - Now, we have to choose and add the
shortest edge from vertex B.
• There are two edges from vertex B that are B
to C with weight 10 and edge B to D with
weight 4.
• Among the edges, the edge BD has the
minimum weight. So, add it to the MST.
• Step 3 - Now, again choose the edge with the minimum weight among
the remaining edges.
• In this case, the candidate edges are DE and CD; both would bring in
the new adjacent vertices of C, i.e., E and A.
• Select the edge DE and add it to the MST.
• Step 4 - Now, select the edge CD, and add it to
the MST.
• Step 5 - Now, choose the edge CA. Here, we cannot
select the edge CE as it would create a cycle to the
graph. So, choose the edge CA and add it to the MST

• So, the graph produced in step 5 is the minimum spanning tree of the
given graph. The cost of the MST is given below -
• Cost of MST = 4 + 2 + 1 + 3 = 10 units.
Prim’s Algorithm
A = ∅           // A stores the edges of the MST (initially empty)
U = {1}         // U starts with a single vertex (say, vertex 1)
while (U ≠ V):  // loop until all vertices are in U
    let (u, v) be the least-weighted edge such that
        u ∈ U and v ∈ V - U     // u already in the MST, v not yet in it
    A = A ∪ {(u, v)}            // add the edge (u, v) to the MST
    U = U ∪ {v}                 // mark v as included in the MST
// repeat until U contains all vertices (U = V)
Example 2
Time complexity of Prim’s algorithm

• The time complexity of Prim’s algorithm depends


on the data structures used for the priority queue
(min-heap) and adjacency representation.
• Best and Worst Case Time Complexity
• 1. Using an Adjacency Matrix + Simple Min
Selection (O(V²))
– Each vertex is checked for the minimum edge, leading to
O(V²) complexity.
– Best Case: O(V²)
– Worst Case: O(V²)
• Complexity O(E log V) for Prim’s Algorithm Using a Min-Heap
and Adjacency List
1. Extract-Min Operation (O(log V))
• The algorithm picks the minimum-weight edge V times from a
priority queue (min-heap).
• Each extraction takes O(log V).
• Total contribution: O(V log V).
2. Edge relaxation (a cheaper connecting edge is found) – O(log V) per
edge update
• Every vertex is connected through E edges in total.
• Each edge update in the priority queue (decrease-key
operation) takes O(log V).
3. Total contribution: O(E log V).
• Final Complexity: O(V log V + E log V) = O(E log V) (since E ≥ V
in a connected graph).
Kruskal's Algorithm
Kruskal's algorithm
• Kruskal's algorithm is a
minimum spanning tree algorithm that takes a
graph as input and finds the subset of the
edges of that graph which
• form a tree that includes every vertex
• has the minimum sum of weights among all
the trees that can be formed from the graph
How Kruskal's algorithm works

• It falls under a class of algorithms called


greedy algorithms that find the local optimum
in the hopes of finding a global optimum.
• We start from the edges with the lowest
weight and keep adding edges until we reach
our goal.
How Kruskal's algorithm works

• The steps for implementing Kruskal's


algorithm are as follows:
• Sort all the edges from low weight to high
• Take the edge with the lowest weight and add
it to the spanning tree. If adding the edge
created a cycle, then reject this edge.
• Keep adding edges until we reach all vertices.
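These steps can be sketched with a simple union-find structure to detect cycles (the method names and the test graph are assumed, not from the slides):

```java
import java.util.Arrays;

public class KruskalMST {
    static int[] parent;

    // Union-find "find" with path compression.
    static int find(int x) {
        return parent[x] == x ? x : (parent[x] = find(parent[x]));
    }

    // edges[i] = {weight, u, v}. Returns the total MST cost for n vertices.
    static int mstCost(int n, int[][] edges) {
        parent = new int[n];
        for (int i = 0; i < n; i++) parent[i] = i;
        Arrays.sort(edges, (a, b) -> a[0] - b[0]); // step 1: sort edges by weight
        int total = 0;
        for (int[] e : edges) {
            int ru = find(e[1]), rv = find(e[2]);
            if (ru != rv) {        // step 2: skip edges that would form a cycle
                parent[ru] = rv;   // merge the two components
                total += e[0];
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // An assumed 5-vertex example graph given as {weight, u, v} edges.
        int[][] edges = {
            {4, 0, 1}, {2, 0, 2}, {1, 1, 2}, {5, 1, 3},
            {8, 2, 3}, {10, 2, 4}, {2, 3, 4}
        };
        System.out.println(mstCost(5, edges)); // 10
    }
}
```

Note how the cycle check is just "do u and v already have the same root?", which is exactly the rejection rule in the step list above.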
