
Shortest Path

References:
1. Jeff Erickson, Algorithms, Chapter 8
2. Steven Skiena, The Algorithm Design Manual, Chapter 6

Problem: Given a directed weighted graph G = (V, E, w) with two special nodes s and t, what is the shortest path from s to t? (The length of a path is the sum of the weights of all the edges on the path.)
Shortest Path Trees 2/8

To determine the shortest path from s to t, almost all algorithms actually solve the more general single source shortest path (SSSP) problem:

SSSP: find the shortest path from s to every other node.

This problem is solved by finding a shortest path tree, rooted at s, which contains all the desired shortest paths.
Why do all shortest paths constitute a tree?
1. If all shortest paths are unique, then the union of the shortest paths is a tree (recall: unique paths towards a root node → tree).
2. If there are multiple shortest paths to t, we can pick and choose to make the union a tree. For example: if s → u (solid) and s → v (dashed) are two shortest paths, then the paths from a to d through solid edges and through dashed edges must both be shortest, and we can simply include the solid edges in our tree.
Differences between Minimum Spanning Tree (MST) and Shortest Path Tree (SPT):
1. MST is for undirected graphs; SPT is for directed graphs (it can be adapted to undirected graphs too, with some caveats).
2. MST does not have a root; SPT does.
3. The MST depends only on the graph; SPTs are distinct for different roots.
Negative Edges 3/8

Negative edges are a pain for SSSP.

If a cycle has negative total weight, then the shortest path may not be well-defined!

For example, what is the shortest path from s to t?

Negative edges also make our algorithms for undirected graphs complicated: the union of all shortest paths need not be a tree. We don't treat undirected graphs with negative edges here.
The Only SSSP Algorithm 4/8

Just like graph traversal and MST algorithms, there is a meta SSSP
algorithm:

Each node v maintains two values which describe a tentative shortest path from the source s to v:

• dist(v): the length of the tentative shortest path; ∞ if no path has been found.

• pred(v): the predecessor of v on the tentative shortest path; NULL if no path has been found.

The pred(v) pointers define a tree rooted at the source s, just like parent(v) in WhateverFirstSearch().
The SSSP algorithm:
• Initialize dist(v) ← ∞ and pred(v) ← NULL for every v ≠ s.
  Initialize dist(s) ← 0 and pred(s) ← NULL.
• During the execution of the algorithm, an edge u → v is tense if

    dist(u) + w(u → v) < dist(v)

  For a tense edge u → v, the tentative shortest path stored at v is clearly incorrect (an overestimate): we have already found a better route through u → v.
• We can correct the overestimate by relaxing the edge:

    Relax(u → v):
        dist(v) ← dist(u) + w(u → v)
        pred(v) ← u

Now that we have all the pieces, the SSSP algorithm can be stated:

    Repeatedly relax tense edges, until no tense edges remain.

After the algorithm terminates (no tense edges exist), all dist(v) will be correct, and pred(v) will define an SPT.
Some subtleties:
• If dist(v) = ∞, then v is not reachable from the source s.
• If any negative cycle is reachable from s, then the algorithm will never terminate.
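A minimal Python sketch of this generic FordSSSP() framework (the edge-list representation and the arbitrary choice of which tense edge to relax next are assumptions; the lecture deliberately leaves the relaxation order unspecified):

```python
import math

def ford_sssp(nodes, edges, s):
    """Generic Ford SSSP: repeatedly relax tense edges until none remain.

    nodes: iterable of node identifiers.
    edges: list of (u, v, w) triples for directed edges u -> v with weight w.
    Assumes no negative cycle is reachable from s (otherwise this loops forever).
    """
    dist = {v: math.inf for v in nodes}
    pred = {v: None for v in nodes}
    dist[s] = 0

    while True:
        # Find any tense edge: dist[u] + w < dist[v].
        tense = next(((u, v, w) for (u, v, w) in edges
                      if dist[u] + w < dist[v]), None)
        if tense is None:
            break                      # no tense edges: dist/pred are final
        u, v, w = tense
        dist[v] = dist[u] + w          # Relax(u -> v)
        pred[v] = u
    return dist, pred
```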
Why is the FordSSSP() algorithm correct?
1. At any moment, for every node v, dist(v) is either ∞ or the length of a walk from s to v.
2. If the graph has no negative cycles, then dist(v) is either ∞ or the length of a simple path from s to v. This implies that, without negative cycles, the algorithm terminates in a finite number of steps.
3. If no edge is tense, then dist(v) is the length of the path s → ··· → pred(pred(v)) → pred(v) → v.
   Furthermore, if v violates this condition but pred(v) does not, then pred(v) → v is tense.
4. If no edge is tense, then s → ··· → pred(pred(v)) → pred(v) → v is indeed the shortest path to v.
   Furthermore, if v violates this condition but pred(v) does not, then pred(v) → v is tense.
Exactly how to find the tense edges, and which edges to relax in what order, is not specified here.
There are several variants, whose efficiency and correctness depend on the structure of the input graph.
We are going to look at 4 instantiations of Ford's algorithm. Although we could use the general FordSSSP() proof of correctness to argue that each instantiation is correct, we will prove correctness in each algorithm's own context for concreteness.
Algo #1: BFS for unweighted graph 5/8

In the simplest special case of SSSP, the edges are unweighted, or equivalently, all edges have the same weight 1. In this case, BFS finds the SSSP solution in O(V + E) time:
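A sketch of this BFS instantiation in Python, assuming an adjacency-list dictionary (the representation and names are illustrative, not from the lecture):

```python
from collections import deque
import math

def bfs_sssp(adj, s):
    """SSSP when every edge has weight 1: plain breadth-first search.

    adj: dict mapping each node to a list of its out-neighbours.
    Runs in O(V + E) time.
    """
    dist = {v: math.inf for v in adj}
    pred = {v: None for v in adj}
    dist[s] = 0
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[u] + 1 < dist[v]:      # edge u -> v is tense
                dist[v] = dist[u] + 1      # relax it
                pred[v] = u
                queue.append(v)
    return dist, pred
```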
Algo #2: DFS for DAG 6/8

SSSP for a Directed Acyclic Graph (DAG) is also easy to compute, even if some edges have negative weights.
Since a DAG has no cycles, SSSP is always well defined.
dist(v) satisfies the obvious recurrence:

    dist(v) = 0                                                 if v = s
    dist(v) = min over edges u → v of ( dist(u) + w(u → v) )    otherwise

In fact this recurrence holds not only for DAGs, but for all graphs! However, only for DAGs is this recurrence computable. (Why? Because if there is a cycle, then the recursion never terminates.)
For a DAG, this recurrence leads directly to dynamic programming: evaluate dist(v) for the nodes in topological order.
In fact, this dynamic programming algorithm can be viewed as an instantiation of Ford's algorithm.
We can also visit all outgoing edges instead of incoming edges, as in the sketch below.
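A sketch of the DAG instantiation in Python, processing nodes in topological order and relaxing each node's outgoing edges (the use of graphlib for the topological sort and the graph representation are assumptions):

```python
import math
from graphlib import TopologicalSorter   # Python 3.9+

def dag_sssp(adj, s):
    """SSSP in a DAG; negative edge weights are allowed.

    adj: dict mapping each node to a list of (neighbour, weight) pairs.
    Relaxes every edge exactly once, so it runs in O(V + E) time.
    """
    # graphlib expects predecessors, so invert the adjacency lists.
    preds = {v: set() for v in adj}
    for u in adj:
        for v, _ in adj[u]:
            preds[v].add(u)
    order = TopologicalSorter(preds).static_order()

    dist = {v: math.inf for v in adj}
    pred = {v: None for v in adj}
    dist[s] = 0
    for u in order:                        # topological order
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:      # tense edge
                dist[v] = dist[u] + w      # relax it
                pred[v] = u
    return dist, pred
```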
Algo #3: Best-First: Dijkstra's 7/8

If we modify BFS by using a priority queue instead of a FIFO queue, we arrive at Dijkstra's algorithm:
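A Python sketch of Dijkstra's algorithm. Since heapq has no DecreaseKey(), this version re-inserts nodes and skips stale heap entries; that is an implementation choice, not part of the lecture's pseudocode:

```python
import heapq
import math

def dijkstra(adj, s):
    """Best-first SSSP: BFS with a priority queue keyed by tentative distance.

    adj: dict mapping each node to a list of (neighbour, weight) pairs.
    With non-negative edge weights each node is settled at most once,
    giving O(E log V) time with a binary heap.
    """
    dist = {v: math.inf for v in adj}
    pred = {v: None for v in adj}
    dist[s] = 0
    pq = [(0, s)]                               # (tentative distance, node)
    while pq:
        d, u = heapq.heappop(pq)                # ExtractMin
        if d > dist[u]:
            continue                            # stale entry, already improved
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:           # edge u -> v is tense
                dist[v] = dist[u] + w           # Relax(u -> v)
                pred[v] = u
                heapq.heappush(pq, (dist[v], v))
    return dist, pred
```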
Dijkstra's algorithm is an instantiation of Ford's, so it correctly finds
SSSP provided that there's no negative cycle.

To analyze performance, we divide into two cases:

Case 1: No negative edges: Dijkstra works great! O(E log V).

Dijkstra's algorithm works like BFS, expanding a wavefront outward from the source.
What is the time complexity? It's not obvious how many times a node can be put into the priority queue.
To analyze performance, we define:
• u_i denotes the node returned by the i-th ExtractMin().
• d_i denotes dist(u_i) immediately after the i-th ExtractMin().
The u_i, i = 1, 2, ..., need not be distinct.
In particular, we have u_1 = s and d_1 = 0.
We would like to claim that each node is extracted by ExtractMin() at most once; therefore Dijkstra's algorithm runs in O(E log V) time, if the priority queue is implemented with a binary heap.

Lemma 1. If G has no negative-weight edges, then for all i < j, we have d_i ≤ d_j.

Proof: Fix an index i; we'd like to show d_i ≤ d_{i+1}. Two cases:

1. If u_i → u_{i+1} is an edge, and this edge is tense during the i-th iteration, then dist(u_{i+1}) = dist(u_i) + w(u_i → u_{i+1}) ≥ dist(u_i), because the edge weight is non-negative.
2. Otherwise, u_{i+1} must already be in the queue during the i-th iteration, and dist(u_{i+1}) ≥ dist(u_i), because ExtractMin retrieves the minimum dist(), which is dist(u_i). □


Lemma 2. If G has no negative-weight edges, then each node of G is extracted from the priority queue at most once.

Proof: By contradiction. Suppose v is extracted more than once. Say v is extracted at the i-th iteration, re-inserted at the j-th iteration, and re-extracted at the k-th iteration, for some indices i < j < k; then v = u_i = u_k.
dist(v) must strictly decrease during the j-th iteration, just before v is re-inserted. Therefore d_k < d_i, which contradicts Lemma 1. □
What does Lemma 2 tell us?

1. Each node is extracted at most once, thus the time complexity of the algorithm is O(E log V).

2. The first time a node is put into the priority queue, its dist(·) is ∞. Later it may get DecreaseKey()'d several times; after it is extracted, its dist(·) never changes again.
Case 2: Negative edges: Dijkstra may be slow!
If there are negative edges, then the same node may be inserted and extracted multiple times!

For the graph above, Dijkstra's algorithm must perform Ω(2^(V/2)) relaxations. The worst case is actually Ω(2^V).
But in practice, Dijkstra's algorithm seems to run fast even with negative edges.
Algo #4: Bellman-Ford: Relax all edges 8/8

The most general variant, and the simplest to describe: in each iteration of the main loop, relax every edge.
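A minimal Python sketch of this idea, assuming an edge-list representation: relax all edges, round after round, until a full round finds no tense edge.

```python
import math

def bellman_ford(nodes, edges, s):
    """Bellman-Ford: repeatedly relax *all* edges until none are tense.

    nodes: iterable of node identifiers.
    edges: list of (u, v, w) triples for directed edges u -> v with weight w.
    Assumes no negative cycle is reachable from s (otherwise it never stops).
    """
    dist = {v: math.inf for v in nodes}
    pred = {v: None for v in nodes}
    dist[s] = 0
    changed = True
    while changed:                          # one pass = one iteration of the main loop
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:       # tense edge
                dist[v] = dist[u] + w       # relax it
                pred[v] = u
                changed = True
    return dist, pred
```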

What's the time complexity?

Define: for a node v and an integer i, let dist≤i(v) denote the length of the shortest walk in G from s to v consisting of at most i edges. In particular, dist≤0(s) = 0 and dist≤0(v) = ∞ for v ≠ s.

Lemma 3. For every node v and every i ≥ 0, after i iterations of the main loop in BellmanFord(), dist(v) ≤ dist≤i(v).

Proof: Induction on i. Suppose dist(u) ≤ dist≤(i−1)(u) for all u after the (i−1)-th iteration. We prove dist(v) ≤ dist≤i(v) for all v after the i-th iteration.
Suppose W is a shortest walk from s to v with at most i edges, and let u → v be the last edge of W.
During the i-th iteration of BellmanFord(), when we consider the edge u → v, we make sure that dist(v) ≤ dist(u) + w(u → v) ≤ dist≤(i−1)(u) + w(u → v) ≤ dist≤i(v). □
What does this lemma tell us?

1. If the graph has no negative cycles, the shortest walk from s to any node is a simple path, with at most V − 1 edges. Therefore BellmanFord() must terminate within V − 1 iterations.

2. Conversely, if any edge is still tense after V − 1 iterations, then the input graph has a negative cycle! We get a simple negative-cycle detector built into BellmanFord().

Therefore we can rewrite the loop with a fixed bound of V − 1 iterations, followed by a negative-cycle check:
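A Python sketch of this rewritten loop (the exception used to report the negative cycle is an illustrative choice, not from the lecture):

```python
import math

def bellman_ford_detect(nodes, edges, s):
    """Bellman-Ford with the main loop bounded by V - 1 passes.

    If any edge is still tense afterwards, a negative cycle is
    reachable from s. Runs in O(VE) time either way.
    """
    nodes = list(nodes)
    dist = {v: math.inf for v in nodes}
    pred = {v: None for v in nodes}
    dist[s] = 0
    for _ in range(len(nodes) - 1):          # V - 1 passes suffice
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                pred[v] = u
    for u, v, w in edges:                    # one extra pass: anything still tense?
        if dist[u] + w < dist[v]:
            raise ValueError("negative cycle reachable from the source")
    return dist, pred
```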


The time complexity is O(VE), regardless of the presence of negative edges or negative cycles.

For graphs with no negative edges, Dijkstra's algorithm is faster.


Another way to derive the Bellman-Ford algorithm is through dynamic programming. First we have the recurrence (we've seen this in the DAG DFS solution):

    dist(v) = 0                                                 if v = s
    dist(v) = min over edges u → v of ( dist(u) + w(u → v) )    otherwise

However, for a graph that is not a DAG, this recursion might not terminate!

To make it terminate, we add a restriction to the definition of dist():

Let dist≤i(v) denote the length of the shortest walk from s to v consisting of at most i edges. Then we have the recurrence:

    dist≤0(s) = 0,  dist≤0(v) = ∞ for v ≠ s
    dist≤i(v) = min( dist≤(i−1)(v),  min over edges u → v of ( dist≤(i−1)(u) + w(u → v) ) )
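A Python sketch of this dynamic-programming formulation, tabulating dist≤i(v) bottom-up (the table layout is an assumption; collapsing the table to a single row per node recovers the in-place version above):

```python
import math

def bellman_ford_dp(nodes, edges, s):
    """Tabulate dist<=i(v) for i = 0 .. V-1 via the recurrence above."""
    nodes = list(nodes)
    V = len(nodes)
    # dist_le[i][v] = length of the shortest walk s -> v using at most i edges.
    dist_le = [{v: math.inf for v in nodes} for _ in range(V)]
    dist_le[0][s] = 0
    for i in range(1, V):
        for v in nodes:
            dist_le[i][v] = dist_le[i - 1][v]                    # reuse a shorter walk
        for u, v, w in edges:
            dist_le[i][v] = min(dist_le[i][v], dist_le[i - 1][u] + w)
    return dist_le[V - 1]                                        # dist with at most V-1 edges
```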
