Lecture 17, March 8
Lecture 17: Path Algebra
Matrix multiplication of the adjacency matrices of directed
graphs gives important information about the graphs.
Manipulating these matrices to study graphs is called path
algebra.
With "path algebra" we can solve the following problems:
Compute the total number of paths between all pairs
of vertices in a directed acyclic graph;
Solve the "all-pairs-shortest-paths" problem in a
weighted directed graph with no negative cycles;
Compute the transitive closure of a directed graph.
Think: what is the meaning of M^2, where M is the
adjacency matrix of a graph G?
All paths of length r
Claim. M^r describes paths of length r in G, with entry
(i, j) giving the number of distinct r-step paths from i to j;
here M is the adjacency matrix of G.
Proof. The base case is r = 0: M^0 = I, and the only
length-0 paths are the trivial paths from each vertex to
itself. For the induction step, assume the claim is true for
r - 1; we prove it for r. We have
(M^r)_ij = (M · M^(r-1))_ij = Σ_{1 ≤ k ≤ n} M_ik · (M^(r-1))_kj.
Now any r-step path from i to j must start with a step
to some intermediate vertex k. If the edge (i, k) exists,
M_ik is 1; otherwise it is 0. So adding up M_ik · (M^(r-1))_kj
over all k gives the number of r-step paths from i to j.
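To make the claim concrete, here is a quick numerical check in Python (the four-vertex diamond DAG is my own example, not from the lecture; numpy is assumed available):

```python
import numpy as np

# Diamond DAG: edges 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
M = np.array([[0, 1, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])

# (M^2)[i][j] = number of distinct length-2 paths from i to j
M2 = M @ M
print(M2[0][3])  # 2: the paths 0 -> 1 -> 3 and 0 -> 2 -> 3

# The longest path in this DAG has 2 edges, so M^3 is the zero matrix
M3 = M2 @ M
print(M3.sum())  # 0
```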
Computing the matrix powers
Suppose G is acyclic (so no path has more than n - 1 edges).
Consider I + M + M^2 + ... + M^(n-1): the (i, j) entry gives
the total number of distinct paths from i to j.
Therefore, in O(n^(ω+1)) steps, we can compute the total
number of distinct paths between all pairs of vertices.
Here ω denotes the best-known exponent for matrix
multiplication; currently ω ≈ 2.376.
We can do even better by first calculating M^2, M^4, ...,
M^(2^(k-1)) via repeated squaring, and then calculating
(I + M)(I + M^2)(I + M^4) ... (I + M^(2^(k-1))) =
I + M + M^2 + ... + M^(2^k - 1), where 2^k is the least power
of 2 that is ≥ n. This gives an algorithm that runs in
O(n^ω log n) time, where n is the number of vertices.
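As a sanity check, here is a sketch of that telescoping computation in Python (the diamond DAG and the function name are my own illustration):

```python
import numpy as np

def count_all_paths(M):
    """Total number of distinct paths (including the length-0 paths
    contributed by the identity term) between all pairs of vertices
    of a DAG, via the telescoping product (I + M)(I + M^2)(I + M^4)..."""
    n = M.shape[0]
    result = np.eye(n, dtype=np.int64)
    power = M.astype(np.int64)          # holds M^(2^j)
    length = 1
    while length < n:                   # stop once 2^k >= n
        result = result @ (np.eye(n, dtype=np.int64) + power)
        power = power @ power           # repeated squaring
        length *= 2
    return result

# Diamond DAG: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
M = np.array([[0, 1, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]])
print(count_all_paths(M)[0][3])  # 2 distinct paths from 0 to 3
```

For n = 4 the loop multiplies by (I + M) and then (I + M^2), producing I + M + M^2 + M^3 with only two squarings and two products.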
Reachability graph
Given an unweighted directed graph G = (V, E), we want
to form the graph G' that has an edge from u to v if and
only if there exists a path (of any length) in G from u to v.
Let's first see how to solve it using what we know
from, say, CS 240. There, we explored depth-first and
breadth-first search algorithms, which can find all
vertices reachable from a given vertex in O(|V| + |E|)
time. So if we run depth-first search from every vertex,
the total time is O(n(|V| + |E|)), which could be as bad
as O(n^3) if the graph is dense. Can we do better
than O(n^3)?
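A sketch of that baseline in Python (graph, function name, and the choice of an iterative DFS are my own illustration):

```python
def reachability(n, edges):
    """All-pairs reachability by DFS from every vertex: each search
    costs O(|V| + |E|), so the total is O(n(|V| + |E|))."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
    reach = [set() for _ in range(n)]
    for s in range(n):
        stack = [s]            # iterative depth-first search from s;
        while stack:           # s reaches itself via the length-0 path
            u = stack.pop()
            if u not in reach[s]:
                reach[s].add(u)
                stack.extend(adj[u])
    return reach

r = reachability(4, [(0, 1), (1, 2), (3, 3)])
print(sorted(r[0]))  # [0, 1, 2]: the vertices reachable from 0
```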
Transitive closure
Back to path algebra. Now our matrix M consists of
1's and 0's. How can we find the matrix with a 1 in
row i and column j iff there is a length-2 path from
vertex i to vertex j? I claim the entry in row i and
column j should be
OR_{1 ≤ k ≤ n} (M_ik AND M_kj).
That is: we just need to use Boolean multiplication
and addition in our matrix computation.
So the transitive closure of G is given by
M' = I + M + M^2 + ... + M^(n-1),
where + and · are the corresponding Boolean operations.
It tells us whether there is a path between any two vertices.
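A small check of the Boolean product in Python (the three-vertex graph and the helper name are my own; the Boolean product is emulated by an integer matrix product followed by thresholding):

```python
import numpy as np

def bool_matmul(A, B):
    # Entry (i, j) is OR over k of (A[i][k] AND B[k][j])
    return (A.astype(int) @ B.astype(int)) > 0

# Path graph: 0 -> 1 -> 2
M = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]], dtype=bool)

M2 = bool_matmul(M, M)
print(bool(M2[0][2]))  # True: the length-2 path 0 -> 1 -> 2

# M' = I + M + M^2 with Boolean operations (here n = 3)
n = M.shape[0]
closure = np.eye(n, dtype=bool)
power = np.eye(n, dtype=bool)
for _ in range(n - 1):
    power = bool_matmul(power, M)
    closure = closure | power
print(bool(closure[0][2]), bool(closure[2][0]))  # True False
```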
Transitive closure, continued
How fast can we compute
I + M + M^2 + ... + M^(n-1)?
We can multiply two Boolean matrices in O(n^ω)
steps. Since, using the "doubling trick", we only have
to do O(log n) Boolean matrix multiplications, this
gives a total cost of O(n^ω log n) to solve the
transitive closure problem. This is indeed
better than simply running breadth-first or
depth-first search from each vertex.
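One way the doubling trick can be written (a sketch; the Boolean product is emulated via integer matrix multiplication, and the function name and chain graph are my own):

```python
import numpy as np

def transitive_closure(M):
    """Doubling trick: let A = I OR M, then square A repeatedly.
    After j squarings, A has a 1 at (u, v) iff there is a path of
    length <= 2^j from u to v, so O(log n) Boolean products suffice."""
    n = M.shape[0]
    A = np.eye(n, dtype=bool) | M.astype(bool)
    length = 1
    while length < n - 1:        # paths never need more than n-1 edges
        A = (A.astype(int) @ A.astype(int)) > 0  # one Boolean product
        length *= 2
    return A

# Chain: 0 -> 1 -> 2 -> 3
M = np.zeros((4, 4), dtype=bool)
for u in range(3):
    M[u][u + 1] = True
C = transitive_closure(M)
print(bool(C[0][3]), bool(C[3][0]))  # True False
```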
Think: what if n is not a power of 2?