A B-tree of order m is a multiway search tree of order m such that:
1. All leaves are on the bottom level.
2. All internal nodes (except perhaps the root node) have at least ceil(m / 2) (nonempty) children.
3. The root node can have as few as 2 children if it is an internal node, and can obviously have no children if the root node is a leaf (that is, the whole tree consists only of the root node).
4. Each leaf node (other than the root node if it is a leaf) must contain at least ceil(m / 2) - 1 keys.
Note that ceil(x) is the so-called ceiling function. Its value is the smallest integer that is greater than or equal to x. Thus ceil(3) = 3, ceil(3.35) = 4, ceil(1.98) = 2, ceil(5.01) = 6, ceil(7) = 7, etc.
A B-tree is a fairly well-balanced tree by virtue of the fact that all leaf nodes must be at the bottom. Condition (2) tries to keep the tree fairly bushy by insisting that each node have at least half the maximum number of children. This causes the tree to "fan out" so that the path from root to leaf is very short even in a tree that contains a lot of data.
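To make the node structure concrete, here is a minimal Java sketch of what a B-tree node of order m might look like; the class and field names are invented for illustration and are not from these notes.

    // Hypothetical sketch of a B-tree node of order m (illustrative names only).
    class BTreeNode {
        int order;             // maximum number of children (m)
        int numKeys;           // current number of keys stored
        char[] keys;           // up to m - 1 keys, kept in sorted order
        BTreeNode[] children;  // up to m children; unused in a leaf
        boolean isLeaf;

        BTreeNode(int order, boolean isLeaf) {
            this.order = order;
            this.isLeaf = isLeaf;
            this.keys = new char[order - 1];
            this.children = new BTreeNode[order];
        }

        // Minimum number of keys required in a non-root node: ceil(m/2) - 1.
        int minKeys() {
            return (order + 1) / 2 - 1;   // (m + 1) / 2 equals ceil(m / 2) for integer m
        }
    }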
Let's work our way through an example similar to that given by
Kruse. Insert the following letters into what is originally an empty
B-tree of order 5: C N G A H E K Q M F W L T Z D P R X Y S
Order 5 means that a node can have a maximum of 5 children
and 4 keys. All nodes other than the root must have a minimum of
2 keys. The first 4 letters get inserted into the same node,
resulting in this picture:
When we try to insert the H, we find no room in this node, so we
split it into 2 nodes, moving the median item G up into a new root
node. Note that in practice we just leave the A and C in the current
node and place the H and N into a new node to the right of the old
one.
Inserting E, K, and Q proceeds without requiring any splits:
Inserting M requires a split. Note that M happens to be the median key and so is moved up into
the parent node.
The letters F, W, L, and T are then added without needing any
split.
When Z is added, the rightmost leaf must be split. The median item T is moved
up into the parent node. Note that by moving up the median key, the tree is kept
fairly balanced, with 2 keys in each of the resulting nodes.
The insertion of D causes the leftmost leaf to be split. D happens to be the
median key and so is the one moved up into the parent node. The letters P, R,
X, and Y are then added without any need of splitting:
Finally, when S is added, the node with N, P, Q, and R splits,
sending the median Q up to the parent. However, the parent node
is full, so it splits, sending the median M up to form a new root
node. Note how the 3 pointers from the old parent node stay in the
revised node that contains D and G.
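The split step used repeatedly above can be sketched as follows. This is a simplified, hypothetical helper for an order-5 leaf: it takes the four existing keys plus the new one (already sorted), keeps the two smallest in the old node, puts the two largest in a new node, and hands the median back to be pushed into the parent. The main method reproduces the very first split in the example (inserting H).

    import java.util.Arrays;

    // Simplified sketch of splitting a full order-5 leaf during insertion.
    class LeafSplitDemo {
        // fiveSortedKeys holds the 4 existing keys plus the new one, in order.
        // The left node keeps keys[0..1], the right node gets keys[3..4],
        // and keys[2] (the median) is returned so the caller can push it up.
        static char split(char[] fiveSortedKeys, char[] leftOut, char[] rightOut) {
            System.arraycopy(fiveSortedKeys, 0, leftOut, 0, 2);
            System.arraycopy(fiveSortedKeys, 3, rightOut, 0, 2);
            return fiveSortedKeys[2];
        }

        public static void main(String[] args) {
            // Inserting H into the node containing A C G N: sort the five keys first.
            char[] keys = {'A', 'C', 'G', 'H', 'N'};
            char[] left = new char[2], right = new char[2];
            char median = split(keys, left, right);
            System.out.println(Arrays.toString(left) + " [" + median + "] " + Arrays.toString(right));
            // Prints: [A, C] [G] [H, N] -- matching the first split in the example above.
        }
    }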
Deleting an Item
In the B-tree as we left it at the end of the last section, delete H. Of course, we first do a
lookup to find H. Since H is in a leaf and the leaf has more than the minimum number of
keys, this is easy. We move the K over where the H had been and the L over where the K
had been. This gives
Next, delete the T. Since T is not in a leaf, we find its successor (the next item in ascending
order), which happens to be W, and move W up to replace the T. That way, what we really have
to do is to delete W from the leaf, which we already know how to do, since this leaf has extra
keys. In all cases, we reduce a deletion to a deletion in a leaf by using this method.
Next, delete R. Although R is in a leaf, this leaf does not have an extra key; the deletion results in a
node with only one key, which is not acceptable for a B-tree of order 5. If the sibling node to the
immediate left or right has an extra key, we can then borrow a key from the parent and move a key up
from this sibling. In our specific case, the sibling to the right has an extra key. So, the successor W of S
(the last key in the node where the deletion occurred), is moved down from the parent, and the X is
moved up. (Of course, the S is moved over so that the W can be inserted in its proper place.)
Finally, let's delete E. This one causes lots of problems. Although E is in a leaf, the leaf has no extra
keys, nor do the siblings to the immediate right or left. In such a case the leaf has to be combined
with one of these two siblings. This includes moving down the parent's key that was between those
of these two leaves. In our example, let's combine the leaf containing F with the leaf containing A C.
We also move down the D.
We begin by finding the immediate successor, which would be D,
and move the D up to replace the C. However, this leaves us with
a node with too few keys.
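As a rough sketch of the borrowing (rotation) step used when R was deleted above: the separating key comes down from the parent and the right sibling's smallest key goes up. It reuses the hypothetical BTreeNode sketch from earlier; the method name and bookkeeping are illustrative only, not code from the notes.

    // Hypothetical sketch of borrowing from a right sibling during B-tree deletion.
    class BTreeDeleteDemo {
        // 'leaf' is the underflowing node, 'right' its right sibling, and
        // parent.keys[sepIndex] the key that separates them.
        static void borrowFromRight(BTreeNode leaf, BTreeNode right, BTreeNode parent, int sepIndex) {
            // Move the separator down into the underflowing leaf; every key already
            // in the leaf is smaller than the separator, so appending keeps it sorted.
            leaf.keys[leaf.numKeys++] = parent.keys[sepIndex];
            // Move the sibling's smallest key up to replace the separator in the parent.
            parent.keys[sepIndex] = right.keys[0];
            // Shift the sibling's remaining keys left to fill the gap.
            System.arraycopy(right.keys, 1, right.keys, 0, right.numKeys - 1);
            right.numKeys--;
        }
    }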
Topological Sort
• Introduction.
• Definition of Topological Sort.
• Topological Sort is Not Unique.
• Topological Sort Algorithm.
• An Example.
• Implementation.
• Review Questions.
Introduction
• There are many problems involving a set of tasks in which
some of the tasks must be done before others.
• For example, consider the problem of taking a course only
after taking its prerequisites.
• Is there any systematic way of linearly arranging the courses
in the order that they should be taken?
Yes! - Topological sort.
Definition of Topological Sort
• Topological sort is a method of arranging the vertices in a directed acyclic
graph (DAG) as a sequence, such that no vertex appears in the sequence
before its predecessor.
• The graph in (a) can be topologically sorted as in (b)
[Figure: (a) a sample DAG; (b) a topological ordering of its vertices.]
Topological Sort is not unique
• Topological sort is not unique.
• The following are all topological sorts of the graph below:
s1 = {a, b, c, d, e, f, g, h, i}
s2 = {a, c, b, f, e, d, h, g, i}
s3 = {a, b, d, c, e, g, f, h, i}
s4 = {a, c, f, b, e, h, d, g, i}
etc.
Topological Sort Algorithm
• One way to find a topological sort is to consider in-degrees of the vertices.
• The first vertex must have in-degree zero -- every DAG must have at least one vertex with in-degree zero.
• The topological sort algorithm is:

    int topologicalOrderTraversal( ){
        int numVisitedVertices = 0;
        while(there are more vertices to be visited){
            if(there is no vertex with in-degree 0)
                break;
            else{
                select a vertex v that has in-degree 0;
                visit v;
                numVisitedVertices++;
                delete v and all its emanating edges;
            }
        }
        return numVisitedVertices;
    }
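Here is a self-contained sketch of this algorithm in Java, using plain integer adjacency lists and an ordinary queue of in-degree-zero vertices; the toy DAG in main is an assumed example, not the graph from the slides.

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Queue;

    // Minimal sketch of the topological sort algorithm above, on int adjacency lists.
    public class TopoSortSketch {
        static List<Integer> topologicalOrder(List<List<Integer>> adj) {
            int n = adj.size();
            int[] inDegree = new int[n];
            for (List<Integer> successors : adj)
                for (int v : successors) inDegree[v]++;

            Queue<Integer> ready = new ArrayDeque<>();        // vertices with in-degree 0
            for (int v = 0; v < n; v++)
                if (inDegree[v] == 0) ready.add(v);

            List<Integer> order = new ArrayList<>();
            while (!ready.isEmpty()) {
                int u = ready.remove();                        // select a vertex with in-degree 0
                order.add(u);                                  // visit it
                for (int v : adj.get(u))                       // "delete" its emanating edges
                    if (--inDegree[v] == 0) ready.add(v);
            }
            return order;   // shorter than n if the graph has a cycle
        }

        public static void main(String[] args) {
            // Assumed toy DAG: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
            List<List<Integer>> adj = new ArrayList<>();
            for (int i = 0; i < 4; i++) adj.add(new ArrayList<>());
            adj.get(0).add(1); adj.get(0).add(2);
            adj.get(1).add(3); adj.get(2).add(3);
            System.out.println(topologicalOrder(adj));         // e.g. [0, 1, 2, 3]
        }
    }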
Topological Sort Example
• Demonstrating Topological Sort.
[Figure: a DAG on vertices A through J, each labeled with its in-degree, together with the vertices listed in one valid topological order.]
Implementation of Topological Sort
• The algorithm is implemented as a traversal method that visits the
vertices in a topological sort order.
• An array of length |V| is used to record the in-degrees of the vertices.
Hence no need to remove vertices or edges.
• A priority queue is used to keep track of vertices with in-degree zero that
are not yet visited.
    public int topologicalOrderTraversal(Visitor visitor){
        int numVerticesVisited = 0;
        int[] inDegree = new int[numberOfVertices];
        for(int i = 0; i < numberOfVertices; i++)
            inDegree[i] = 0;
        Iterator p = getEdges();
        while (p.hasNext()) {
            Edge edge = (Edge) p.next();
            Vertex to = edge.getToVertex();
            inDegree[getIndex(to)]++;
        }
        BinaryHeap queue = new BinaryHeap(numberOfVertices);
        p = getVertices();
        while(p.hasNext()){
            Vertex v = (Vertex) p.next();
            if(inDegree[getIndex(v)] == 0)
                queue.enqueue(v);
        }
        while(!queue.isEmpty() && !visitor.isDone()){
            Vertex v = (Vertex) queue.dequeueMin();
            visitor.visit(v);
            numVerticesVisited++;
            p = v.getSuccessors();
            while (p.hasNext()){
                Vertex to = (Vertex) p.next();
                if(--inDegree[getIndex(to)] == 0)
                    queue.enqueue(to);
            }
        }
        return numVerticesVisited;
    }
Review Questions
1. List the order in which the nodes of the directed graph GB are visited by
topological order traversal that starts from vertex a.
2. What kind of DAG has a unique topological sort?
3. Generate a directed graph using the required courses for your major. Now
apply topological sort on the directed graph you obtained.
What is a Graph?
• A graph G = (V,E) is composed of:
V: set of vertices
E: set of edges connecting the vertices in V
• An edge e = (u,v) is a pair of vertices
• Example:
V = {a, b, c, d, e}
E = {(a,b), (a,c), (a,d), (b,e), (c,d), (c,e), (d,e)}
[Figure: the graph drawn with these five vertices and seven edges.]
Applications
• electronic circuits
• networks (roads, flights, communications)
[Figure: a flight network connecting the airports JFK, LAX, HNL, STL, DFW, and FTL.]
Terminology:
Adjacent and Incident
• If (v0, v1) is an edge in an undirected graph,
– v0 and v1 are adjacent
– The edge (v0, v1) is incident on vertices v0 and v1
• If <v0, v1> is an edge in a directed graph
– v0 is adjacent to v1, and v1 is adjacent from v0
– The edge <v0, v1> is incident on v0 and v1
Terminology:
Degree of a Vertex
The degree of a vertex is the number of edges
incident to that vertex
For a directed graph,
the in-degree of a vertex v is the number of edges that have v as the head;
the out-degree of a vertex v is the number of edges that have v as the tail.
If d_i is the degree of vertex i in a graph G with n vertices and e edges, then the number of edges is
e = (d_0 + d_1 + ... + d_(n-1)) / 2
Why? Since adjacent vertices each count the adjoining edge, it will be counted twice.
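A quick check of this formula in Java, using the small example graph introduced earlier (V = {a, b, c, d, e} with seven edges):

    // Verifying e = (sum of degrees) / 2 on the example graph with
    // V = {a, b, c, d, e} and E = {(a,b),(a,c),(a,d),(b,e),(c,d),(c,e),(d,e)}.
    public class DegreeCheck {
        public static void main(String[] args) {
            char[][] edges = {
                {'a','b'}, {'a','c'}, {'a','d'}, {'b','e'}, {'c','d'}, {'c','e'}, {'d','e'}
            };
            int[] degree = new int[5];                  // indices 0..4 stand for a..e
            for (char[] e : edges) {
                degree[e[0] - 'a']++;
                degree[e[1] - 'a']++;
            }
            int sum = 0;
            for (int d : degree) sum += d;
            System.out.println("sum of degrees = " + sum + ", edges = " + sum / 2);
            // Prints: sum of degrees = 14, edges = 7
        }
    }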
Examples
[Figure: degree examples for three graphs G1, G2, and G3. For the directed graph G3, vertex 0 has in-degree 1 and out-degree 1, vertex 1 has in-degree 1 and out-degree 2, and vertex 2 has in-degree 1 and out-degree 0.]
Terminology: Path
• path: a sequence of vertices v1, v2, ..., vk such that consecutive vertices vi and vi+1 are adjacent.
[Figure: an undirected graph on vertices a, b, c, d, e; for example, a b e d c and b e d c are paths.]
More Terminology
• simple path: no repeated vertices (for example, b e c in the figure).
• cycle: a simple path, except that the last vertex is the same as the first vertex (for example, a c d a).
Even More Terminology
• connected graph: any two vertices are connected by some path.
• subgraph: a subset of vertices and edges forming a graph.
• connected component: a maximal connected subgraph. E.g., the graph below has 3 connected components.
[Figure: examples of a connected graph, a graph that is not connected, and a graph with 3 connected components.]
Subgraphs Examples
[Figure: (a) some of the subgraphs of G1; (b) some of the subgraphs of G3.]
More…
• tree - a connected graph without cycles
• forest - a collection of trees
[Figure: several trees, which together form a forest.]
Directed vs. Undirected Graph
• An undirected graph is one in which the pair of vertices in an edge is unordered: (v0, v1) = (v1, v0).
• A directed graph is one in which each edge is a directed pair of vertices: <v0, v1> != <v1, v0>. In the edge <v0, v1>, v0 is the tail and v1 is the head.
Graph Representations
Adjacency Matrix
Adjacency Lists
Adjacency Matrix
Let G=(V,E) be a graph with n vertices.
The adjacency matrix of G is a two-dimensional
n by n array, say adj_mat
If the edge (vi, vj) is in E(G), adj_mat[i][j]=1
If there is no such edge in E(G), adj_mat[i][j]=0
The adjacency matrix for an undirected graph is
symmetric; the adjacency matrix for a digraph
need not be symmetric
Examples for Adjacency Matrix
[Figure: adjacency matrices for the example graphs G1, G2, G3, and G4. The matrix of an undirected graph is symmetric, so only about n^2/2 of its entries carry independent information; a directed graph may need all n^2 entries.]
For comparison, an adjacency-list representation of an undirected graph with n vertices and e edges requires n head nodes and 2e list nodes.
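To make the two representations concrete, here is an illustrative Java snippet (not from the notes) that builds both an adjacency matrix and adjacency lists for the five-vertex example graph introduced earlier:

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Adjacency matrix and adjacency lists for the undirected graph with
    // V = {a, b, c, d, e} and E = {(a,b),(a,c),(a,d),(b,e),(c,d),(c,e),(d,e)}.
    public class GraphRepresentations {
        public static void main(String[] args) {
            int n = 5;
            char[][] edges = {
                {'a','b'}, {'a','c'}, {'a','d'}, {'b','e'}, {'c','d'}, {'c','e'}, {'d','e'}
            };

            int[][] adjMat = new int[n][n];
            List<List<Integer>> adjList = new ArrayList<>();
            for (int i = 0; i < n; i++) adjList.add(new ArrayList<>());

            for (char[] e : edges) {
                int u = e[0] - 'a', v = e[1] - 'a';
                adjMat[u][v] = 1;          // undirected: set both directions,
                adjMat[v][u] = 1;          // so the matrix comes out symmetric
                adjList.get(u).add(v);     // each edge appears on two lists,
                adjList.get(v).add(u);     // hence 2e list nodes in total
            }

            System.out.println("row for a:      " + Arrays.toString(adjMat[0]));
            System.out.println("neighbors of a: " + adjList.get(0));
        }
    }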
DFS: Depth-First Search
• DFS is another popular search strategy.
• It can do certain things that BFS cannot do. We will discuss some of these algorithms in COMP 271 (so you cannot get rid of DFS after COMP171).
• DFS idea: Whenever we visit a vertex v from another vertex u, we recursively visit a neighbor of v that has not been visited before, until all neighbors of v have been visited. Then we backtrack (return) to u.
Depth-First Search Algorithm
• Flag all vertices as not visited.
• Visit v, and mark v as visited.
• For each unvisited neighbor w of v, make a recursive call RDFS(w).
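A minimal Java sketch of RDFS as described above, keeping a visited table and a Pred array like the ones used in the trace that follows; the graph representation and the toy graph in main are assumptions for illustration.

    import java.util.Arrays;
    import java.util.List;

    // Recursive depth-first search following the RDFS outline above.
    public class DepthFirstSearch {
        static boolean[] visited;
        static int[] pred;                    // predecessor of each vertex, -1 if none

        static void dfsFrom(List<List<Integer>> adj, int source) {
            int n = adj.size();
            visited = new boolean[n];         // flag all vertices as not visited
            pred = new int[n];
            Arrays.fill(pred, -1);
            rdfs(adj, source);
        }

        static void rdfs(List<List<Integer>> adj, int v) {
            visited[v] = true;                // visit v, and mark v as visited
            System.out.print(v + " ");
            for (int w : adj.get(v)) {        // for each unvisited neighbor w,
                if (!visited[w]) {            // make a recursive call RDFS(w)
                    pred[w] = v;
                    rdfs(adj, w);
                }
            }                                 // falling out of the loop = backtrack to caller
        }

        public static void main(String[] args) {
            // Assumed toy undirected graph: 0-1, 0-2, 1-2 (both directions listed).
            List<List<Integer>> adj = List.of(List.of(1, 2), List.of(0, 2), List.of(0, 1));
            dfsFrom(adj, 0);                  // prints: 0 1 2
        }
    }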
Depth-First Search: Example
[Figure, repeated at each step of the trace: an adjacency-list graph on vertices 0 through 9 with vertex 2 as the source, plus a Visited table (T/F) and a Pred entry per vertex.]
Initialize the visited table (all F) and initialize every Pred entry to -1.
RDFS(2): mark 2 as visited; visit sequence = {2}; recursive call RDFS(8).
RDFS(8): mark 8 as visited, Pred[8] = 2; visit sequence = {2, 8}; recursive call RDFS(0).
RDFS(0): mark 0 as visited, Pred[0] = 8; visit sequence = {2, 8, 0}; no unvisited neighbor, so return to (backtrack to) RDFS(8).
Back in RDFS(8): recursive call RDFS(9).
RDFS(9): mark 9 as visited, Pred[9] = 8; visit sequence = {2, 8, 0, 9}; recursive call RDFS(1).
RDFS(1): mark 1 as visited, Pred[1] = 9; visit sequence = {2, 8, 0, 9, 1}; recursive call RDFS(3).
RDFS(3): mark 3 as visited, Pred[3] = 1; visit sequence = {2, 8, 0, 9, 1, 3}; recursive call RDFS(4).
RDFS(4): mark 4 as visited, Pred[4] = 3; visit sequence = {2, 8, 0, 9, 1, 3, 4}; STOP -- all of 4's neighbors have been visited, so backtrack (return) to RDFS(3).
Back in RDFS(3): recursive call RDFS(5).
RDFS(5): mark 5 as visited, Pred[5] = 3; visit sequence = {2, 8, 0, 9, 1, 3, 4, 5}; 3 is already visited, so recursive call RDFS(6).
RDFS(6): mark 6 as visited, Pred[6] = 5; visit sequence = {2, 8, 0, 9, 1, 3, 4, 5, 6}; recursive call RDFS(7).
RDFS(7): mark 7 as visited, Pred[7] = 6; visit sequence = {2, 8, 0, 9, 1, 3, 4, 5, 6, 7}; no recursive call.
The recursion then unwinds with no further calls: RDFS(6), RDFS(5), RDFS(3), RDFS(1), RDFS(9), RDFS(8), and finally RDFS(2) each return with no unvisited neighbors left.
Final visit sequence = {2, 8, 0, 9, 1, 3, 4, 5, 6, 7}.
Recover a path
Using the Pred values recorded during the traversal, we can recover the path from the source 2 to any visited vertex by following predecessors backwards. Try some examples:
Path(0) ->
Path(6) ->
Path(7) ->
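Following the Pred pointers backwards and reversing the result gives the path. A small sketch, using the predecessor values recorded in the trace above (the class and method names are illustrative):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    // Recovering the path from the DFS source to a vertex v by following Pred pointers.
    public class PathRecovery {
        static List<Integer> path(int[] pred, int v) {
            List<Integer> p = new ArrayList<>();
            for (int u = v; u != -1; u = pred[u])   // walk back until the source (pred == -1)
                p.add(u);
            Collections.reverse(p);                 // reverse so the path reads source -> ... -> v
            return p;
        }

        public static void main(String[] args) {
            // Predecessors as recorded during the trace above (source = 2, so pred[2] = -1).
            int[] pred = {8, 9, -1, 1, 3, 3, 5, 6, 2, 8};
            System.out.println(path(pred, 0));      // [2, 8, 0]
            System.out.println(path(pred, 7));      // [2, 8, 9, 1, 3, 5, 6, 7]
        }
    }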
DFS Tree
• The edges that we traverse during DFS (or the edges that we backtrack along) form a tree. We usually call the rooted version (rooted at the source) the DFS tree.
Minimum Spanning Trees
Problem: Laying Telephone Wire
[Figure: a central office and a set of customers to be wired together.]
Wiring: Naïve Approach -- Expensive!
Wiring: Better Approach -- Minimize the total length of wire connecting the customers.
Minimum Spanning Tree (MST)
(see Weiss, Section 24.2.2)
A minimum spanning tree is a subgraph of an undirected weighted graph G, such that
• it is a tree (i.e., it is acyclic)
• it covers all the vertices V
• it contains |V| - 1 edges
• the total cost associated with tree edges is the minimum among all possible spanning trees
• it is not necessarily unique
Spanning Tree
• Definition
– A spanning tree of a graph G is a tree (acyclic) that connects all the vertices of G once
• i.e., the tree "spans" every vertex in G
– A Minimum Spanning Tree (MST) is a spanning tree on a weighted graph that has the minimum total weight:
w(T) = sum of w(u,v) over all edges (u,v) in T, such that w(T) is minimum
Where might this be useful? It can also be used to approximate some NP-Complete problems.
Sample MST
• Which links do we pick to make this an MST?
[Figure: a weighted graph on vertices a through h with edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15; this graph is reused in the Kruskal and Prim examples below.]
Optimal substructure: A subtree of the MST must in turn be an MST of the nodes that it spans. We will use this idea more in dynamic programming.
Kruskal's MST Algorithm
• Idea:
– Go through the list of edges and grow a forest that becomes an MST
– Sort the edges by weight
– Edges with the smallest weights are examined, and possibly added to the MST, before edges with higher weights
– Edges added must be "safe edges" that do not ruin the tree property.
Kruskal's Example
[Figure, shown at each step: the sample weighted graph on vertices a through h from above.]
• A = { }. Make each element its own set: {a} {b} {c} {d} {e} {f} {g} {h}.
• Sort the edges.
• Look at the smallest edge first: {c} and {f} are not in the same set, so add the edge to A and union the sets together.
• Now we have {a} {b} {c f} {d} {e} {g} {h}.
Keep going, checking the next smallest edge each time:
• Had {a} {b} {c f} {d} {e} {g} {h}: {e} <> {h}, add the edge. Now {a} {b} {c f} {d} {e h} {g}.
• Had {a} {b} {c f} {d} {e h} {g}: {a} <> {c f}, add the edge. Now {b} {a c f} {d} {e h} {g}.
• Had {b} {a c f} {d} {e h} {g}: {b} <> {a c f}, add the edge. Now {a b c f} {d} {e h} {g}.
• Had {a b c f} {d} {e h} {g}: {a b c f} = {a b c f}, don't add it!
• Had {a b c f} {d} {e h} {g}: {a b c f} <> {e h}, add it. Now {a b c f e h} {d} {g}.
• Had {a b c f e h} {d} {g}: {d} <> {a b c e f h}, add it. Now {a b c d e f h} {g}.
• Check the next two smallest edges. Had {a b c d e f h} {g}: {a b c d e f h} = {a b c d e f h}, don't add it.
• Do add the last one: had {a b c d e f h} {g}; adding it joins g to the rest and completes the MST.
Kruskal's Algorithm

    Kruskal(G,w)                               ; Graph G, with weights w
      A <- {}                                  ; Our MST starts empty
      for each vertex v in V[G] do Make-Set(v) ; Make each vertex a set
      Sort edges of E by increasing weight
      for each edge (u,v) in E, in order
        ; Find-Set returns a representative (first vertex) in the set
        if Find-Set(u) != Find-Set(v)
          then A <- A ∪ {(u,v)}
               Union(u,v)                      ; Combines two trees
      return A
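A compact Java sketch of Kruskal's algorithm with a simple union-find in place of Make-Set/Find-Set/Union; the edge list in main is an assumed toy graph, and all names are illustrative.

    import java.util.Arrays;

    // Kruskal: sort edges by weight, add an edge if its endpoints are in different sets.
    public class KruskalSketch {
        static int[] parent;

        static int find(int x) {                        // Find-Set with path compression
            return parent[x] == x ? x : (parent[x] = find(parent[x]));
        }

        static void union(int a, int b) {               // combine two trees
            parent[find(a)] = find(b);
        }

        public static void main(String[] args) {
            int n = 4;
            // Assumed toy graph; each edge is {weight, u, v}.
            int[][] edges = {
                {1, 0, 1}, {4, 0, 2}, {3, 1, 2}, {2, 2, 3}, {5, 1, 3}
            };
            parent = new int[n];
            for (int v = 0; v < n; v++) parent[v] = v;  // Make-Set(v) for every vertex

            Arrays.sort(edges, (x, y) -> Integer.compare(x[0], y[0]));  // sort by weight

            int totalWeight = 0;
            for (int[] e : edges) {
                if (find(e[1]) != find(e[2])) {         // endpoints in different components?
                    union(e[1], e[2]);                  // safe edge: add it to the MST
                    totalWeight += e[0];
                    System.out.println("add edge (" + e[1] + "," + e[2] + ") weight " + e[0]);
                }
            }
            System.out.println("MST total weight = " + totalWeight);   // 6 for this toy graph
        }
    }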
Prim's Example
Example: the graph given earlier. Each vertex is labeled key/parent; all keys start at infinity except the root e, which starts at 0/nil.
Q = { (e,0) (a,inf) (b,inf) (c,inf) (d,inf) (f,inf) (g,inf) (h,inf) }
Extract min, vertex e. Update each neighbor if it is in Q and the edge weight < its key: b becomes 14/e, h becomes 3/e.
Q = { (a,inf) (b,14) (c,inf) (d,inf) (f,inf) (g,inf) (h,3) }
Extract min, vertex h. Update neighbors: b becomes 10/h, f becomes 8/h.
Q = { (a,inf) (b,10) (c,inf) (d,inf) (f,8) (g,inf) }
Extract min, vertex f. Update neighbors: c becomes 2/f, g becomes 15/f.
Q = { (a,inf) (b,10) (c,2) (d,inf) (g,15) }
Extract min, vertex c. Update neighbors: a becomes 4/c, b becomes 5/c, d becomes 9/c.
Q = { (a,4) (b,5) (d,9) (g,15) }
Extract min, vertex a. No keys are smaller than the corresponding edge weights from a (4 > 2 on edge ac, 6 > 5 on edge ab), so nothing is done.
Q = { (b,5) (d,9) (g,15) }
Extract min, vertex b. Same case: no keys are smaller than the edge weights, so nothing is done. The same holds when extracting d and g, and we are done.
Get the spanning tree by connecting each node with its parent:
[Figure: final key/parent labels a = 4/c, b = 5/c, c = 2/f, d = 9/c, e = 0/nil, f = 8/h, g = 15/f, h = 3/e.]
Prim's MST Algorithm
• Will find an MST, but the tree it produces may differ from Kruskal's if multiple MSTs are possible.

    MST-Prim(G,w,r)                          ; Graph G, weights w, root r
      Q <- V[G]
      for each vertex u in Q do key[u] <- infinity   ; infinite "distance"
      key[r] <- 0
      P[r] <- NIL
      while Q <> {} do
        u <- Extract-Min(Q)                  ; remove closest node
        ; Update children of u so they have a parent and a min key value;
        ; the key is the weight between node and parent
        for each v in Adj[u] do
          if v in Q and w(u,v) < key[v] then
            P[v] <- u
            key[v] <- w(u,v)
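A Java sketch of Prim's algorithm that uses a priority queue of (key, vertex) pairs instead of an explicit Extract-Min over Q; stale queue entries are simply skipped. The small graph in main is an assumed example, not the one from the slides.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.PriorityQueue;

    // Prim: grow the MST from a root, always adding the cheapest edge to a new vertex.
    public class PrimSketch {
        public static void main(String[] args) {
            int n = 4, root = 0;
            // Assumed toy graph as adjacency lists of {neighbor, weight} pairs.
            List<List<int[]>> adj = new ArrayList<>();
            for (int i = 0; i < n; i++) adj.add(new ArrayList<>());
            addEdge(adj, 0, 1, 1); addEdge(adj, 0, 2, 4);
            addEdge(adj, 1, 2, 3); addEdge(adj, 2, 3, 2); addEdge(adj, 1, 3, 5);

            int[] key = new int[n];               // cheapest known edge into the tree
            int[] parent = new int[n];            // P[v] from the pseudocode
            boolean[] inTree = new boolean[n];
            Arrays.fill(key, Integer.MAX_VALUE);
            Arrays.fill(parent, -1);
            key[root] = 0;

            PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
            pq.add(new int[]{0, root});           // {key, vertex}

            while (!pq.isEmpty()) {
                int u = pq.poll()[1];             // Extract-Min
                if (inTree[u]) continue;          // skip stale entries
                inTree[u] = true;
                for (int[] e : adj.get(u)) {
                    int v = e[0], w = e[1];
                    if (!inTree[v] && w < key[v]) {   // update the key of a fringe vertex
                        key[v] = w;
                        parent[v] = u;
                        pq.add(new int[]{w, v});
                    }
                }
            }
            System.out.println("parent[] = " + Arrays.toString(parent));  // [-1, 0, 1, 2]
        }

        static void addEdge(List<List<int[]>> adj, int u, int v, int w) {
            adj.get(u).add(new int[]{v, w});
            adj.get(v).add(new int[]{u, w});      // undirected
        }
    }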
Shortest Path Algorithms
Goal: Find the shortest path between vertices in a weighted graph. We denote the shortest path between vertex u and v as δ(u,v). This is a very practical problem - the weights may represent the shortest distance, shortest time, shortest cost, etc. There are several forms of this problem:
1. Single-source shortest path. Find the shortest distance from a source vertex s to every other vertex in the graph.
2. Single-destination shortest path. Find a shortest path to a given destination vertex t from every vertex v. This is just the reverse of single-source.
3. Single-pair shortest path. Find a shortest path between a pair of vertices. No algorithm is known for this problem that runs asymptotically faster than the best single-source algorithms.
4. All-pairs shortest path. Find the shortest path between every pair of vertices in the graph.
Note that BFS computes the shortest path on an unweighted graph.
Shortest Path?
Example: What is the shortest path from g to b?
[Figure: a weighted directed graph on vertices a through h; this graph is reused in the Dijkstra example below.]
Shortest Path
• We will keep track of the parents of a vertex in P(v); then we can output either a shortest path using parents or a shortest path tree.
• A shortest path tree is a subset of the graph with the shortest path for every vertex from a source (root). This is not unique.
• We won't use negative weights -- they require a different algorithm, since negative cycles can be travelled infinitely to make the weight cost lower and lower.
[Figure: the same graph with one edge weight changed to -12, creating a negative-cost cycle through a, b, and c.]
Ex: We can travel the a, b, c loop over and over again, each time reducing the weight cost by one!
One way people sometimes suggest addressing negative weights is to shift the edge values up to remove negative values; note, however, that adding the same constant to every edge does not preserve shortest paths in general, because paths with different numbers of edges are shifted by different amounts.
Relaxation Example
Example: If we have that the distance from f to b is 15 (going through the direct edge), the process of relaxation updates d[b] to be 7 by going through c.
[Figure: the example graph, with the label on b changing from 15 to 7.]

    Relax(u,v,w)
      if d[v] > d[u] + w(u,v) then
        d[v] <- d[u] + w(u,v)     ; decrement distance
        P[v] <- u                 ; indicate parent node
Dijkstra Example (0)
[Figure: the weighted example graph on vertices a through h, before the algorithm starts.]
Dijkstra Example 1
Initialize every node's distance to infinity and its parent to nil; the source f starts with distance 0.
S = { }, Q = { (a,inf) (b,inf) (c,inf) (d,inf) (e,inf) (f,0) (g,inf) (h,inf) }
[Figure: every vertex labeled INF,NIL except f, labeled 0,NIL.]
Extract min, vertex f. S = {f}. Update shorter paths.
Dijkstra Example 2
Q = { (a,inf) (b,15) (c,2) (d,4) (e,inf) (g,15) (h,inf) }
[Figure: b labeled 15,f; c labeled 2,f; d labeled 4,f; g labeled 15,f; the rest still INF,NIL.]
Extract min, vertex c. S = {f, c}. Update shorter paths.
Dijkstra Example 3
Q = { (a,6) (b,7) (d,3) (e,inf) (g,15) (h,inf) }
[Figure: a labeled 6,c; b labeled 7,c; d labeled 3,c; the rest unchanged.]
Extract min, vertex d. S = {f, c, d}. Update shorter paths (none).
Dijkstra Example 4
[Figure: labels unchanged from the previous step.]
Extract min, vertex a. S = {f, c, d, a}. Update shorter paths (none).
Extract min, vertex b. S = {f, c, d, a, b}. Update shorter paths.
Dijkstra Example 5
Q = { (e,inf) (g,15) (h,13) }
[Figure: h now labeled 13,b.]
Extract min, vertex h. S = {f, c, d, a, b, h}. Update shorter paths.
Dijkstra Example 6
Q = { (e,16) (g,15) }
[Figure: e now labeled 16,h.]
Extract min, vertex g and then vertex e -- nothing to update, done!
Dijkstra Example 7
• We can follow the parent "pointers" to get the path.
[Figure: final labels a = 6,c; b = 7,c; c = 2,f; d = 3,c; e = 16,h; f = 0,NIL; g = 15,f; h = 13,b.]
Dijkstra's Algorithm

    Dijkstra(G,w,s)                          ; Graph G, weights w, source s
      for each vertex v in G, set d[v] <- infinity and P[v] <- NIL
      d[s] <- 0
      S <- {}
      Q <- all vertices in G with associated d
      while Q not empty do
        u <- Extract-Min(Q)
        S <- S ∪ {u}
        for each vertex v in Adj[u] do
          if d[v] > d[u] + w(u,v) then
            d[v] <- d[u] + w(u,v)            ; decrement distance
            P[v] <- u                        ; indicate parent node
Dijkstra's algorithm
• S = {1}
• for i = 2 to n do D[i] = C[1,i] if there is an edge from 1 to i, infinity otherwise
• for i = 1 to n-1
    { choose a vertex w in V-S such that D[w] is min
      add w to S (where S is the set of visited nodes)
      for each vertex v in V-S do
        D[v] = min(D[v], D[w] + C[w,v])
    }
Where |V| = n
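A Java sketch of Dijkstra's algorithm in the same priority-queue style, with the relaxation step from Relax(u,v,w); the directed graph in main is an assumed example.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.PriorityQueue;

    // Dijkstra's single-source shortest paths; d[] and P[] follow the pseudocode above.
    public class DijkstraSketch {
        public static void main(String[] args) {
            int n = 5, source = 0;
            // Assumed toy directed graph as adjacency lists of {neighbor, weight} pairs.
            List<List<int[]>> adj = new ArrayList<>();
            for (int i = 0; i < n; i++) adj.add(new ArrayList<>());
            adj.get(0).add(new int[]{1, 10}); adj.get(0).add(new int[]{2, 3});
            adj.get(2).add(new int[]{1, 4});  adj.get(1).add(new int[]{3, 2});
            adj.get(2).add(new int[]{3, 8});  adj.get(3).add(new int[]{4, 7});

            int[] d = new int[n];                 // d[v]: best known distance from the source
            int[] p = new int[n];                 // P[v]: parent on a shortest path
            Arrays.fill(d, Integer.MAX_VALUE);
            Arrays.fill(p, -1);
            d[source] = 0;

            PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
            pq.add(new int[]{0, source});         // {distance, vertex}

            boolean[] done = new boolean[n];      // the set S of finished vertices
            while (!pq.isEmpty()) {
                int u = pq.poll()[1];             // Extract-Min(Q)
                if (done[u]) continue;            // skip stale queue entries
                done[u] = true;
                for (int[] e : adj.get(u)) {
                    int v = e[0], w = e[1];
                    if (!done[v] && d[u] + w < d[v]) {   // Relax(u, v, w)
                        d[v] = d[u] + w;
                        p[v] = u;
                        pq.add(new int[]{d[v], v});
                    }
                }
            }
            System.out.println("d = " + Arrays.toString(d));   // d = [0, 7, 3, 9, 16]
            System.out.println("P = " + Arrays.toString(p));   // P = [-1, 2, 0, 1, 3]
        }
    }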