Transcript: AVL Trees
CSE 326: Data Structures
Trees
Lecture 7: Wednesday, Jan 23, 2003
1
Outline
• Finish discussion on random binary search
trees (BST)
• AVL trees
• Reading assignment for this week:
Weiss: 4.3, 4.4, 4.5, and 4.7
2
The Average Depth of a BST
• Insert the elements 1 < 2 < ... < n in some
order, starting with the empty tree
• For each permutation π:
– Tπ = the BST after inserting π(1), π(2), ..., π(n)
• The Average Depth:
H(n) = (Σπ height(Tπ)) / n!
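As a small worked example of this formula (not from the slides): for n = 3, four of the 3! = 6 insertion orders produce a chain of height 2 and the other two produce the balanced tree of height 1, so H(3) = (2+2+2+2+1+1)/6 = 10/6 ≈ 1.67.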
3
The Average Depth of a BST
• The average depth of a BST is:
H(n) = Θ(log n)
• For some π, height(Tπ) = O(log n)
• For other π, height(Tπ) = O(n)
• But the average is O(log n)
• Please read the proof in the book and/or slides !
4
n versus log n
Why is the average depth of BSTs made from
random inputs different from the average
depth of all possible BSTs?
[Plot: the curves log n and n]
Because there are more ways to build shallow
trees than deep trees!
5
Random Input vs. Random Trees
For three items, the shallowest tree is twice as
likely as any other; the effect grows as n
increases. For n = 4, the probability of getting a
shallow tree is > 50%.
Inputs: 1,2,3   3,2,1   1,3,2   3,1,2   2,1,3   2,3,1
[Figure: the corresponding BSTs; the two orders 2,1,3 and 2,3,1 both give the shallow tree rooted at 2, while each of the other four orders gives a different deep tree]
6
Average cost
• The average, amortized cost of n insert/find
operations is O(log n)
• But the average, amortized cost of n
insert/find/delete operations can be as bad as
Θ(n)
– Deletions make life harder (recall stretchy arrays)
• Need guaranteed cost O(log n) – next
7
Beauty is Only Θ(log n) Deep
• Binary Search Trees are fast if they’re shallow,
e.g., a complete tree
• Problems occur when one branch is much
longer than the other
How to capture the notion of a “sort of” complete
tree?
8
Balance
[Figure: a node t with one subtree of height 5 and the other of height 6]
balance = height(left subtree) - height(right subtree)
• convention: height of a “null” subtree is -1
• zero everywhere ⇒ perfectly balanced
• small everywhere ⇒ balanced enough: Θ(log n)
– Precisely: Maximum depth is 1.44 log n
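A minimal Java-style sketch of this definition, in the spirit of the insert code later in the lecture (the Node type with a cached height field, and these method names, are assumptions, not from the slides):

  // Height of a subtree, with the convention above: a "null" subtree has
  // height -1, so a single leaf node has height 0.
  static int height(Node t) {
      return (t == null) ? -1 : t.height;
  }

  // balance = height(left subtree) - height(right subtree)
  static int balance(Node t) {
      return height(t.left) - height(t.right);
  }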
9
AVL Tree
(Adelson-Velskii and Landis)
• Binary search tree
properties
• Balance of every node is -1 ≤ b ≤ 1
• Tree re-balances itself
after every insert or
delete
[Figure: an example AVL tree containing the keys 2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
What is the balance of each node in this tree?
10
AVL Tree Data Structure
Each node stores its height, its data, and pointers to its children.
[Figure: an example AVL tree with root 10 (height 3), each node annotated with its height]
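A rough sketch of the node record this figure describes, matching the insert code later in the lecture (field names are assumptions; the slide only shows a height, the data, and two child pointers):

  class Node {
      Comparable key;    // the "data" field
      int height;        // cached height of the subtree rooted at this node
      Node left, right;  // the two children

      Node(Comparable x) {
          key = x;
          height = 0;    // a newly hung node is a leaf
      }
  }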
11
Not An AVL Tree
[Figure: a similar binary search tree of height 4 in which at least one node's balance falls outside the range -1..1, so it is not an AVL tree]
12
Bad Case #1
Insert(small)
Insert(middle)
Insert(tall)
[Figure: the result is a chain: S (height 2) with right child M (height 1), which has right child T (height 0); S's balance is -2]
13
Single Rotation
[Figure: a single rotation at S pulls M up: M (height 1) becomes the root with children S and T (each height 0)]
Basic operation used in AVL trees:
A right child could legally have its
parent as its left child.
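A sketch of that operation in the same Java-like style as the later insert code (the method name and the null-safe height() helper are assumptions, not from the slides):

  // Single rotation for the picture above: S's right child M is pulled up,
  // S drops down to become M's left child, and M's old left subtree (empty
  // in the picture) moves under S. Written for any node k1 with right child k2.
  static Node rotateWithRightChild(Node k1) {
      Node k2 = k1.right;
      k1.right = k2.left;   // k2's old left subtree becomes k1's right subtree
      k2.left = k1;
      k1.height = Math.max(height(k1.left), height(k1.right)) + 1;
      k2.height = Math.max(height(k2.left), height(k2.right)) + 1;
      return k2;            // k2 is the new root of this subtree
  }

The mirror image, rotateWithLeftChild, swaps left and right everywhere and handles the case on the next slide, where the insert lands in the left child's left subtree.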
14
General Case: Insert Unbalances
[Figure: node a has left child b and right subtree Z (height h-1); b has subtrees X and Y (each height h-1), so the subtree rooted at a has height h+1. An insert into X raises X to height h and a's subtree to height h+2, unbalancing a. A single rotation makes b the new root of the subtree, with X as its left subtree and a (over Y and Z) as its right child; the subtree's height returns to h+1.]
15
Properties of General Insert +
Single Rotation
• Restores balance at the lowest point in the
tree where the imbalance occurs
• After rotation, height of the subtree (in the
example, h+1) is the same as it was before
the insert that imbalanced it
• Thus, no further rotations are needed
anywhere in the tree!
16
Bad Case #2
Insert(small)
Insert(tall)
Insert(middle)
[Figure: the result is the zig-zag S (height 2) with right child T (height 1), whose left child is M (height 0)]
Why won’t a single rotation (bringing T up to
the top) fix this?
17
Double Rotation
[Figure: first a single rotation at T pulls M up, turning the zig-zag S - T - M into the chain S - M - T; then a single rotation at S pulls M up again, leaving M (height 1) with children S and T]
18
General Double Rotation
[Figure: node a has left child b and right subtree Z; b has left subtree W and right child c; c has subtrees X and Y. An insert into X makes a too deep. The double rotation pulls c up: c becomes the root of this subtree, with b (over W and X) as its left child and a (over Y and Z) as its right child.]
• Initially: insert into X unbalances tree (root height goes to h+3)
• “Zig zag” to pull up c – restores root height to h+2, left subtree
height to h
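A sketch of the double rotation in the same style (again, the method names are assumptions; the single-rotation helpers are the ones sketched earlier):

  // Double rotation for the zig-zag above: a's left child is b, and the
  // insert went below b's right child c. A single rotation at b pulls c
  // above b, then a single rotation at a pulls c up to the root of the
  // subtree, leaving c with children b and a.
  static Node doubleWithLeftChild(Node a) {
      a.left = rotateWithRightChild(a.left);  // step 1: c over b
      return rotateWithLeftChild(a);          // step 2: c over a
  }

The mirror image, doubleWithRightChild, handles a zig-zag hanging off the right child.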
19
Another Double Rotation Case
[Figure: the symmetric case: the insert goes into Y, c's other subtree, instead of X; the same double rotation (pulling c up over b and a) restores balance]
• Initially: insert into Y unbalances tree (root height goes to h+3)
• “Zig zag” to pull up c – restores root height to h+2, left subtree
height to h
20
Insert Algorithm
• Find spot for value
• Hang new node
• Search back up looking for imbalance
• If there is an imbalance:
– “outside”: Perform single rotation and exit
– “inside”: Perform double rotation and exit
21
AVL Insert Algorithm
Node insert(Comparable x, Node root){
  if ( root == NULL ) return new Node(x);
  if (x == root.key) return root;
  if (x < root.key){
    root.left = insert( x, root.left );
    if (root unbalanced) { rotate... }
  } else { // x > root.key
    root.right = insert( x, root.right );
    if (root unbalanced) { rotate... }
  }
  root.height = max(root.left.height, root.right.height) + 1;
  return root;
}
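The "root unbalanced" / "rotate..." placeholders can be filled in roughly as below. This is only a sketch; it relies on the height(), rotateWith*Child, and doubleWith*Child helpers sketched on earlier slides, and none of the names come from the lecture itself:

  // Check the subtree rooted at 'root' after an insert (or delete) below it,
  // rotate if some balance has reached 2, and recompute the cached height.
  // Returns the (possibly new) root of this subtree.
  static Node rebalance(Node root) {
      if (height(root.left) - height(root.right) == 2) {         // left too tall
          if (height(root.left.left) >= height(root.left.right))
              root = rotateWithLeftChild(root);    // "outside": single rotation
          else
              root = doubleWithLeftChild(root);    // "inside": double rotation
      } else if (height(root.right) - height(root.left) == 2) {  // right too tall
          if (height(root.right.right) >= height(root.right.left))
              root = rotateWithRightChild(root);   // "outside": single rotation
          else
              root = doubleWithRightChild(root);   // "inside": double rotation
      }
      root.height = Math.max(height(root.left), height(root.right)) + 1;
      return root;
  }

With this helper, each "if (root unbalanced) { rotate... }" line becomes root = rebalance(root), and the final height update is no longer needed, since rebalance recomputes it.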
22
Deletion (Really Easy Case)
Delete(17)
[Figure: the example AVL tree (root 10, height 3); 17 is a leaf under 20, and removing it leaves every node's balance within -1..1, so no rebalancing is needed]
23
Deletion (Pretty Easy Case)
Delete(15)
[Figure: the same tree; 15 is an internal node with children 12 and 20, so it cannot simply be unlinked]
24
Deletion (Pretty Easy Case cont.)
Delete(15)
[Figure: 15's key is replaced by its successor 17 (the smallest key in its right subtree), and the leaf 17 is removed; all balances stay within -1..1]
25
Deletion (Hard Case #1)
Delete(12)
[Figure: the tree from the previous slide; removing the leaf 12 leaves node 17 with an empty left subtree and a right subtree of height 1, so its balance is -2]
26
Single Rotation on Deletion
[Figure: a single rotation at 17 pulls 20 up: 20 becomes the right child of the root 10, with children 17 and 30]
What is different about deletion compared with insertion?
27
Deletion (Hard Case)
Delete(9)
[Figure: a larger AVL tree: root 10 (height 4); left child 5 with children 2 (which has right child 3) and 9; right child 17 with children 12 (over 11 and 15, 15 having left child 13) and 20 (over 18 and 30, 30 having right child 33). Removing the leaf 9 leaves node 5 unbalanced: its left subtree has height 1 and its right subtree is empty.]
28
Double Rotation on Deletion
[Figure: a double rotation at 5 pulls 3 up: 3 takes 5's place with children 2 and 5. But that subtree's height has dropped from 2 to 1, so the root 10 is now unbalanced.]
Not finished!
29
Deletion with Propagation
[Figure: the tree is now root 10 (height 4) with left child 3 (height 1, children 2 and 5) and right child 17 (height 3, children 12 and 20); the root's balance is -2]
What is different about this case?
We get to choose whether
to single or double rotate!
30
Propagated Single Rotation
[Figure: a single rotation at the root pulls 17 up: 17 (height 4) becomes the new root with children 10 and 20; 10 keeps 3 (over 2 and 5) and takes 12 (over 11 and 15, 15 having child 13) as its right child; 20 keeps 18 and 30 (over 33)]
31
Propagated Double Rotation
[Figure: a double rotation at the root pulls 12 up: 12 (height 4) becomes the new root with children 10 and 17; 10 keeps 3 (over 2 and 5) and takes 11 as its right child; 17 gets 15 (over 13) as its left child and keeps 20 (over 18 and 30, which has 33)]
32
AVL Deletion Algorithm
• Recursive
1. If at the node, delete it
2. Otherwise recurse into the proper subtree to find it
3. Correct heights on the way back up
a. If imbalance #1, single rotate
b. If imbalance #2 (or don’t care), double rotate
• Iterative
1. Search downward for the node, stacking the parent nodes
2. Delete the node
3. Unwind the stack, correcting heights
a. If imbalance #1, single rotate
b. If imbalance #2 (or don’t care), double rotate
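A sketch of the recursive version in the same Java-like style as the insert code (see the Weiss reading for the full treatment; the rebalance() and height() helpers are the ones sketched earlier, and findMin is an assumed helper):

  // Recursive AVL delete, following the outline above: find the node,
  // remove it (a node with two children has its key replaced by the
  // smallest key in its right subtree), then correct heights and
  // rebalance as the recursion unwinds.
  static Node delete(Comparable x, Node root) {
      if (root == null) return null;                   // not found: no change
      int cmp = x.compareTo(root.key);
      if (cmp < 0)
          root.left = delete(x, root.left);
      else if (cmp > 0)
          root.right = delete(x, root.right);
      else if (root.left != null && root.right != null) {
          root.key = findMin(root.right).key;          // two children
          root.right = delete(root.key, root.right);   // remove the successor
      } else {
          root = (root.left != null) ? root.left : root.right;  // 0 or 1 child
          if (root == null) return null;
      }
      return rebalance(root);                          // fix height, rotate if needed
  }

  static Node findMin(Node t) {
      while (t.left != null) t = t.left;
      return t;
  }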
33
Pros and Cons of AVL Trees
Pro:
• All operations guaranteed O(log N)
• The height balancing adds no more than a
constant factor to the speed of insertion
Con:
• Space consumed by height field in each node
• Slower than ordinary BST on random data
Can we guarantee O(log N) performance with less
overhead? Splay trees next time
34