
Advanced Data Structures
By:
Nikhil Bhargava
Outline
• Introduction to ADT.
• Binary Search Tree (BST)
– Definition.
– Insertion, Deletion and Search.
– Performance.
• Binary Heaps.
• Binomial Trees.
• Binomial Heaps.
• Red-Black Trees
– Definition.
– Addition and Deletion.
• Skip Lists.
Introduction
• What’s the need for an ADT?
• Simply store everything in a big array.
• Be happy.
• Problem
– Search for a number k in a set of N numbers.
• Solution (see the sketch below)
– Store the numbers in an array of size N.
– Iterate through the array until k is found.
– Number of checks:
• Best case: 1 (k = 15)
• Worst case: N (k = 27)
• Average case: N/2
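A minimal Python sketch of the array-based solution, counting the comparisons made (the array contents and names here are purely illustrative):

def linear_search(arr, k):
    """Scan left to right; return (index of k, number of checks), or (-1, N)."""
    checks = 0
    for i, x in enumerate(arr):
        checks += 1
        if x == k:
            return i, checks      # best case: 1 check
    return -1, checks             # worst case: N checks

# Example: searching 1..10 for the last element costs 10 checks.
print(linear_search(list(range(1, 11)), 10))   # (9, 10)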
Introduction
• Better solution
– Store the numbers in a Binary Search Tree (BST).
– Search the tree until k is found.
– Number of checks:
• Best case: 1 (k = 15)
• Worst case: log2 N (k = 27)
• Average case: (log2 N) / 2
Analysis
• What’s the difference between the two solutions?
– O(N) vs. O(log2 N) checks.
Analysis
• Assume
– N = 1,000,000,000
• 1 billion (Walmart transactions in 100 days)
– 1 GHz processor = 10^9 cycles per second.
• Solution 1 (10 cycles per check)
– Worst case: 1 billion checks = 10 seconds.
• Solution 2 (100 cycles per check)
– Worst case: 30 checks = 0.000003 seconds.
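A quick back-of-the-envelope check of these figures (the per-check cycle counts are the slide’s assumptions):

N = 10**9                      # one billion keys
cycles_per_second = 10**9      # 1 GHz processor
print(N * 10 / cycles_per_second)     # linear search: 10.0 seconds
print(30 * 100 / cycles_per_second)   # BST search: 3e-06 seconds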
Moral of the story!!!
• Choosing the most appropriate data structure makes the system efficient, fast and cost-effective.
• Appropriate data structures ease design and improve performance.
• Designing an appropriate data structure and finding the associated algorithms for a problem is a major challenge.
Binary Search Tree (BST)
• A BST is a binary tree storing keys (or key-value entries) at its internal nodes and satisfying the following property:
– Let u, v, and w be three nodes such that u is in the left subtree of v and w is in the right subtree of v. Then key(u) ≤ key(v) ≤ key(w).
• External nodes do not store items.
Binary Search Tree (BST)
• An in-order traversal of a binary search tree visits the keys in increasing order.
BST – Search Operation
• To search for a key k, we trace a downward path starting at the root.
• The next node visited depends on the outcome of the comparison of k with the key of the current node.
• If we reach a leaf, the key is not found and we return null.
BST – Search Operation
Algorithm TreeSearch(k, v)
    if T.isExternal(v)
        return v
    if k < key(v)
        return TreeSearch(k, T.left(v))
    else if k = key(v)
        return v
    else { k > key(v) }
        return TreeSearch(k, T.right(v))
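A rough Python equivalent of TreeSearch, using None in place of external (empty) nodes; the Node class here is a hypothetical helper, not part of the slides:

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_search(k, v):
    if v is None:                    # reached an external node: not found
        return None
    if k < v.key:
        return tree_search(k, v.left)
    elif k == v.key:
        return v
    else:                            # k > v.key
        return tree_search(k, v.right)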
BST – Search Operation
• For example, find(4): call TreeSearch(4, root).
BST – Insert Operation
• To perform operation insert(k, o), we search for key k (using TreeSearch).
• Assume k is not already in the tree, and let w be the leaf reached by the search.
• We insert k at node w and expand w into an internal node.
• For example, insert(5), as sketched below.
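A hedged sketch of insert on the same simplified Node representation used above (inserting at the leaf position reached by the search; assumes k is not already present, as on the slide):

def tree_insert(root, k):
    if root is None:                 # expand the reached leaf into a node
        return Node(k)
    if k < root.key:
        root.left = tree_insert(root.left, k)
    else:
        root.right = tree_insert(root.right, k)
    return root

# Usage: root = tree_insert(root, 5)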
BST – Insert Operation
(figure: the tree after insert(5))
BST – Delete Operation
• To perform operation remove(k), we search for key k.
• Assume key k is in the tree, and let v be the node storing k.
• If node v has a leaf child w, we remove v and w from the tree with operation removeExternal(w), which removes w and its parent.
• For example, remove(4).
BST – Delete Operation
(figure: the tree after remove(4))
BST – Delete Operation
• We consider the case where the key k to be removed is stored at a node v whose children are both internal:
– we find the internal node w that follows v in an inorder traversal (the inorder successor of v).
– we copy key(w) into node v, so v now holds the key of its inorder successor.
– we remove node w and its left child z (which must be a leaf) by means of operation removeExternal(z).
– For example, remove(3), as sketched below.
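A sketch of remove on the simplified Node representation (an ordinary internal-node BST rather than the external-node model of the slides); for a node with two children it copies the key of the inorder successor and then splices that successor out:

def tree_delete(root, k):
    if root is None:
        return None
    if k < root.key:
        root.left = tree_delete(root.left, k)
    elif k > root.key:
        root.right = tree_delete(root.right, k)
    else:
        if root.left is None:        # zero or one child: splice v out
            return root.right
        if root.right is None:
            return root.left
        w = root.right               # inorder successor w: leftmost node
        while w.left is not None:    # of the right subtree
            w = w.left
        root.key = w.key             # copy key(w) into v
        root.right = tree_delete(root.right, w.key)   # remove w
    return root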
BST – Delete Operation
(figures: steps of the remove(3) example)
BST - Performance
• Consider a dictionary with n items implemented by means of a binary search tree of height h:
– the space used is O(n).
– the methods find, insert and remove take O(h) time.
• The height h is O(n) in the worst case and O(log n) in the best case.
BST – Performance
(figure: best-case vs. worst-case tree shapes)
Binary Heap: Definition
• Binary heap.
– Almost complete binary tree.
• filled on all levels, except last, where filled from left to right.
– Min-heap ordered.
• every child greater than (or equal to) parent.
(figure: example min-heap)
Binary Heap: Properties
• Properties.
– Min element is in root.
– Heap with N elements has height = log2 N.
(figure: example heap with N = 14, height = 3)
Binary Heaps: Array Impl.
• Implementing binary heaps.
– Use an array: no need for explicit parent or child pointers.
• Parent(i) = i/2
• Left(i) = 2i
• Right(i) = 2i + 1
(figure: the example heap stored in an array at indices 1 to 14)
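The index arithmetic in a 1-based array (index 0 unused), as a small Python sketch; the stored values are only an illustration of a valid min-heap:

heap = [None, 6, 14, 45, 78, 18, 47, 53, 83, 91, 81, 77, 84, 99, 64]

def parent(i): return i // 2
def left(i):   return 2 * i
def right(i):  return 2 * i + 1

print(heap[parent(10)], heap[left(3)], heap[right(3)])   # 18 47 53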
Binary Heap: Insertion
• Insert element x into heap.
– Insert into next available slot.
– Bubble up until it's heap ordered.
• Peter principle: nodes rise to their level of incompetence.
(figure: the new key 42 placed in the next free slot)
Binary Heap: Insertion
• Insert element x into heap.
– Insert into next available slot.
– Bubble up until it's heap ordered.
• Peter principle: nodes rise to their level of incompetence.
(figure: 42 swaps with its parent)
Binary Heap: Insertion
• Insert element x into heap.
– Insert into next available slot.
– Bubble up until it's heap ordered.
• Peter principle: nodes rise to their level of incompetence.
(figure: 42 swaps with its parent again)
Binary Heap: Insertion
• Insert element x into heap.
– Insert into next available slot.
– Bubble up until it's heap ordered.
• Peter principle: nodes rise to their level of incompetence.
– O(log N) operations.
(figure: stop, heap ordered; 42 in its final position)
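A minimal bubble-up insert on the 1-based array layout sketched earlier (assumes heap[0] is an unused placeholder):

def heap_insert(heap, x):
    heap.append(x)               # place x in the next free slot
    i = len(heap) - 1
    while i > 1 and heap[i // 2] > heap[i]:
        heap[i // 2], heap[i] = heap[i], heap[i // 2]   # swap with parent
        i //= 2                  # at most O(log N) swaps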
Binary Heap: Decrease Key
• Decrease key of element x to k.
– Bubble up until it's heap ordered.
– O(log N) operations.
(figure: example heap after a decrease-key)
Binary Heap: Delete Min
• Delete minimum element from heap.
– Exchange root with rightmost leaf.
– Bubble root down until it's heap ordered.
• power struggle principle: better subordinate is promoted.
(figure: the heap before delete-min)
Binary Heap: Delete Min
• Delete minimum element from heap.
– Exchange root with rightmost leaf.
– Bubble root down until it's heap ordered.
• power struggle principle: better subordinate is promoted.
(figure: root exchanged with the rightmost leaf)
Binary Heap: Delete Min
• Delete minimum element from heap.
– Exchange root with rightmost leaf.
– Bubble root down until it's heap ordered.
• power struggle principle: better subordinate is promoted.
(figure: exchange with left child)
Binary Heap: Delete Min
• Delete minimum element from heap.
– Exchange root with rightmost leaf.
– Bubble root down until it's heap ordered.
• power struggle principle: better subordinate is promoted.
(figure: exchange with right child)
Binary Heap: Delete Min
• Delete minimum element from heap.
– Exchange root with rightmost leaf.
– Bubble root down until it's heap ordered.
• power struggle principle: better subordinate is promoted.
– O(log N) operations.
(figure: stop, heap ordered)
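A matching delete-min sketch for the 1-based array heap (assumes the heap is non-empty):

def heap_delete_min(heap):
    min_key = heap[1]
    heap[1] = heap[-1]           # exchange root with rightmost leaf
    heap.pop()
    i, n = 1, len(heap) - 1
    while 2 * i <= n:
        c = 2 * i                # pick the smaller child
        if c + 1 <= n and heap[c + 1] < heap[c]:
            c += 1
        if heap[i] <= heap[c]:   # stop: heap ordered
            break
        heap[i], heap[c] = heap[c], heap[i]
        i = c
    return min_key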
Binary Heap: Heapsort
• Heapsort.
– Insert N items into binary heap.
– Perform N delete-min operations.
– O(N log N) sort.
– No extra storage.
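A quick heapsort sketch using Python's heapq module (a binary min-heap over a list); note this version uses an auxiliary list, whereas the in-place variant the slide alludes to builds the heap inside the input array:

import heapq

def heapsort(items):
    h = []
    for x in items:                      # N inserts, O(log N) each
        heapq.heappush(h, x)
    return [heapq.heappop(h) for _ in range(len(h))]   # N delete-mins

print(heapsort([5, 1, 4, 2, 3]))   # [1, 2, 3, 4, 5]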
Binary Heap: Union
• Combine two binary heaps H1 and H2 into one heap.
• No easy solution.
– Ω(N) operations apparently required.
• Can support fast union with fancier heaps.
(figure: two heaps H1 and H2)
Binomial Tree
• Binomial tree.
– Recursive definition: B0 is a single node; Bk is formed by linking two copies of Bk-1, making one root the leftmost child of the other.
(figure: B0, B1, B2, B3, B4)
Binomial Tree
• Useful properties of an order-k binomial tree Bk:
– Number of nodes = 2^k.
– Height = k.
– Degree of root = k.
– Deleting the root yields binomial trees Bk-1, ..., B0.
• Proof.
– By induction on k.
(figure: Bk+1 built from two copies of Bk; the trees B0 through B4)
Binomial Tree
• A property useful for naming the data structure.
– Bk has C(k, i) = k! / (i! (k-i)!) nodes at depth i (the binomial coefficient "k choose i").
(figure: B4 has 1, 4, 6, 4, 1 nodes at depths 0 through 4)
Binomial Heap
• Binomial heap (Vuillemin, 1978).
– Sequence of binomial trees that satisfies the binomial heap property:
• each tree is min-heap ordered.
• 0 or 1 binomial tree of order k.
(figure: a binomial heap made up of trees B4, B1, B0)
Binomial Heap: Implementation
• Implementation.
– Represent trees using left-child, right-sibling pointers.
• three links per node (parent, left, right).
– Roots of trees connected with a singly linked list.
• degrees of trees strictly decreasing from left to right.
(figure: a binomial heap and its pointer representation, labeled "Leftist Power-of-2 Heap")
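A hypothetical node layout for the left-child, right-sibling representation just described (field names are illustrative, not from the slides):

class BinomialNode:
    def __init__(self, key):
        self.key = key
        self.degree = 0          # order of the subtree rooted here
        self.parent = None
        self.child = None        # leftmost child
        self.sibling = None      # next sibling, or next root in the root list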
Binomial Heap: Properties
• Properties of N-node binomial heap.
– Min key is contained in the root of one of B0, B1, ..., Bk.
– Contains binomial tree Bi iff bi = 1, where bn...b2b1b0 is the binary representation of N.
– At most ⌊log2 N⌋ + 1 binomial trees.
– Height ≤ ⌊log2 N⌋.
(figure: a binomial heap with N = 19, # trees = 3, height = 4, binary = 10011)
Binomial Heap: Union
• Create heap H that is union of heaps H' and H''.
– "Mergeable heaps."
– Easy if H' and H'' are each order k binomial trees.
• connect roots of H' and H''.
• choose the smaller key to be the root of H.
(figure: linking two order-k trees H' and H'')
Binomial Heap: Union
(figures: step-by-step union of a 19-node heap and a 7-node heap; trees of equal order are linked pairwise and carried, like the binary addition 19 + 7 = 26, i.e. 10011 + 00111 = 11010)
Binomial Heap: Union
• Create heap H that is union of heaps H' and H''.
– Analogous to binary addition.
• Running time: O(log N).
– Proportional to the number of trees in the root lists ≤ 2(⌊log2 N⌋ + 1).
(figure: the binary addition 19 + 7 = 26)
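The key step of Union is linking two trees of the same order k into one tree of order k+1, keeping the smaller root on top; a sketch using the BinomialNode fields from the earlier block (the full union then walks both root lists in increasing order, linking equal-order trees like binary addition with carries):

def link(t1, t2):
    if t2.key < t1.key:
        t1, t2 = t2, t1          # ensure t1 has the smaller root key
    t2.parent = t1
    t2.sibling = t1.child        # t2 becomes the leftmost child of t1
    t1.child = t2
    t1.degree += 1
    return t1                    # root of the combined order k+1 tree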
Binomial Heap: Delete Min
• Delete the node with the minimum key in binomial heap H.
– Find root x with min key in the root list of H, and delete it.
– H' ← broken binomial trees.
– H ← Union(H', H).
• Running time: O(log N).
(figure: the example heap H)
Binomial Heap: Delete Min
• Delete the node with the min key in binomial heap H.
– Find root x with min key in the root list of H, and delete it.
– H' ← broken binomial trees.
– H ← Union(H', H).
• Running time: O(log N).
(figure: after deleting the min root, its subtrees form H'; the result is Union(H', H))
Binomial Heap: Decrease Key
• Decrease key of node x in binomial heap H.
– Suppose x is in binomial tree Bk.
– Bubble node x up the tree if x is too small.
• Running time: O(log N).
– Proportional to the depth of node x ≤ ⌊log2 N⌋.
(figure: node x at depth 3 in the example heap H)
Binomial Heap: Delete
• Delete node x in binomial heap H.
– Decrease key of x to -.
– Delete min.
• Running time O(log N)
Binomial Heap: Insert
• Insert a new node x into binomial heap H.
– H'  MakeHeap(x)
– H  Union(H', H)
• Running time. O(log N)
(figure: inserting the one-node heap H' = {x} into H)
Binomial Heap: Sequence of Inserts
• Insert a new node x into binomial heap H.
– If N = .......0, then only 1 step.
– If N = ......01, then only 2 steps.
– If N = .....011, then only 3 steps.
– If N = ....0111, then only 4 steps.
(figure: inserting x into the example heap)
Binomial Heap: Sequence of Inserts
• Inserting 1 item can take Ω(log N) time.
– If N = 11...111, then log2 N steps.
• But inserting a sequence of N items takes O(N) time!
– (N/2)(1) + (N/4)(2) + (N/8)(3) + ... ≤ N · Σ_{n≥1} n/2^n = 2N
– Amortized analysis.
– Basis for getting most operations down to constant time.
Red-Black Tree
• Popular alternative to the plain BST (why?).
• Operations take O(log N) time in the worst case.
• Height is at most 2 log(N + 1).
• A red-black tree is a binary search tree with one extra attribute for each node: the color, which is either red or black.
• The root is black.
• If a node is red, its children must be black.
• Every path from a node to a null reference must contain the same number of black nodes.
• Basic operations to conform with the rules are color changes and tree rotations.
Red-Black Tree
Theorem 1 – In a red-black tree, at least half the nodes on any path from the root to a leaf must be black.
Proof – If there is a red node on the path, its child on that path must be black (a red node's children are black), so each red node can be paired with a distinct black node.
Red-Black Tree
Theorem 2 – In a red-black tree, no path from any node N to a leaf is more than twice as long as any other path from N to a leaf.
Proof: By definition, every path from a node to any leaf contains the same number of black nodes. By Theorem 1, at least half the nodes on any such path are black. Therefore, there can be no more than twice as many nodes on any path from N to a leaf as on any other such path; that is, no path is more than twice as long as any other.
Red-Black Tree
Theorem 3 – A red-black tree with n internal nodes has height h ≤ 2 log(n + 1).
Proof: Let h be the height of the red-black tree with root x. By Theorem 1,
    bh(x) ≥ h/2.
Since the subtree rooted at any node x contains at least 2^bh(x) − 1 internal nodes,
    n ≥ 2^(h/2) − 1
    n + 1 ≥ 2^(h/2)
    log(n + 1) ≥ h/2
    2 log(n + 1) ≥ h.
Bottom-Up Insertion
• Cases:
– 0: X is the root – color it black.
– 1: Both parent and uncle are red – color parent and uncle black, color grandparent red, point X to grandparent, check the new situation.
– 2 (zig-zag): Parent is red, but uncle is black; X and its parent are opposite-type children – color grandparent red, color X black, rotate left on the parent, rotate right on the grandparent.
– 3 (zig-zig): Parent is red, but uncle is black; X and its parent are both left or both right children – color parent black, color grandparent red, rotate right on the grandparent (see the rotation sketch below).
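The rotations referred to above, as a small sketch on a hypothetical RBNode class (only the structural part is shown; recoloring and re-linking the returned subtree root into its parent are left to the caller):

class RBNode:
    def __init__(self, key, color="red", left=None, right=None):
        self.key, self.color, self.left, self.right = key, color, left, right

def rotate_right(g):
    """Single right rotation around g, e.g. the grandparent in the zig-zig case."""
    p = g.left
    g.left = p.right
    p.right = g
    return p                     # new root of this subtree

def rotate_left(p):
    """Single left rotation around p, e.g. the first half of the zig-zag case."""
    c = p.right
    p.right = c.left
    c.left = p
    return c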
Top-Down Red-Black Trees
• In top-down insertion, the corrections are done while traversing down the tree to the insertion point.
• When the actual insertion is done, no further corrections are needed, so there is no need to traverse back up the tree.
• So, top-down insertion can be done iteratively, which is generally faster.
• Insertion is always done at a leaf (as in ordinary BST insertion).
Process
• On the way down, when we see a node X that has two red children, we make X red and its two children black.
• If X’s parent is red, we can apply either a single or a double rotation to keep us from having two consecutive red nodes.
• X’s parent and the parent’s sibling cannot both be red, since their colors would already have been flipped in that case.
Example: Insert 45
(figure: the tree before inserting 45; on the way down, a node with two red children is found)
Example (Cont.)
(figure: flip colors; this creates two consecutive red nodes)
Example (Cont.): Do a single rotation
(figure: the tree after the single rotation)
Example (Cont.): Now Insert 45
(figure: the tree after inserting 45)
Important note
• Since the parent of the newly inserted node was black, we are done.
• Had the parent of the inserted node been red, one more rotation would have had to be performed.
• Although red-black trees have slightly weaker balancing properties, their performance is experimentally almost identical to that of AVL trees.
Top-Down Deletions
• Recall that in deleting from a binary search tree, the only nodes which are actually removed are leaves or nodes with exactly one child.
• Nodes with two children are never removed; their contents are just replaced.
• If the node to be deleted is red, there is no problem: just delete the node.
• If the node to be deleted is black, its removal will violate the equal-black-count property.
• The solution is to ensure that any node to be deleted is red.
Deterministic Skip Lists
• A probabilistically balanced linked list.
• Invented in 1989 by William Pugh.
• Definition: Two elements are linked if there exists at least one link going from one to the other.
• Definition: The gap size between two elements linked at height h is equal to the number of elements of height h-1 between them.
Skip List
(figure: extra pointers every eighth item - the full deterministic structure)
(figure: a skip list - same link distribution, random choice; keys 3, 6, 7, 9, 12, 17, 19, 21, 25, 26)
Search time
• In the deterministic version (a-d):
– in a, we need to check at most n nodes
– in b, at most n/2 + 1 nodes
– in c, at most n/4 + 2 nodes
– in general, at most log N nodes
• Efficient search, but impractical insertion and deletion.
• Not a real-time data structure.
• Probabilistic in nature.
Levels
• A node with k forward pointers is called a level k node.
• If every (2^i)th node has a pointer 2^i nodes ahead, the levels have the following distribution:
  level:    1    2    3    ...
  percent:  50   25   12.5 ...
Central idea in skip lists
• Choose levels of nodes randomly, but in the same proportions (as in e).
• A node’s ith forward pointer points to the next node of level i or higher.
• Insertions and deletions require only local modifications.
• A node’s level never changes after it is first chosen (see the sketch below).
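A small sketch of the node layout and the randomized level choice just described; level i is reached with probability about 1/2^i, matching the distribution in the table above (max_level and p are illustrative parameters):

import random

class SkipNode:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level    # forward[i]: next node of level > i

def random_level(max_level=16, p=0.5):
    level = 1
    while random.random() < p and level < max_level:
        level += 1
    return level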
Insertion
• To perform insertion, we must make sure that when a new node of height h is added, it does not create a gap of four nodes of height h (in the 1-2-3 deterministic skip list, every gap size must be 1, 2, or 3).
Data Structure Selection in Programming
• Tough question (agree or not!).
• Comes with practice and experience.
• Remember the golden rule:
  The data structure will affect the runtime of your algorithm, but not the result.
• Pick the data structure depending on the problem:
– time vs. space tradeoff (e.g., hash tables provide fast search but need many more slots than entries).
Data Structure Selection in Programming
• Think about these questions before deciding on the right data structure:
– What are my end objectives?
– What features does the data structure provide?
– What are the costs (time, space, etc.) associated with them? (A free lunch is only available in a mouse trap!)
– How well does this combination suit the particular algorithm I have in mind?
Some tips…
• At least 90% of all the data structures used are simple arrays.
• The rest are mainly linked lists or binary trees.
• Very rarely is it worth using a hash table or a red-black tree.
• If you need to find something with equality only, then hash.
• If you need to find a range of something (e.g. where ‘x’ between ‘a’ and ‘b’, or where ‘y’ >= ‘z’), then use a tree or skip list.
Some tips…
• If you need to order a list by some kind of importance and modify the list on the fly, then a priority queue is wanted.
• If you need to solve problems involving some sort of web-like structure, then a graph is often wanted.
• If you want to access data by content rather than by id number, there’s nothing like a hash table.
• If you want to keep data sorted, despite arbitrary inserts and deletes, you are looking at a red-black tree.
Questions