Analysis of Algorithms, cont.


Foundations of Software Design
Fall 2002
Marti Hearst
Lecture 11: Analysis of Algorithms, cont.
Function Pecking Order
• In increasing order:

  log(n)   n      n^2        n^5         2^n
  1        2      4          32          4
  2        4      16         1024        16
  3        8      64         32768       256
  4        16     256        1048576     65536
  5        32     1024       33554432    4.29E+09
  6        64     4096       1.07E+09    1.84E+19
  7        128    16384      3.44E+10    3.4E+38
  8        256    65536      1.1E+12     1.16E+77
  9        512    262144     3.52E+13    1.3E+154
  10       1024   1048576    1.13E+15    overflow (#NUM! in Excel)

Where does n log n fit in?
Adapted from Goodrich & Tamassia
Plot them!
• Both x and y linear scales
  [Plot: log(n), n, n^2, n^5, and 2^n on linear axes, x = 1..10, y up to 1.6E+154.
  The jump for large n happens because the last number is out of range.]
• Convert y axis to log scale
  [Plot: the same five functions with the y axis on a log scale, 1 to 1E+156.]
• Notice how much bigger 2^n is than n^k
  This is why exponential growth is BAD BAD BAD!!
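Not on the slides: a few lines of Python make the table's point concrete. The helper name is made up for illustration; it finds where 2^n permanently overtakes n^k.

```python
# For small n a polynomial like n^5 beats 2^n, but past a crossover
# point the exponential wins forever after.

def crossover(poly_power: int) -> int:
    """Smallest n >= 2 where 2**n first exceeds n**poly_power."""
    n = 2
    while 2 ** n <= n ** poly_power:
        n += 1
    return n

print(crossover(2))  # 2^n passes n^2 at n = 5
print(crossover(5))  # n^5 holds out longer, but only until n = 23
```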
More Plots

  log n   n      n log n
  1       2      2
  2       4      8
  3       8      24
  4       16     64
  5       32     160
  6       64     384
  7       128    896
  8       256    2048
  9       512    4608
  10      1024   10240
  11      2048   22528
  12      4096   49152

[Plot: log n, n, and n log n on linear axes, x = 1..12, y up to 60000]
[Plot: the same three functions with the y axis on a log scale, 1 to 100000]
Let’s Count Some Beer
• A well-known “song”
– “100 bottles of beer on the wall, 100 bottles of beer; you
take one down, pass it around, 99 bottles of beer on the
wall.”
– “99 bottles of beer on the wall, 99 bottles of beer; you
take one down, pass it around, 98 bottles of beer on the
wall.”
– …
– “1 bottle of beer on the wall, 1 bottle of beer, you take it
down, pass it around, no bottles of beer on the wall.”
– HALT.
• Let’s change the song to “N bottles of beer on the wall”.
The number of bottles of beer passed around is Order
what?
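The count can be checked by simulating the song directly (a sketch, not from the slides; the function name is made up). It also gives away the answer: one bottle is passed per verse.

```python
def bottles_passed(n: int) -> int:
    """Sing the song for n bottles, counting every bottle passed around."""
    passed = 0
    bottles = n
    while bottles > 0:   # one verse per iteration
        bottles -= 1     # take one down
        passed += 1      # pass it around
    return passed        # n verses, n passes: linear in n

print(bottles_passed(100))  # 100
```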
Let’s Count Some Ants
• Another song:
– The ants go marching 1 by 1
– The ants go marching 2 by 2
– The ants go marching 3 by 3
• How many ants are in the lead in each wave of ants?
1+2+3+…+n
• Does this remind you of anything?
sum_{i=1}^{n} i = n(n+1)/2, which is O(n^2)
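The closed form is easy to spot-check in Python (a quick sanity check, not from the slides):

```python
# Verify 1 + 2 + ... + n == n(n+1)/2 for a sweep of n.
for n in range(1, 100):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2

print(sum(range(1, 11)))  # total ants after 10 waves: 55
```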
Graph it!
Let’s plot beer(n) versus ants(n)
[Plot: total # items vs. n for n = 1..12; series: Gifts, Ants, Beer Bottles; y axis 0 to 90]
Definition of Big-Oh
A running time is O(g(n)) if there exist constants n0 > 0 and c > 0 such that for all problem sizes n > n0, the running time for a problem of size n is at most c(g(n)). In other words, c(g(n)) is an upper bound on the running time for sufficiently large n.
[Figure: the running-time curve falling below c g(n) for all n past n0]
http://www.cs.dartmouth.edu/~farid/teaching/cs15/cs5/lectures/0519/0519.html
The Crossover Point
One function starts out faster for small values of n.
But for n > n0, the other function is always faster.
Adapted from http://www.cs.sunysb.edu/~algorith/lectures-good/node2.html
More formally
• Let f(n) and g(n) be functions mapping nonnegative
integers to real numbers.
• f(n) is O(g(n)) if there exist positive constants n0
and c such that for all n >= n0, f(n) <= c*g(n)
• Other ways to say this:
f(n) is order g(n)
f(n) is big-Oh of g(n)
f(n) is Oh of g(n)
f(n) ∈ O(g(n))  (set notation)
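The definition can be tested mechanically. A small Python sketch (the function name and constants are chosen here for illustration, not from the slides) checks f(n) <= c*g(n) over a range of n from n0 up:

```python
def witnesses_big_oh(f, g, c, n0, upto=10_000):
    """Spot-check the Big-Oh definition: f(n) <= c*g(n) for n0 <= n < upto."""
    return all(f(n) <= c * g(n) for n in range(n0, upto))

# f(n) = 3n + 10 is O(n): the witnesses c = 4, n0 = 10 work,
# since 3n + 10 <= 4n exactly when n >= 10.
print(witnesses_big_oh(lambda n: 3 * n + 10, lambda n: n, c=4, n0=10))  # True

# No constants make n^2 <= c*n for all large n, so this check fails.
print(witnesses_big_oh(lambda n: n * n, lambda n: n, c=100, n0=1))  # False
```

A passing check is evidence, not a proof; only the algebra (as in the binary search analysis below) settles it for all n.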
Comparing Running Times
Adapted from Goodrich & Tamassia
Analysis Example:
Phonebook
• Given:
– A physical phone book
• Organized in alphabetical order
– A name you want to look up
– An algorithm in which you search through the book
sequentially, from first page to last
– What is the order of:
• The best case running time?
• The worst case running time?
• The average case running time?
– What is:
• A better algorithm?
• The worst case running time for this algorithm?
Analysis Example (Phonebook)
• This better algorithm is called Binary Search
• What is its running time?
– First you look in the middle of n elements
– Then you look in the middle of n/2 = ½*n elements
– Then you look in the middle of ½ * ½ * n elements
– …
– Continue until there is only 1 element left
– Say you did this m times: ½ * ½ * ½ * … * n
– Then the number of repetitions is the smallest integer m such that (1/2^m) * n <= 1
Analyzing Binary Search
– In the worst case, the number of repetitions is the smallest integer m such that (1/2^m) * n <= 1
– We can rewrite this as follows:
(1/2^m) * n <= 1
n <= 2^m        (multiply both sides by 2^m)
log n <= m      (take the log of both sides)
Since m is the worst case time, the algorithm is O(log n)
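The phonebook lookup the slides describe is standard binary search; a sketch in Python:

```python
def binary_search(names, target):
    """Return the index of target in the sorted list names, or -1 if absent.
    Each iteration halves the remaining range, so the worst case is O(log n)."""
    lo, hi = 0, len(names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # look in the middle
        if names[mid] == target:
            return mid
        elif names[mid] < target:
            lo = mid + 1           # discard the left half
        else:
            hi = mid - 1           # discard the right half
    return -1

book = ["Adams", "Baker", "Chen", "Diaz", "Evans", "Ford", "Gupta", "Hearst"]
print(binary_search(book, "Evans"))  # 4
print(binary_search(book, "Zork"))   # -1
```

With 8 names the loop runs at most 4 times, versus 8 page-by-page steps for the sequential scan.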
Analysis Example
“prefix averages”
You want this mapping from array of numbers to an
array of averages of the preceding numbers (who
knows why – not my example):
Input:   5    10    15    20    25    30
Output:  5/1  15/2  30/3  50/4  75/5  105/6
There are two straightforward algorithms:
One is easy but wasteful.
The other is more efficient, but requires
insight into the problem.
Adapted from Goodrich & Tamassia
Analysis Example
Adapted from Goodrich & Tamassia
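The code shown on this slide did not survive the transcript. A sketch of the easy-but-wasteful quadratic version (along the lines of Goodrich & Tamassia's prefixAverages1; the Python rendering here is an assumption):

```python
def prefix_averages_slow(A):
    """For each position i, re-sum A[0..i] from scratch: 1+2+...+n steps, O(n^2)."""
    result = []
    for i in range(len(A)):
        s = 0
        for j in range(i + 1):       # walk the whole prefix every time
            s += A[j]
        result.append(s / (i + 1))
    return result

print(prefix_averages_slow([5, 10, 15, 20, 25, 30]))
# [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]
```

This matches the mapping on the previous slide: 5/1, 15/2, 30/3, 50/4, 75/5, 105/6.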
Analysis Example
• For each position i in A, you look at the values
for all the elements that came before
– What is the number of positions in the largest part?
– When i=n, you look at n positions
– When i=n-1, you look at n-1 positions
– When i=n-2, you look at n-2 positions
– …
– When i=2, you look at 2 positions
– When i=1, you look at 1 position
• This should look familiar …
Analysis Example
A useful tool: store partial information in a variable!
Uses space to save time. The key – don’t divide s.
Eliminates one for loop – always a good thing to do.
Adapted from Goodrich & Tamassia
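The improved algorithm on this slide also survives only as commentary. A sketch of the running-sum version (along the lines of Goodrich & Tamassia's prefixAverages2; again, the Python rendering is an assumption):

```python
def prefix_averages_fast(A):
    """Keep the partial sum in the variable s instead of recomputing it: O(n)."""
    result = []
    s = 0
    for i in range(len(A)):
        s += A[i]                    # extend the running sum; never re-add the prefix
        result.append(s / (i + 1))   # divide for the output, leaving s itself intact
    return result

print(prefix_averages_fast([5, 10, 15, 20, 25, 30]))
# [5.0, 7.5, 10.0, 12.5, 15.0, 17.5]
```

Same output as the quadratic version, with the inner loop eliminated.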
Summary: Analysis of Algorithms
• A method for determining, in an abstract way,
the asymptotic running time of an algorithm
– Here asymptotic means as n gets very large
• Useful for comparing algorithms
• Useful also for determining tractability
– Meaning, a way to determine if the problem is
intractable (impossible) or not
– Exponential time algorithms are usually intractable.
• We’ll revisit these ideas throughout the rest of
the course.
Next Time
• Stacks and Queues