Great Theoretical Ideas In Computer Science
S. Rudich
V. Adamchik
CS 15-251
Lecture 17
March 21, 2006
Spring 2006
Carnegie Mellon University
On Time Versus Input Size
[Plot: time versus # of bits]
Welcome to the 9th week
1. Time complexity of algorithms
2. Space complexity of songs
How to add 2 n-bit numbers.
[Slide animation: the carry bits appear one column at a time, right to left, as each column is added]
  * * * * * * * * * * *     (carries)
    * * * * * * * * * * *   (first number)
+   * * * * * * * * * * *   (second number)
  * * * * * * * * * * * *   (sum)
“Grade school addition”
Time complexity of
grade school addition
  **********
+ **********
 ***********
T(n) = the amount of time grade school addition uses to add two n-bit numbers.
What do you mean by “time”?
Our Goal
We want to define the “time” taken by the method of grade school addition without depending on implementation details.
Hold on!
But you agree that T(n) does
depend on the implementation!
A given algorithm will take
different amounts of time on the
same inputs depending on such
factors as:
Processor speed
Instruction set
Disk speed
Brand of compiler
Our Goal
Therefore, we can only speak of the time
taken by any particular implementation, as
opposed to the time taken by the method in
the abstract.
These objections are serious, but they are not insurmountable.
There is a very nice sense in which
we can analyze grade school addition
without having to worry about
implementation details.
Here is how it works . . .
Grade school addition
On any reasonable computer M, adding 3 bits and writing down 2 bits (one sum bit and one carry bit) can be done in constant time c.
Total time to add two n-bit numbers using grade school addition: c*n
[c time for each of n columns]
Grade school addition
On another computer M’, the time to perform the same 3-bit addition step may be some other constant c’.
Total time to add two n-bit numbers using grade school addition: c’*n
[c’ time for each of n columns]
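To make the column-by-column bookkeeping concrete, here is a minimal Java sketch of the method (the bit-array representation, least significant bit first, is our own illustrative choice, not the lecture’s notation):

// Grade school (ripple-carry) addition of two n-bit numbers,
// given as int arrays of 0/1 digits, least significant bit first.
static int[] add(int[] a, int[] b)
{
    int n = a.length;             // assume a.length == b.length == n
    int[] sum = new int[n + 1];   // the sum may need n + 1 bits
    int carry = 0;
    for (int i = 0; i < n; i++)   // one constant-time step per column
    {
        int s = a[i] + b[i] + carry;
        sum[i] = s % 2;           // write down the sum bit
        carry  = s / 2;           // carry into the next column
    }
    sum[n] = carry;
    return sum;                   // total: c*n time for some constant c
}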
[Plot: time versus # of bits in the numbers. Different machines result in different slopes, but time grows linearly as input size increases.]
Thus we arrive at an
implementation independent
insight:
Grade School Addition is a linear
time algorithm.
Abstraction:
Abstract away the inessential
features of a problem or solution
=
We can define away the details of
the world that we do not wish to
currently study, in order to recognize
the similarities between seemingly
different things…
The process of abstracting
away details and determining
the rate of resource usage
in terms of the problem size
is one of the fundamental
ideas in computer science.
You can measure “time” as the number of elementary “steps”, defined in any way you like, provided each such “step” takes constant time in a reasonable implementation.
Constant: independent of the length n of the input.
How to multiply 2 n-bit numbers.
      ********
X     ********
[n shifted partial products of n bits each, n^2 stars in all]
 ****************
The total time is bounded by c*n^2 (abstracting away the implementation details).
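A matching Java sketch of the quadratic method, using the same illustrative bit-array representation as the addition sketch above:

// Grade school multiplication: n partial products, each of n bits,
// giving n*n = n^2 elementary operations, plus a linear carry pass.
static int[] multiply(int[] a, int[] b)
{
    int n = a.length;
    int[] prod = new int[2 * n];         // the product fits in 2n bits
    for (int i = 0; i < n; i++)          // one partial product per bit of b
        for (int j = 0; j < n; j++)
            prod[i + j] += a[j] * b[i];
    for (int k = 0; k < 2 * n - 1; k++)  // propagate carries to 0/1 digits
    {
        prod[k + 1] += prod[k] / 2;
        prod[k] %= 2;
    }
    return prod;
}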
Grade School Addition: Linear time
Grade School Multiplication: Quadratic time
[Plot: time versus # of bits in the numbers. No matter how dramatic the difference in the constants, the quadratic curve will eventually dominate the linear curve.]
Time vs. Input Size
For any algorithm, define
n – the input size
and
T(n) - the amount of time used
on inputs of size n
We will ask:
What is the growth rate of T(n) ?
Useful notation to discuss growth rates
For any monotonic functions f, g from the positive
integers to the positive integers, we say
f(n) = O( g(n) )
if
g(n) eventually dominates f(n)
[Formally: there exists a constant c such that for all
sufficiently large n: f(n) ≤ c*g(n) ]
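A quick worked instance of the definition (the numbers are our own illustration): 3n + 5 = O(n), because 3n + 5 ≤ 4n for all n ≥ 5, so the constant c = 4 witnesses the definition.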
More useful notation: Ω
For any monotonic functions f, g from the positive
integers to the positive integers, we say
f(n) = Ω( g(n) )
if:
f(n) eventually dominates g(n)
[Formally: there exists a constant c such that for all
sufficiently large n: f(n) ≥ c*g(n) ]
Yet more useful notation: Θ
For any monotonic functions f, g from the positive
integers to the positive integers, we say
f(n) = Θ( g(n) )
if:
f(n) = O( g(n) ) and f(n) = Ω( g(n) )
f = Θ(n) means that f can be
sandwiched between two lines
from some point on.
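Continuing the same illustration: 3n + 5 = Θ(n), since n ≤ 3n + 5 ≤ 4n for all n ≥ 5, so 3n + 5 is sandwiched between the lines n and 4n from that point on.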
[Plot: time versus # of bits in numbers, showing f sandwiched between two lines]
Quickies
• n = O(n^2) ?
• n = O(√n) ?
• n^2 = Ω(n log n) ?
• 3n^2 + 4n + c = Ω(n^2) ? (c a constant)
• 3n^2 + 4n + c = Θ(n^2) ?
• n^2 log n = Θ(n^2) ?
Two statements in one!
n^2 log n = Θ(n^2)
n^2 log n = O(n^2)  NO
n^2 log n = Ω(n^2)  YES
Names For Some Growth Rates
Linear Time: T(n) = O(n)
Quadratic Time: T(n) = O(n^2)
Cubic Time: T(n) = O(n^3)
Polynomial Time: for some constant k, T(n) = O(n^k).
Example: T(n) = 13n^5
Large Growth Rates
Exponential Time: for some constant k, T(n) = O(k^n)
Example: T(n) = n*2^n = O(3^n)
Almost Exponential Time: for some constant k, T(n) = 2^(n^(1/k)), i.e., 2 raised to the kth root of n
Example: T(n) = 2^√n
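To see why n*2^n = O(3^n) (a worked step of our own): n*2^n / 3^n = n / 1.5^n, which tends to 0 as n grows, so n*2^n ≤ 3^n for all sufficiently large n.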
Small Growth Rates
Logarithmic Time: T(n) = O(log n)
Example: T(n) = 15*log2(n)
Polylogarithmic Time: for some constant k, T(n) = O(log^k(n))
Note: these kinds of algorithms can’t possibly read all of their inputs.
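As a concrete logarithmic-time example (our addition, not from the slides), binary search probes only O(log n) of its n inputs:

// Binary search in a sorted array: each probe halves the range,
// so at most about log2(n) elements are ever examined.
static int binarySearch(int[] a, int target)
{
    int lo = 0, hi = a.length - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == target) return mid;
        if (a[mid] < target)  lo = mid + 1;
        else                  hi = mid - 1;
    }
    return -1;  // not found
}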
Complexity of Songs
Suppose we want to sing a song of length n.
Since n can be large, we want to memorize
songs which require only a small amount of
brain space.
Let S(n) be the space complexity of a song.
Complexity of Songs
The amount of space S(n) can be measured in either characters or words.
What is the asymptotic relation between the # of characters and the # of words?
Complexity of Songs
S(n) is the space complexity of a song.
What are the upper and lower bounds for S(n)?
Complexity of Songs with a refrain
Let S be a song with m verses of length V and a refrain of length R. The refrain is sung first, last, and between verses.
What is the space complexity of S?
100 Bottles of Beer
n bottles of beer on the wall,
n bottles of beer.
You take one down and pass it
around.
n-1 bottles of beer on the wall.
What is the space complexity of
S?
The k Days of Christmas
On the k-th day of Christmas, my true love sent to me
gift_1, …, gift_k
What is the space complexity of this song?
Time complexities
• The worst-case time complexity
• The best-case time complexity
• The average time complexity
• The amortized time complexity
• The randomized time complexity
The input size
In sorting and searching (array) algorithms, the input size is the number of items.
In graph algorithms, the input size is given by |V| + |E|.
In number algorithms, the input size is the number of bits.
Primality
// Trial division: test every k from 2 up to sqrt(n).
boolean prime(int n)
{
    for (int k = 2; k <= Math.sqrt(n); k++)
        if (n % k == 0) return false;
    return true;   // assumes n >= 2
}
What is the complexity of this algorithm?
Primality
The number of passes through the loop is √n.
Therefore, T(n) = O(√n).
But what is n?
The input size
For a given algorithm, the input
size is defined as the number of
characters it takes to write (or
encode) the input.
Primality
The number of passes through the loop is √n.
If we use base 10 encoding, it takes
input_size = O( log10 n )
characters (digits) to encode n.
Therefore, since n ≈ 10^input_size,
T(n) = √n = 10^(input_size/2)
The time complexity is not polynomial!
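To see the scale (our own arithmetic): for a 40-digit number, √n ≈ 10^20, so the loop would need about 10^20 passes. The bound O(√n) sounds tame, but it is exponential in the number of digits.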
Fibonacci Numbers
public static int fib( int n )
{
    // Bottom-up: compute f[0..n] with one addition per entry.
    int[] f = new int[n + 2];   // n + 2 so that f[1] exists even when n = 0
    f[0] = 0; f[1] = 1;
    for (int k = 2; k <= n; k++)
        f[k] = f[k-1] + f[k-2];
    return f[n];
}
What is the complexity of this algorithm?
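A sketch of the comparison with the primality example (our note, not from the slides): the loop makes O(n) passes, but writing n down takes only about log2(n) bits, so in terms of the input size s the running time is O(2^s), exponential again. (Strictly speaking the additions are not constant-time either, since fib(n) itself has about 0.69n bits and quickly overflows int.)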
0-1 Knapsack problem
A thief breaks into a jewelry store carrying a knapsack. Each item in the store has a value and a weight. The knapsack will break if the total weight exceeds W. The thief’s dilemma is to maximize the total value.
What is the time complexity using the dynamic programming approach?
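For reference, a minimal Java sketch of the usual dynamic-programming table for 0-1 knapsack (an illustration with our own variable names; it assumes integer weights):

// best[w] = maximum value achievable with total weight <= w,
// using only the items considered so far.
static int knapsack(int[] value, int[] weight, int W)
{
    int[] best = new int[W + 1];
    for (int i = 0; i < value.length; i++)     // n items
        for (int w = W; w >= weight[i]; w--)   // W + 1 weight classes
            best[w] = Math.max(best[w], best[w - weight[i]] + value[i]);
    return best[W];
}

The table has on the order of n*W entries, but W takes only log2(W) bits to write down, so this is pseudo-polynomial, the same phenomenon as in the primality example.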
Some Big Ones
Doubly Exponential Time means that for some constant k,
T(n) = 2^(2^(kn))
Triply Exponential:
T(n) = 2^(2^(2^(kn)))
And so forth.
Faster and Faster: 2STACK
2STACK(0) = 1
2STACK(n) = 2^(2STACK(n-1))
2STACK(1) = 2
2STACK(2) = 4
2STACK(3) = 16
2STACK(4) = 65536
2STACK(5) ≥ 10^80 = # of atoms in the universe
2STACK(n) = 2^2^...^2, a “tower of n 2’s”
And the inverse of 2STACK: log*
log*(n) = # of times you have to apply the log function to n to make it ≤ 1
log*(1) = 0
log*(2) = 1
log*(4) = 2
log*(16) = 3
log*(65536) = 4
log*(# of atoms in the universe) = 5
So an algorithm that can be shown to run in O(n log* n) time is linear time for all practical purposes!!
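A tiny Java sketch of log* (our own illustration of the definition above):

// log*(n): how many times must log2 be applied before the value is <= 1?
static int logStar(double n)
{
    int count = 0;
    while (n > 1)
    {
        n = Math.log(n) / Math.log(2);  // one application of log base 2
        count++;
    }
    return count;   // e.g., logStar(65536) == 4
}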
Ackermann’s Function
A(0, n) = n + 1                  for n ≥ 0
A(m, 0) = A(m - 1, 1)            for m ≥ 1
A(m, n) = A(m - 1, A(m, n - 1))  for m, n ≥ 1
A(4,2) > # of particles in the universe
A(5,2) can’t be written out in this universe
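The definition translates directly into Java (a sketch; it is safe to run only for very small m and n, since both the recursion depth and the values explode):

// Direct recursive translation of the definition of A(m, n).
static long ackermann(long m, long n)
{
    if (m == 0) return n + 1;
    if (n == 0) return ackermann(m - 1, 1);
    return ackermann(m - 1, ackermann(m, n - 1));
}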
Inverse Ackermann function
A(0, n) = n + 1                  for n ≥ 0
A(m, 0) = A(m - 1, 1)            for m ≥ 1
A(m, n) = A(m - 1, A(m, n - 1))  for m, n ≥ 1
Define: A’(k) = A(k, k)
The inverse Ackermann function α(n) is the inverse of A’.
Practically speaking: n × α(n) ≤ 4n
The inverse Ackermann function, in fact Θ(n α(n)), arises in the seminal paper:
D. D. Sleator and R. E. Tarjan. A data structure for dynamic trees. Journal of Computer and System Sciences, 26(3):362-391, 1983.
Time complexity of grade school addition
  ***********
+ ***********
 ************
T(n) = the amount of time grade school addition uses to add two n-bit numbers.
We saw that T(n) was linear: T(n) = Θ(n).
Time complexity of grade school multiplication
[Diagram: n shifted partial products of n bits each, n^2 stars in all, summed into a 2n-bit product]
T(n) = the amount of time grade school multiplication uses to multiply two n-bit numbers.
We saw that T(n) was quadratic: T(n) = Θ(n^2).
Next Class
Will Bonzo be able to add two numbers
faster than Θ(n)?
Can Odette ever multiply two numbers
faster than Θ(n^2)?
Tune in on Thursday, same time, same place…