
Lecture 2: Math Review and
Asymptotic Analysis
Common Math Functions
• Floors and Ceilings:
x − 1 < ⌊x⌋ ≤ x ≤ ⌈x⌉ < x + 1.
• Modular Arithmetic:
a mod n = a − ⌊a/n⌋ · n.
• Factorials:
n! = 1 if n = 0
n! = n · (n−1)! if n > 0.
n! ≤ nⁿ
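These identities are easy to sanity-check numerically. A minimal Python sketch (the test values are arbitrary and not part of the original slides):

  import math

  x = 3.7
  assert x - 1 < math.floor(x) <= x <= math.ceil(x) < x + 1

  a, n = 17, 5
  assert a % n == a - math.floor(a / n) * n        # a mod n = a - floor(a/n)*n

  assert math.factorial(0) == 1
  for k in range(2, 10):
      assert math.factorial(k) == k * math.factorial(k - 1)
      assert math.factorial(k) <= k ** k           # n! <= n^n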
Exponentials
• a⁰ = 1
• a¹ = a
• a⁻¹ = 1/a
• (aᵐ)ⁿ = aᵐⁿ
• aᵐ · aⁿ = aᵐ⁺ⁿ
• 0⁰ = 1 (when convenient)
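A quick Python spot check of these rules with arbitrary sample values (illustrative only):

  a, m, n = 2.0, 3, 4
  assert a ** 0 == 1
  assert a ** 1 == a
  assert a ** -1 == 1 / a
  assert (a ** m) ** n == a ** (m * n)
  assert a ** m * a ** n == a ** (m + n)
  assert 0 ** 0 == 1        # Python also adopts the "convenient" convention 0^0 = 1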
Logarithm Rules
• lg(n) = log₂(n) (binary logarithm)
• ln(n) = logₑ(n) (natural logarithm)
• lgᵏ(n) = (lg(n))ᵏ (exponentiation)
• lg lg(n) = lg(lg(n)) (composition)
• log_b(m·n) = log_b(m) + log_b(n)
• log_b(m/n) = log_b(m) − log_b(n)
• log_b(mⁿ) = n · log_b(m)
• log_b(x) = log_d(x) / log_d(b) (change of base rule)
• ln(x) = log_d(x) / log_d(e)
• log(1) = 0, lg(1) = 0
• log(10) = 1, lg(2) = 1
• log(100) = 2, lg(4) = 2
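These rules can be confirmed numerically. A minimal Python sketch (sample values are arbitrary; math.isclose guards against floating-point rounding):

  import math

  m, n, b, d, x = 8.0, 32.0, 2.0, 10.0, 100.0
  assert math.isclose(math.log(m * n, b), math.log(m, b) + math.log(n, b))
  assert math.isclose(math.log(m / n, b), math.log(m, b) - math.log(n, b))
  assert math.isclose(math.log(m ** 3, b), 3 * math.log(m, b))
  assert math.isclose(math.log(x, b), math.log(x, d) / math.log(b, d))   # change of base
  assert math.log2(1) == 0 and math.log2(2) == 1 and math.log2(4) == 2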
Summation Formulas
• Arithmetic series:
Σ_{k=1..n} k = 1 + 2 + ... + n = n(n+1)/2
• Geometric series:
Σ_{k=0..n} xᵏ = 1 + x + x² + ... + xⁿ = (xⁿ⁺¹ − 1)/(x − 1), for x ≠ 1
– Special case if |x| < 1:
Σ_{k=0..∞} xᵏ = 1/(1 − x)
• Harmonic series:
Σ_{k=1..n} 1/k = 1 + 1/2 + 1/3 + ... + 1/n ≈ ln n
• Other:
Σ_{k=1..n} lg k ≈ n lg n
Σ_{k=1..n} kᵖ = 1ᵖ + 2ᵖ + ... + nᵖ ≈ nᵖ⁺¹/(p+1)
Fibonacci Numbers
• F₀ = 0
• F₁ = 1
• Fᵢ = Fᵢ₋₁ + Fᵢ₋₂ for i ≥ 2
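For reference, a direct iterative Python version of this definition:

  def fib(i):
      """Return F_i, where F_0 = 0, F_1 = 1, and F_i = F_{i-1} + F_{i-2}."""
      a, b = 0, 1              # a = F_0, b = F_1
      for _ in range(i):
          a, b = b, a + b      # slide the pair one index forward
      return a

  assert [fib(i) for i in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]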
Proofs
• A proof is a logical argument showing that, assuming certain axioms, some statement is necessarily true.
• A few common proof techniques:
– Direct Proof
– Proof by Induction
– Proof by Contradiction
– Proof by Contraposition
– Proof by Exhaustion
– Proof by Counterexample
Direct Proof
• A conclusion is established by logically
combining earlier definitions, axioms and
theorems
Direct Proof: Example
• Pythagorean Theorem: For a right triangle with legs a and b and hypotenuse c, c² = a² + b².
[Figure: right triangle ABC with the right angle at C, legs a = BC and b = AC, and hypotenuse c = AB; the altitude from C meets AB at X, with x = BX.]
• Proof (courtesy Legendre):
1. ABC, CBX, and ACX are similar triangles.
2. Corresponding parts of similar triangles are proportional:
a/x = c/a → a² = cx
b/(c−x) = c/b → b² = c² − cx → c² = cx + b²
3. Substituting a² for cx, we find c² = a² + b².
Proof by Induction
• A base case is proven, and an induction rule is used to prove a series (possibly infinite) of other cases.
Proof by Induction: Example
• Theorem: 1 + 2 + … + n = n(n+1) / 2
• Proof (courtesy Gauss):
1. Base Case: If n = 0, then 0 = 0(0+1) / 2.
2. Inductive Hypothesis: Assume the statement is true for n = m, i.e., 1 + 2 + … + m = m(m+1) / 2.
3. Inductive Step: Show that the statement holds for n = m + 1:
1 + 2 + … + m + (m+1) = (m+1)(m+2) / 2
Since we know the theorem holds for n = m, subtract 1 + 2 + … + m from the left side and m(m+1)/2 from the right side:
m + 1 = (m² + 3m + 2) / 2 − (m² + m) / 2
m + 1 = (2m + 2) / 2 = m + 1 ✓
Proof by Contradiction
• One shows that if a statement were false,
a logical contradiction occurs, and thus the
statement must be true.
Proof by Contradiction: Example
• Theorem: There exists an infinite number of prime
numbers.
• Proof (courtesy of Euclid):
1. Assume that there are a finite number of primes.
2. Then there is a largest prime, p. Consider the number q = (2 × 3 × 5 × 7 × ... × p) + 1. q is one more than the product of all primes up to p, so q > p. Also, q is not divisible by any prime up to p. For example, it is not divisible by 7, as it is one more than a multiple of 7.
3. If q is itself prime, then q is a prime larger than p. If q is not prime, then since it is not divisible by any prime ≤ p, it must be divisible by some prime > p.
4. Either way we have found a prime larger than p, which contradicts the assumption that p is the largest prime. So there is no largest prime; in other words, there are infinitely many primes.
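The construction of q in step 2 can be illustrated in Python (a sketch; pretending that a small finite list were "all" the primes):

  # Pretend 2, 3, 5, 7, 11, 13 were all the primes. Euclid's q is then:
  primes = [2, 3, 5, 7, 11, 13]
  q = 1
  for p in primes:
      q *= p
  q += 1                                     # q = 2*3*5*7*11*13 + 1 = 30031

  # q leaves remainder 1 when divided by every prime in the list ...
  assert all(q % p == 1 for p in primes)
  # ... so every prime factor of q (here 59 and 509) is larger than 13.
  assert q == 59 * 509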
Proof by Contraposition
• In order to prove A→B, prove ¬B → ¬A.
Proof by Contraposition
• Theorem: For all natural numbers n: if n² is even, then n is even as well.
• Proof:
1. Contrapositive: If n is odd, then n² is odd.
2. Let n = 2q + 1, where q is a natural number. Then n² = (2q+1)² = 4q² + 4q + 1 = 2(2q² + 2q) + 1.
3. Let p = 2q² + 2q; then n² = 2p + 1. Since 2p must be even, n² must be odd, proving the contrapositive.
Proof by Exhaustion
• The conclusion is established by dividing
the problem into a finite number of
exhaustive cases and proving each one
separately.
Proof by Exhaustion: Example
• Theorem: Every cube number is either a multiple of 9 or is 1 more or 1 less than a multiple of 9.
• Proof:
Each cube number is the cube of an integer n. n is either a multiple of 3, or is 1 more or 1 less than a multiple of 3. The following 3 cases are exhaustive:
1. Case 1: If n is a multiple of 3, then the cube of n is a multiple of 27, and so certainly a multiple of 9.
2. Case 2: If n is 1 more than a multiple of 3, then the cube of n is 1 more than a multiple of 9, since (3k+1)³ = 9(3k³ + 3k² + k) + 1.
3. Case 3: If n is 1 less than a multiple of 3, then the cube of n is 1 less than a multiple of 9, since (3k−1)³ = 9(3k³ − 3k² + k) − 1.
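Fittingly, the theorem can also be checked exhaustively by machine over a finite range; a small Python sketch (the range is arbitrary):

  # Every cube is congruent to 0, 1, or 8 (i.e., -1) modulo 9.
  for n in range(-1000, 1001):
      assert (n ** 3) % 9 in (0, 1, 8)

  # The case split mirrors the proof: the cube's residue is fixed by n mod 3.
  for n in range(-1000, 1001):
      if n % 3 == 0:
          assert (n ** 3) % 9 == 0
      elif n % 3 == 1:                  # n is 1 more than a multiple of 3
          assert (n ** 3) % 9 == 1
      else:                             # n is 1 less than a multiple of 3
          assert (n ** 3) % 9 == 8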
Proof by Counterexample
• Given an assertion of the form "For all x, P(x) is true," disprove it by showing that there is a c such that ¬P(c) is true.
Proof by Counterexample: Example
• Conjecture: The square root of every
integer is irrational.
• Counterexample: Consider f(n) = √n and let n = 4. Clearly √4 = 2, and 2 is rational.
Loop Invariants
Another technique for proving the correctness of loops is to show that these properties hold for a loop invariant (we saw an example last time):
• Initialization: It is true prior to the first iteration.
• Maintenance: It is true before an iteration of the
loop and remains true before the next iteration.
• Termination: When the loop terminates, the
invariant yields a useful property which helps
show the algorithm is correct.
Loop Invariants: Example
• Show that the following pseudocode
correctly determines if x is in the array A:
for i=0 to len(A):
if (x == A[i]) return true
return false
• Loop Invariant: At the start of each iteration of the for loop, the subsequence A[0]..A[i-1] does not contain the value x.
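A runnable Python translation of the pseudocode (my rendering, not part of the slides), with the loop invariant written as an assertion so it can be checked at the start of each iteration:

  def contains(A, x):
      """Return True if x occurs in the list A, using linear search."""
      for i in range(len(A)):              # i = 0, 1, ..., len(A) - 1
          # Loop invariant: x is not in A[0..i-1].
          assert x not in A[:i]
          if x == A[i]:
              return True
      # The loop ran to completion, so x is not in A[0..len(A)-1].
      return False

  assert contains([4, 8, 15, 16], 15) is True
  assert contains([4, 8, 15, 16], 23) is False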
Loop Invariants: Example
• Initialization: The loop invariant is true at initialization
because the subsequence A[0]..A[-1] is an empty
subsequence and by definition x is not contained in the
subsequence.
• Maintenance: When we consider element A[k-1] (i.e., i = k-1), we return true if it equals x; otherwise we continue. Therefore, if we go on to consider the kth element, A[k-1] was not equal to x, and by the invariant neither was any earlier element, so the invariant holds for i = k.
• Termination: The loop terminates when i=len(A) or when
x is found in the array. From the maintenance property
we know that if i=len(A), then A[0]..A[len(A)-1] does not
contain x. If i != len(A) then the loop terminated because
x was found.
Danger: Proof by Example
• Conjecture: For arbitrary sets A and B, the sets A \ B, B \ A, and A ∩ B are pairwise disjoint.
• Proof:
Let A = {1,3,5,7} and B = {2,4,6,8}.
Then ...
NO! In general, an example is almost never a proof. (*Not to be confused with proof by construction, which you are unlikely to use in this class.)
Danger: Proof by Example
• Conjecture: The Fibonacci sequence, F(x),
is always odd.
• Proof: Consider x = 2; F(2) = 1. Or consider x = 10; F(10) = 55, etc. Clearly the Fibonacci sequence is always odd.
NO! Even if the conjecture were true, this is not a correct proof. (In fact it is false: F(0) = 0 and F(3) = 2 are even.)
Growth of Functions Review
• 1 (Constant)
• log n (Logarithmic)
• n (Linear)
• n log n
• n² (Quadratic)
• n³ (Cubic)
• 2ⁿ (Exponential)
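To get a feel for these growth rates, a small Python table (sample sizes are arbitrary):

  import math

  print(f"{'n':>6} {'lg n':>8} {'n lg n':>10} {'n^2':>10} {'n^3':>12} {'2^n':>12}")
  for n in (10, 20, 40, 80):
      print(f"{n:>6} {math.log2(n):>8.1f} {n * math.log2(n):>10.1f} "
            f"{n ** 2:>10} {n ** 3:>12} {2 ** n:>12.2e}")
  # By n = 80, 2^n is about 1.2e24 while n^3 is only 512000:
  # the exponential dwarfs every polynomial.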
Asymptotic Notation
How do we describe asymptotic efficiency?
• O(g(n)): g(n) is an asymptotic upper bound
• o(g(n)): g(n) is an upper bound that is not
asymptotically tight
• Ω(g(n)): g(n) is an asymptotic lower bound
• ω(g(n)): g(n) is a lower bound that is not
asymptotically tight
• Θ(g(n)): g(n) is an asymptotically tight
bound
O-Notation
• O(g(n)) is pronounced “Big-oh of g of n” or
sometimes just “Oh of g of n”
• We use this notation when we can only
find an upper bound on a function.
• O(g(n)) = { f(n): there exist positive constants c and n₀ such that:
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.
O-Notation Example
• Prove: ½n² − 3n = O(n²)
• Proof:
1. We must choose c and n₀ s.t.:
½n² − 3n ≤ cn² for all n ≥ n₀.
2. Dividing by n² yields:
½ − 3/n ≤ c
3. This holds if we choose c ≥ ½ and n₀ ≥ 7 (for n ≥ 7, ½n² − 3n is also nonnegative, as the definition requires).
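A numerical spot check of one concrete choice of constants (c = ½, n₀ = 7). Checking sample points is not a proof, just a quick sanity check:

  # Verify 0 <= 0.5*n**2 - 3*n <= c*n**2 for c = 0.5 at many sample points n >= 7.
  c, n0 = 0.5, 7
  for n in range(n0, 10_000):
      f = 0.5 * n ** 2 - 3 * n
      assert 0 <= f <= c * n ** 2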
o-Notation
• o(g(n)) is pronounced “Little-oh of g of n”
• O(g(n)) may or may not be tight. We use
o(g(n)) to specify that it is not
asymptotically tight.
• o(g(n)) = { f(n): for any constant c > 0 there exists a constant n₀ > 0 such that:
0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.
o-Notation Example
• For example, 2n = o(n²) because n² is not an asymptotically tight upper bound for 2n. It is, however, an upper bound.
• But, 2n² ≠ o(n²) because n² would be an asymptotically tight upper bound.
o-Notation Example
• Prove: 2n = o(n²)
• Proof:
1. Given any constant c > 0, let n₀ = 3/c.
2. For all n ≥ n₀ we have cn ≥ cn₀ = 3 > 2, so cn² > 2n.
3. (Checking at n = n₀ itself: 2(3/c) = 6/c < 9/c = c(3/c)².)
4. So for every c > 0 a suitable n₀ exists, and the o-notation definition holds.
Ω-Notation
• Ω(g(n)) is pronounced “Big-omega of g of
n” or sometimes just “Omega of g of n”
• We use this notation when we can only
find a lower bound on a function.
• Ω(g(n)) = { f(n): there exist positive constants c and n₀ such that:
0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.
Ω-Notation Example
• Prove: ½n² − 3n = Ω(n²)
• Proof:
1. We must choose c and n₀ s.t.:
cn² ≤ ½n² − 3n for all n ≥ n₀.
2. Dividing by n² yields:
c ≤ ½ − 3/n
3. This holds if we choose any positive c ≤ 1/14 and n₀ ≥ 7 (e.g., c = 1/14, n₀ = 7, since ½ − 3/n ≥ 1/14 for all n ≥ 7).
ω-Notation
• ω(g(n)) is pronounced “Little-omega of g of
n”
• Ω(g(n)) may or may not be tight. We use
ω(g(n)) to specify that it is not
asymptotically tight.
• ω(g(n)) = { f(n): for any constant c > 0 there exists a constant n₀ > 0 such that:
0 ≤ c·g(n) < f(n) for all n ≥ n₀ }.
ω-Notation Example
• For example, n²/2 = ω(n) because n is not an asymptotically tight lower bound for n²/2. It is, however, a lower bound.
• But, n²/2 ≠ ω(n²) because n² would be an asymptotically tight lower bound.
ω-Notation Example
• Prove: n²/2 = ω(n)
• Proof:
1. Given any constant c > 0, let n₀ = 3c.
2. For all n ≥ n₀ we have n/2 ≥ 3c/2 > c, so n²/2 = n · (n/2) > cn.
3. (Checking at n = n₀ itself: c(3c) = 3c² < 9c²/2 = (3c)²/2.)
4. So for every c > 0 a suitable n₀ exists, and the ω-notation definition holds.
Θ-Notation
• Θ(g(n)) is pronounced “Theta of g of n”
• We use this notation when g(n) provides
an upper and lower bound for a function.
That is, it is asymptotically tight. This is a
stronger statement.
• Θ(g(n)) = { f(n): there exist positive constants c₁, c₂, and n₀ such that:
0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }.
Θ-Notation
• You may notice that f(n) = Θ(g(n)) implies f(n) = O(g(n)) and f(n) = Ω(g(n)). Neither bound alone implies Θ, although O and Ω together do.
• f(n) = Θ(g(n)) sandwiches f(n) between an upper and a lower bound.
Θ-Notation Example
• Prove: ½n² − 3n = Θ(n²)
• Proof:
1. We must choose c₁, c₂, and n₀ s.t.:
c₁n² ≤ ½n² − 3n ≤ c₂n² for all n ≥ n₀.
2. Dividing by n² yields:
c₁ ≤ ½ − 3/n ≤ c₂
3. This holds if we choose positive c₁ ≤ 1/14, c₂ ≥ ½, and n₀ ≥ 7 (e.g., c₁ = 1/14, c₂ = ½, n₀ = 7).
Θ-Notation Example
• Prove: ½n² − 3n = Θ(n²)
• Alternate Proof:
1. We already proved ½n² − 3n = O(n²) and ½n² − 3n = Ω(n²), so the n² bound on ½n² − 3n is asymptotically tight and ½n² − 3n = Θ(n²).
Asymptotic Notation in Equations
and Inequalities
• 2n² + 3n + 1 = 2n² + Θ(n) means 2n² + 3n + 1 = 2n² + f(n), where f(n) is some function in the set Θ(n).
• We use this to impart meaning and remove clutter:
2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²)
Transitivity
• f(n) = O(g(n)) && g(n) = O(h(n)) → f(n) = O(h(n)).
• f(n) = o(g(n)) && g(n) = o(h(n)) → f(n) = o(h(n)).
• f(n) = Ω(g(n)) && g(n) = Ω(h(n)) → f(n) = Ω(h(n)).
• f(n) = ω(g(n)) && g(n) = ω(h(n)) → f(n) = ω(h(n)).
• f(n) = Θ(g(n)) && g(n) = Θ(h(n)) → f(n) = Θ(h(n)).
Reflexivity
• f(n) = O(f(n)),
• f(n) = Ω(f(n)),
• f(n) = Θ(f(n))
But NOT:
• f(n) = o(f(n)),
• f(n) = ω(f(n)).
Symmetry
• f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Does not hold for O, o, Ω, or ω but the
following transpose symmetries do:
• f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
• f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
An Analogy
The comparison of functions is roughly analogous to the comparison of real numbers:
• f(n) = O(g(n)) … a ≤ b
• f(n) = o(g(n)) … a < b
• f(n) = Ω(g(n)) … a ≥ b
• f(n) = ω(g(n)) … a > b
• f(n) = Θ(g(n)) … a = b
Insertion Sort.. again
• Last time we came up with this running time for insertion sort:
T(n) = (c₄ + c₅ + c₆)n²/2 + (c₁ + c₂ + c₃ + c₄/2 − c₅/2 − c₆/2 + c₇)n − (c₂ + c₃ + c₄ + c₇)
• With some hand waving I claimed that T(n) = Θ(n²)
• Prove more formally this is true.
Simplification: T(n) = dn² + en − f, where d, e, and f are constants.
A Warm Up Exercise
• Show that for any real constants a and b > 0, (n + a)ᵇ = Θ(nᵇ).
• Recall: Θ(g(n)) = { f(n): there exist positive constants c₁, c₂, and n₀ such that: 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }.
A Warm Up Exercise
• Show that for any real constants a and b > 0, (n + a)ᵇ = Θ(nᵇ).
• Proof (sketched here for a ≥ 0; for a < 0, take n₀ = 2|a|, c₁ = (1/2)ᵇ, c₂ = 1):
1. We need c₁nᵇ ≤ (n + a)ᵇ ≤ c₂nᵇ for all n ≥ n₀.
2. For n ≥ 1, n ≤ n + a ≤ n + an = n(1 + a), so nᵇ ≤ (n + a)ᵇ ≤ nᵇ(1 + a)ᵇ.
3. Let c₁ = 1.
4. Let c₂ = (2 + a)ᵇ (any c₂ ≥ (1 + a)ᵇ works).
5. Let n₀ = 1.
Insertion Sort.. Again
• Prove: T(n) = dn² + en − f = Θ(n²)
• Recall: Θ(g(n)) = { f(n): there exist positive constants c₁, c₂, and n₀ such that: 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }.
Insertion Sort.. again
• Prove: T(n) = dn² + en − f = Θ(n²), where d, e, f > 0.
• Proof:
We need c₁n² ≤ dn² + en − f ≤ c₂n² for all n ≥ n₀.
Let n₀ = f · max(1, 1/d) · max(1, 1/e); then n₀ ≥ f/e, so en ≥ f for n ≥ n₀.
Note: min(d, e)·n² ≤ dn² + en − f for n ≥ n₀.
Let c₁ = min(d, e).
Note: dn² + en − f ≤ (d + e)n² for n ≥ 1.
Let c₂ = d + e.
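A numerical spot check of these constant choices for one illustrative set of coefficients (the values of d, e, f below are arbitrary placeholders, not the real insertion sort constants):

  import math

  d, e, f = 2.0, 0.5, 3.0
  n0 = f * max(1, 1 / d) * max(1, 1 / e)       # = 3 * 1 * 2 = 6
  c1, c2 = min(d, e), d + e
  for n in range(math.ceil(n0), 10_000):
      T = d * n ** 2 + e * n - f
      assert c1 * n ** 2 <= T <= c2 * n ** 2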
More Examples
• Prove: lg(n!) = Θ(n lg n)
More Examples
• Prove: lg(n!) = O(n lg n)
• Proof:
1. n! ≤ nⁿ
2. lg(nⁿ) = n lg(n)
3. lg(nⁿ) ≥ lg(n!)
4. n lg(n) ≥ lg(n!)
5. lg(n!) = O(n lg n)
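Numerically, lg(n!) does stay below n lg n, and in fact grows like Θ(n lg n); a short Python check over an arbitrary range:

  import math

  lg_fact = 0.0                                      # running value of lg(n!) = Σ lg(k)
  for n in range(1, 2001):
      lg_fact += math.log2(n)
      assert lg_fact <= n * math.log2(n) + 1e-9      # lg(n!) = O(n lg n)
      if n >= 4:
          assert lg_fact >= 0.25 * n * math.log2(n)  # consistent with Θ(n lg n)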