Algorithm Analysis - Arizona State University


Algorithm Analysis
What is the Big ‘O Bout?
Anshuman Razdan
Div of Computing Studies
[email protected]
http://dcst2.east.asu.edu/~razdan/cst230/
Razdan with contribution from others
Algorithm
• An algorithm is a procedure or sequence of instructions for solving a problem. Any algorithm may be expressed in many different ways: in English, in a particular programming language, or (most commonly) in a mixture of English and programming called pseudocode.
Example
• For each item
      do something
      if something greater than nothing
          return true
      else
          return false
Analyzing Algorithms
• Questions: Is this algorithm fast enough? How much memory will I need to implement this solution? Is there a better solution?
• Analysis is useful BEFORE any implementation is done, to avoid wasting time on an inappropriate solution.
• Algorithm analysis involves predicting the resources (time, space) required.
• Analyze algorithms independent of machines, implementations, etc.
Run-time Analysis
• Number of operations (amount of time) required for the algorithm to terminate.
• The number of operations typically depends on the program’s input size. E.g., sorting 1000 numbers takes longer (performs more operations) than sorting 3 numbers.
• running time = function(input size)
• Input size is typically expressed as “N” or “n”.
Space Analysis
• Amount of storage (memory) required for
the program to execute correctly.
• Again, the amount of storage will often
depend on the input size.
• Space complexity = function(input size)
Example – time complexity
Insertion-Sort(A)
    for j := 2 to length[A]
        key := A[j]
        /* insert A[j] into sorted A[1..j-1] */
        i := j - 1
        while i > 0 and A[i] > key
            A[i+1] := A[i]
            i := i - 1
        A[i+1] := key

Count the number of operations for the inputs:
A = [3 5 2 8 3]
A = [2 3 3 5 8]

n(4 + 3(n-1)/2) = (3n^2 + 5n)/2 where n is the size of A, i.e., O(n^2).
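The pseudocode above can be sketched in Python (a translation of the slide's pseudocode, with the operation counter added by me) to see concretely that the two inputs do different amounts of work:

```python
def insertion_sort_count(a):
    """Insertion sort that also counts element shifts in the inner loop."""
    a = list(a)                      # work on a copy
    shifts = 0
    for j in range(1, len(a)):       # the pseudocode's j = 2 .. length[A]
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key: # shift larger elements one slot right
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key               # drop key into its sorted position
    return a, shifts

print(insertion_sort_count([3, 5, 2, 8, 3]))  # ([2, 3, 3, 5, 8], 4)
print(insertion_sort_count([2, 3, 3, 5, 8]))  # ([2, 3, 3, 5, 8], 0)
```

The already-sorted input never enters the inner while loop, which is why insertion sort's best case is linear while its worst case is quadratic.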
Algorithm Complexity
• Worst case running time is an upper bound
on running time for any input of size n
• Best case is often not very useful
• Average case: average over all possible
inputs. Often same complexity as worst
case.
Stair counting problem
• Technique 1 – Keep a tally: 3n operations
• Technique 2 – Let someone else keep count: n + 2(1 + 2 + … + n) operations
• Technique 3 – Ask: 4 operations
• Example: a staircase with 2689 steps
Num Ops (Stair Counting)
• T1 : 3n
• T2 : n^2 + 2n
• T3 : floor(log10 n) + 1 (the logarithm, rounded down)
Stair Counting
• 2689 stairs
• T1 : 8,067 operations
• T2 : 7,236,099 operations
• T3 : 4 operations
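These counts follow directly from the three formulas; a quick sketch to verify them (the function names are mine, not from the slides):

```python
import math

def t1_ops(n):
    """Technique 1: keep a tally -- 3n operations."""
    return 3 * n

def t2_ops(n):
    """Technique 2: let someone else keep count -- n + 2(1+2+...+n) = n^2 + 2n."""
    return n * n + 2 * n

def t3_ops(n):
    """Technique 3: ask -- floor(log10 n) + 1 operations (one per digit)."""
    return math.floor(math.log10(n)) + 1

n = 2689
print(t1_ops(n), t2_ops(n), t3_ops(n))  # 8067 7236099 4
```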
Number of Ops in Three Techniques

Stairs   O(log n)   O(n)     O(n^2)
10       2          30       120
100      3          300      10,200
1000     4          3000     1,002,000
10000    5          30000    100,020,000
Factoid
• The order of the algorithm is generally more important than the speed of the processor.
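To make this concrete, here is a back-of-the-envelope sketch (the machines and the 100x factor are hypothetical, chosen for illustration): a machine 100 times slower running an O(n log n) algorithm still beats a fast machine running an O(n^2) algorithm once n is large enough.

```python
import math

def fast_machine_quadratic(n):
    # fast processor, O(n^2) algorithm: 1 time unit per operation
    return n * n

def slow_machine_nlogn(n):
    # processor 100x slower, O(n log n) algorithm: 100 units per operation
    return 100 * n * math.log2(n)

print(fast_machine_quadratic(100) < slow_machine_nlogn(100))          # slow machine loses at small n
print(fast_machine_quadratic(1_000_000) > slow_machine_nlogn(1_000_000))  # but wins at large n
```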
Search Algorithm
• for (i = 0; i < data.length; i++)
      if (data[i] == target)
          return true;
  return false;

What is the running time in the:
• Best case?
• Worst case?
• Average case?
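A Python version of this loop, with a comparison counter added by me, makes the three cases concrete (a sketch; the slide leaves the question open as an exercise):

```python
def linear_search_count(data, target):
    """Left-to-right linear search; returns (found, comparisons)."""
    comparisons = 0
    for x in data:
        comparisons += 1
        if x == target:
            return True, comparisons
    return False, comparisons

data = [7, 3, 9, 1, 5]
print(linear_search_count(data, 7))   # best case: target first, 1 comparison
print(linear_search_count(data, 4))   # worst case: target absent, n = 5 comparisons
```

Best case is 1 comparison; worst case is n (target absent, or in the last position); averaging over the n possible positions of a present target gives (n + 1) / 2 comparisons.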
Worst Case
• Usually we consider only the worst-case performance: the worst case occurs fairly often. Example: looking for an entry in a database when that entry is not present.
• The result of the worst-case analysis is often no different from the average-case analysis (same order of complexity).
Asymptotic Notation
• If we want to handle large problems (these are the critical ones), we are interested in the asymptotic behavior of the growth of the running-time function. Thus, when comparing the running times of two algorithms:
  – Constant factors can be ignored.
  – Lower-order terms are unimportant when the higher-order terms are different.
• For instance, when we analyze selection sort, we find that it takes T(n) = n^2 + 3n - 4 array accesses. For large values of n, the 3n - 4 part is insignificant compared to the n^2 part.
• An algorithm that takes 100n^2 time will still be faster than an algorithm that takes n^3 time for any value of n larger than 100.
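The last claim is easy to spot-check numerically; a small sketch:

```python
def quad(n):
    # hypothetical algorithm costing 100 n^2 steps
    return 100 * n * n

def cubic(n):
    # hypothetical algorithm costing n^3 steps
    return n ** 3

# 100 n^2 < n^3 exactly when n > 100
print(quad(50) > cubic(50))                                # below the crossover, n^3 is cheaper? No: quad costs more? It is the reverse
print(quad(100) == cubic(100))                             # they tie exactly at n = 100
print(all(quad(n) < cubic(n) for n in range(101, 2000)))   # beyond it, 100 n^2 always wins
```

At n = 50, 100n^2 = 250,000 exceeds n^3 = 125,000, so for small inputs the "worse" cubic algorithm is actually cheaper; asymptotic statements only kick in past n0.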
Big-O Definition
• O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }
• |f(n)| ≤ c |g(n)| (use this to prove big-O).
[Figure: c g(n) lies above f(n) for all n ≥ n0]
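Using the definition, a claim such as (3n^2 + 5n)/2 = O(n^2) (the insertion-sort count from an earlier slide) is proved by exhibiting witness constants; c = 2 and n0 = 5 work, since 3n^2 + 5n ≤ 4n^2 exactly when n ≥ 5. A numerical spot-check of those witnesses (the constants are one valid choice among many):

```python
def f(n):
    return (3 * n * n + 5 * n) / 2   # insertion-sort operation count

def g(n):
    return n * n

c, n0 = 2, 5                         # witness constants for f(n) = O(g(n))
print(all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000)))  # True
print(f(4) <= c * g(4))              # False: the bound genuinely needs n >= n0
```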
Big O Naming Convention
• If a function is O(log(n)) we call it logarithmic.
• If a function is O(n) we call it linear.
• If a function is O(n^2) we call it quadratic.
• If a function is O(n^k) with k ≥ 1, we call it polynomial.
• If a function is O(a^n) with a > 1, we call it exponential.
Asymptotic Notation
Θ, O, Ω, o, ω
• Θ -- tight bound
  – When a problem is solved in the theoretical minimum time
• O -- upper bound
• Ω -- lower bound
• o -- non-tight upper bound
• ω -- non-tight lower bound
Big Ω Definition
• Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }
[Figure: f(n) lies above c g(n) for all n ≥ n0]
Θ Definition
• Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }
[Figure: f(n) lies between c1 g(n) and c2 g(n) for all n ≥ n0]
Theorem
• For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
[Figures: the O, Ω, and Θ bounding pictures side by side]
• If lim n→∞ f(n)/g(n) = 0, then f(n) = O(g(n)).
• If lim n→∞ f(n)/g(n) = ∞, then f(n) = Ω(g(n)).
• If lim n→∞ f(n)/g(n) = c for some constant c > 0, then f(n) = Θ(g(n)).
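These limit rules can be spot-checked by evaluating f(n)/g(n) at a large n (a numerical sketch, not a proof; the example functions are mine):

```python
n = 10**6

# ratio -> 0, so 3n + 1 = O(n^2)
print((3 * n + 1) / n**2)               # tiny, near 0

# ratio -> infinity, so n^2 = Omega(n)
print(n**2 / n)                         # huge

# ratio -> 3/2 > 0, so (3n^2 + 5n)/2 = Theta(n^2)
print((3 * n * n + 5 * n) / 2 / n**2)   # close to 1.5
```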
o and ω Definitions
• f(n) = o(g(n)) if g(n) grows a lot faster than f(n):
      lim n→∞ f(n)/g(n) = 0
• f(n) = ω(g(n)) if f(n) grows a lot faster than g(n):
      lim n→∞ f(n)/g(n) = ∞
Prove
• 6n^3 ≠ Θ(n^2)
• (1/2)n^2 - 3n = Θ(n^2)
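For the second claim, the Θ definition asks for constants c1, c2, n0 with c1 n^2 ≤ n^2/2 - 3n ≤ c2 n^2. One valid choice (mine; many others work) is c1 = 1/14, c2 = 1/2, n0 = 7: the upper bound holds for all n ≥ 1, and the lower bound holds once 3n ≤ (3/7)n^2, i.e. n ≥ 7. A spot-check:

```python
def f(n):
    return n * n / 2 - 3 * n

def g(n):
    return n * n

c1, c2, n0 = 1 / 14, 1 / 2, 7        # witness constants (one valid choice)
print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000)))  # True
print(c1 * g(6) <= f(6))             # False: the lower bound fails below n0
```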
Insertion Sort
• Link to Insertion Sort