Matters of Time & Space

Matters of Time & Space
• Today: Time & Space Complexity
• Monday
– Robot Research & Competitions
The goal of the Smart Dust Project is
to build a self-contained, millimeter-scale
sensing and communication
platform for a massively distributed
sensor network.
This device would be around the size
of a grain of sand and would contain
sensors, computational ability,
bi-directional wireless communications,
and a power supply.
Computational Resources
• Time
– How much time does it take for the program to run on
a particular problem?
• Space
– How much memory space does it take for the
program to run on a particular problem?
• Processors are getting faster and memory is
getting cheaper, so why worry?
– Limited resources in embedded computing like cars,
cell phones, sensor networks, and robots
– Real-time constraints
Algorithm Analysis
• Compare running times and space
requirements independent of programming
language, compiler, hardware, processor
speed, …
• Algorithm Analysis
– Measure the efficiency of an algorithm or
program as the size of the input becomes
large
– Provides a gross comparison of algorithms
Time & Space Complexity
• Time Complexity
– The way in which the number of steps
required by an algorithm varies with the size
of the problem it is solving.
• Space Complexity
– The way in which the amount of storage
space required by an algorithm varies with the
size of the problem it is solving.
• What is meant by “steps” and “size”?
Steps
• Basic operation
– An algorithm step or program code that has the property
that its time to complete does not depend on the
particular values of its operands
totalError = totalError + currentError;

for (i = 0; i < roomLength; i++)
  for (j = 0; j < roomWidth; j++)
  {
    map(room[i][j]);
    …
  }
Size
• Number of inputs processed
• n
int sumIntArray(int arr[ ], int sizeOfarr)
{
  int i, total = 0;
  for (i = 0; i < sizeOfarr; i++)
    total = total + arr[i];
  return total;
}
Running Time
• Let us say that c is the amount of time it takes to
access and add one integer
• Then we can say that sumIntArray has a “running
time” of T(n) = cn where n = sizeOfarr
int sumIntArray(int arr[ ], int sizeOfarr)
{
  int i, total = 0;
  for (i = 0; i < sizeOfarr; i++)
    total = total + arr[i];
  return total;
}
Comparing Functions
• Given two functions, there are usually
points where one function is smaller than
the other, so it does not make sense to say:
f(n) < g(n)
• Instead we can compare their relative
rates of growth
– How much resource does an algorithm take
as n grows larger.
Comparing Functions
• Example
f(n) = 1000n and g(n) = n^2
f(n) > g(n) for small values of n
n^2 grows at a faster rate, so it will eventually
be larger than 1000n
The turning point is n = 1000
Growth Rate
• Linear growth rate
• Quadratic growth rate
• Exponential growth rate
• Logarithmic growth rate
Definition: “order of”
• T(N) = O(f(N)) if there is a positive constant c
and a point n0 such that T(N) <= c·f(N)
when N >= n0
• Eventually there is some point, n0, past
which c·f(N) is at least as large as T(N).
• So, if constants are ignored, f(N) is at least as
big as T(N)
Big-O
• Example
T(N) = 1000N
f(N) = N^2
n0 = 1000
c = 1
Thus:
1000N = O(N^2)
We say “T(N) is order
N-squared”
or, for shorthand, “T(N) is
Big-O of N-squared”
Big-O
• When we say T(N) = O(f(N)) we are
guaranteeing that the function T(N) grows
no faster than f(N)
• Thus f(N) is an upper bound on T(N)
Big-O Represents An Upper Bound
If T(n) is O(f(n)), then f(n) is essentially a cap on how fast
T(n) can grow when n gets big.
[Figure: six growth curves labeled g(n), v(n), r(n), p(n), y(n), b(n)]
Is g(n) O(r(n))?
Is v(n) O(y(n))?
Is b(n) O(p(n))?
Is r(n) O(g(n))?
Is y(n) O(v(n))?
Is p(n) O(b(n))?
Thanks to Dr. White for the slide
Computational Model For Algorithm Analysis
To formally analyze the performance of algorithms, we will
use a computational model with a couple of simplifying
assumptions:
– Each simple instruction (assignment, comparison,
addition, multiplication, memory access, etc.) is
assumed to execute in a single time unit.
The size of the input, n, will normally be used as our main
variable, and we’ll primarily be interested in “worst case”
scenarios.
General Rules For Running Time Calculation
Rule One: Loops
The running time of a loop is at most the running time of the statements
inside the loop, multiplied by the number of iterations.
Example:
for (i = 0; i < n; i++)              // n iterations
    A[i] = (1-t)*X[i] + t*Y[i];      // 12 time units per iteration
(Retrieving X[i] requires one addition and one memory access, as
does retrieving Y[i]; the calculation involves a subtraction, two
multiplications, and an addition; assigning A[i] the resulting value
requires one addition and one memory access; and each loop
iteration requires a comparison and either an assignment or an
increment. This totals twelve primitive operations.)
Thus, T(N) = 12n time units, i.e., this part
of the program is O(n).
Rule Two: Nested Loops
The running time of a nested loop is at most the running time
of the statements inside the innermost loop, multiplied by the
product of the number of iterations of all of the loops.
Example:
for (i = 0; i < n; i++)              // n iterations, 2 ops each
    for (j = 0; j < n; j++)          // n iterations, 2 ops each
        C[i,j] = j*A[i] + i*B[j];    // 10 time units per iteration
(2 for retrieving A[i], 2 for retrieving B[j], 3 for the RHS arithmetic, 3 for assigning C[i,j].)
T(N) = ((10+2)n+2)n = 12n^2 + 2n time units, which is O(n^2).
Rule Three: Consecutive Statements
The running time of a sequence of statements is merely the sum of
the running times of the individual statements.
Example:
for (i = 0; i < n; i++)
{
    A[i] = (1-t)*X[i] + t*Y[i];
    B[i] = (1-s)*X[i] + s*Y[i];
}                                    // 22n time units for this entire loop

for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
        C[i,j] = j*A[i] + i*B[j];    // (12n+2)n time units for this nested loop

T(N) = 12n^2 + 24n time units, i.e., this code is O(n^2).
Rule Four: Conditional Statements
The running time of an if-else statement is at most the
running time of the conditional test, added to the maximum
of the running times of the if and else blocks of statements.
Example:
if (amt > cost + tax)                          // 2 time units
{
    count = 0;                                 // 1 time unit
    while ((count < n) && (amt > cost + tax))  // 4 TUs per iter, at most n iter
    {
        amt -= (cost + tax);                   // 3 time units
        count++;                               // 2 time units
    }
    cout << "CAPACITY:" << count;              // 2 time units
}
else
    cout << "INSUFFICIENT FUNDS";              // 1 time unit

Total running time: 2 + max(1 + (4 + 3 + 2)n + 2, 1) =
9n + 5 time units, i.e., this code is O(n).
Analysis of Breadth First Search
create root node;
put root node in list;
while (solution not found) and (list is not empty) do
    take first node off of list;
    if node = solution
        set solution found to true;
        return node;
    else
        for each possible action
            generate child node
            put child node on the end of list
return null
Analysis of Breadth First Search
• Assume the branching factor is 2
– Branching factor: number of children per parent
• Number of nodes expanded:
1 + 2 + 4 + 8 + … + 2^d, where d is the depth of the tree
– O(2^d)
• What would be the Big-O of a breadth first search
with any number of children?
– O(b^d)
• What is the space complexity of breadth first
search?
– If the solution path is needed: O(b^d)
Analysis of Depth First Search
create root node;
put root node in list;
while (solution not found) and (list is not empty) do
    take first node off of list;
    if node = solution
        set solution found to true;
        return node;
    else
        for each possible action
            generate child node
            put child node on the start of list
return null
Analysis of Depth First Search
• Assume the branching factor is 2
– O(2^d)
• What would be the Big-O of a depth first
search with any number of children?
– O(b^d)
• What is the space complexity of depth first
search?
– If only the solution path is needed: O(b·d)
Algorithm Analysis Questions
• What is the Big-O of wavefront path
planning?
• What is the Big-O of thresholding an
image to find a specific color blob?