BIG-O --- Algorithms
Purpose: Be able to evaluate the
relative efficiency of various
algorithms that are used to process
data
We need to be able to evaluate all of
the major sort and search
algorithms as well as the various
implementations of the Abstract
Data Types
1
Resources: Java Methods Data
Structures Chapter 8 p.191
2
Intro: As we began to discuss in the lecture on Algorithms, we need to be able to evaluate processes and algorithms based on uniform criteria.
Big-O provides us with a method for
such evaluations
3
BIG-O --- Algorithms
Searching (locating an element with
a target value in a list of values) and
Sorting are 2 tasks used to illustrate
the concept of an algorithm
Algorithms typically use iteration or recursion. This property differentiates them from straightforward code
4
Algorithms are also generic in
nature. The same algorithm applies
to a whole set of initial states and
produces corresponding final states
for each of them.
A sorting algorithm, for example,
must apply to any list regardless of
the values of its elements
5
Furthermore, an algorithm must be
independent of the SIZE of the task
(n)
6
Analyze the time efficiency and space requirements of algorithms in an abstract way, without regard to specific data types and other implementation details
7
Criteria to evaluate an algorithm:
Space required
Amount of time
Complexity
8
SPACE:
Number and size of simple variables
Number and total size of the components of compound variables
Is space dependent on size of input?
9
TIME:
Not necessarily measured in real clock
time
Look for a characteristic operation (e.g., a comparison)
Express time required in terms of this
characteristic operation
10
COMPLEXITY:
Has little to do with how complex an algorithm “looks”
A function of the size (number) of input values
Average time: based on the probability of the occurrence of inputs
Worst time: based on the most unfavorable input
11
We will focus on time efficiency
12
Efficiency of TIME:
Number of Comparisons (if a >
b)
Number of Assignments (a = c)
13
We will also disregard the specifics of
the hardware or the programming
language so:
we will not be measuring time efficiency in
real time
we will not measure in terms of the
number of required program
instructions/statements
14
We will discuss performance in terms
of abstract “steps” necessary to
complete a task and we assume that
each step takes the same amount of
time
We can compare different algorithms
that accomplish the same task
15
Because we can predict the long-term behavior of functions without knowing the exact constants that describe them, we can ignore constant factors in the analysis of execution time
16
BIG-O --- Big-O Notation
We can evaluate algorithms in terms of Best Case, Worst Case, and Average Case
We assume that the size of the task, n, is a large number
Analyze loops, especially nested loops
17
Sequential Search algorithms grow
linearly with the size (number) of the
elements
We match the target value against each
array value until a match is found or
the entire array is scanned
18
Sequential Search
Worst Case: the target is the last element examined (or is not in the array at all)
Best Case: the target value is found on the first attempt
Average Case: the target value is found near the middle of the array
(a minimal code sketch follows this slide)
19
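Sequential Search as a minimal Java sketch (illustrative, not the textbook's code; the method and parameter names are assumed): examine each element in turn and return its index, or -1 if the target is not present.

    // Sequential (linear) search: scan each element until the target is found.
    // Best case:  1 comparison  (target is the first element)
    // Worst case: n comparisons (target is last, or not present at all)
    public static int sequentialSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) {
                return i;      // found: return the index
            }
        }
        return -1;             // scanned all n elements without a match
    }

The number of comparisons grows in direct proportion to the number of elements, which is why sequential search is linear.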
Binary Search algorithms (applied to the same task as the sequential search) grow logarithmically with the size (number) of the elements
Elements must be ordered
We compare the target value against the middle element of the array and proceed left (if smaller) or right (if larger) until a match is found
20
Binary Search
For example, where n=7, we try a[3]
then proceed left or right
21
Binary Search
Worst Case and Average Case: we locate the target element in about log n comparisons (3 for the n = 7 example above; the average case is only about one less than the worst case)
Best Case: the target value is found on the first attempt
The execution time of a Binary Search is approximately proportional to the log of n
(a code sketch follows this slide)
22
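Binary Search as a minimal iterative Java sketch (illustrative, not the textbook's code), assuming the array is already sorted in ascending order: each comparison halves the remaining range, so at most about log2(n) + 1 comparisons are needed.

    // Binary search: compare the target with the middle element of the
    // remaining range and discard the half that cannot contain it.
    public static int binarySearch(int[] a, int target) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // middle of the current range
            if (a[mid] == target) {
                return mid;                     // found the target
            } else if (target < a[mid]) {
                high = mid - 1;                 // continue in the left half
            } else {
                low = mid + 1;                  // continue in the right half
            }
        }
        return -1;                              // target is not in the array
    }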
(A) Linear Growth
(B) Logarithmic Growth
Logarithmic Growth is SLOWER than Linear Growth
Asymptotically, a Binary Search is FASTER than a Sequential Search, since Linear Time eventually surpasses Logarithmic Time
[Graph: curve (A), linear growth, vs. curve (B), logarithmic growth]
23
Big-O represents the Order of
Growth for an Algorithm
24
Growth Rate Functions:
reference functions used to
compare the rates of growth of
algorithms
25
BIG-O --- Big-O Notation
Growth Rate Functions
O(1) Constant Time --- the time required to process one fixed set of steps
The algorithm requires the same fixed number of steps regardless of the size of the task (a short sketch follows this slide):
Push and Pop Stack operations
Insert and Remove Queue operations
Finding the median value in a sorted Array
26
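A small sketch of constant time (assumed example code, not from the text; it uses Java's ArrayDeque as one possible stack implementation): each statement below takes the same number of steps no matter how large n is.

    // Constant-time operations: the step count does not depend on n.
    public static void constantTimeDemo() {
        int[] sorted = {2, 3, 5, 7, 11, 13, 17};
        int median = sorted[sorted.length / 2];     // median of a sorted array: one index calculation, one access

        java.util.Deque<Integer> stack = new java.util.ArrayDeque<>();
        stack.push(42);                             // stack push: O(1)
        int top = stack.pop();                      // stack pop:  O(1)
        System.out.println(median + " " + top);     // prints "7 42"
    }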
O(n) Linear Time --- the time grows in direct proportion to n (the size of the task)
The algorithm requires a number of steps proportional to the size of the task
20 tasks = 20
27
O(n)
Examples (see the sketch after this list):
Traversal of a List
Finding the min or max element in a List
Sequential search of unsorted elements
Traversing a Tree with n nodes
Calculating n-factorial, iteratively
Calculating the nth Fibonacci number, iteratively
28
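Two of the linear-time examples above as a minimal Java sketch (illustrative names and code): each loop body runs roughly n times, so doubling n doubles the work.

    // Iterative n-factorial: the loop body executes n - 1 times.
    // (The result fits in a long only for n <= 20; fine for a sketch.)
    public static long factorial(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) {
            result *= i;
        }
        return result;
    }

    // Finding the max element of a list: one comparison per element.
    public static int max(int[] a) {
        int best = a[0];
        for (int i = 1; i < a.length; i++) {
            if (a[i] > best) {
                best = a[i];
            }
        }
        return best;
    }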
O(n^2) Quadratic Time
The number of operations is
proportional to the size of the task
SQUARED
20 tasks = 400
29
O(n^2) Examples (see the sketch after this list):
Simplistic sorting algorithms such as a Selection Sort of n elements
Comparing two 2-dimensional arrays (matrices) of size n by n
Finding duplicates in an unsorted list of n elements (implemented with 2 nested loops)
30
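The duplicate-finding example as a minimal Java sketch (illustrative, not the textbook's code): the two nested loops perform up to n * (n - 1) / 2 comparisons, a count that grows in proportion to n squared.

    // Finding duplicates in an unsorted array with 2 nested loops: O(n^2).
    public static boolean hasDuplicates(int[] a) {
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                if (a[i] == a[j]) {
                    return true;    // found a duplicate pair
                }
            }
        }
        return false;               // no pair matched
    }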
O(log n) Logarithmic Time --- the log
of n is significantly lower than n
20 tasks = 4.3
Used in many “divide and conquer”
algorithms, like binary search, and is
the basis for using binary search
trees and heaps
31
O(log n)
For example, a balanced binary search tree of one million elements would take, at most, about 20 steps to locate a target
Binary Search in a Sorted list of n elements
Insert and Find Operations for a Binary
Search Tree with n nodes
Insert and Find Operations for a Heap with n
nodes
32
O(n log n) “n log n” Time
20 tasks = 86.4
More advanced sorting algorithms like
Quicksort, Mergesort (which will be
discussed in detail in the next chapter)
33
O(a^n) (a > 1) Exponential Time
20 tasks = 12^20 ≈ 3.8 * 10^21, a very large number
Recursive Fibonacci implementation (a > 3/2) (see the sketch after this list)
Towers of Hanoi (a = 2)
Generating all permutations of n symbols
34
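The recursive Fibonacci example as a minimal Java sketch (illustrative): each call spawns two more calls, so the total number of calls grows exponentially with n, in contrast to the O(n) iterative version listed under linear time.

    // Recursive Fibonacci: exponential time, because the same
    // subproblems are recomputed again and again.
    public static long fib(int n) {
        if (n <= 1) {
            return n;                       // fib(0) = 0, fib(1) = 1
        }
        return fib(n - 1) + fib(n - 2);     // two recursive calls per invocation
    }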
The best time in the preceding growth
rate functions is constant time O(1)
The worst time in the preceding growth rate functions is exponential time O(a^n), which quickly overwhelms even the fastest computers, even for a relatively small n
35
Polynomial Growth:
Linear, quadratic, cubic…
The highest power of n dominates the polynomial (see illustration, Lambert SDT p.42, top)
Polynomial growth is considered manageable as compared to exponential growth
36
[Graph: running time t vs. task size n for Constant O(1), Log n, Linear, N log n, Quadratic, and Exponential growth]
37
Log n has a slower asymptotic growth rate than linear growth: a thousand-fold increase in the size of the task, n, results in only a fixed, moderate increase in the number of operations required.
38
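To make this concrete: a binary search of 1,000 elements needs about log2(1,000) ≈ 10 comparisons, while a list 1,000 times larger (1,000,000 elements) needs only about log2(1,000,000) ≈ 20, roughly 10 additional comparisons for a thousand times the data.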
For a given ALGORITHM, you can see
how it falls on the following grid to
determine its “Order of Growth” or
time efficiency.
Use this grid as a “Rule of Thumb”
when evaluating the BIG-O of an
algorithm.
39
Let this grid help narrow down
possible solutions, but make sure you
“memorize” the other charts and use
them when attacking an order of
growth problem
The BIG-O Analysis handout has an example where the “rule of thumb” will result in an incorrect assumption (an illustrative exception also appears after the grid below)
40
BIG-O --- Big-O Notation
Rule of Thumb

ALGORITHM                  |  SINGLE LOOP   |  NESTED LOOP(S)
---------------------------|----------------|-----------------------
STRAIGHTFORWARD            |  LINEAR        |  QUADRATIC
(SEQUENTIAL) PROCESSING    |  O(N)          |  O(N^2), O(N^3), ...
---------------------------|----------------|-----------------------
DIVIDE AND CONQUER         |  LOGARITHMIC   |  N LOG N
PROCESSING                 |  O(LOG N)      |  O(N LOG N)
41
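One assumed illustration (not necessarily the example in the BIG-O Analysis handout) of why the grid is only a rule of thumb: the code below is a nested loop, so the grid suggests O(N^2), but the inner loop doubles its counter and therefore runs only about log2(n) times per outer pass, so the total work is actually proportional to n log n.

    // A nested loop that is NOT quadratic: the inner loop runs about
    // log2(n) times per outer pass, giving roughly n * log2(n) steps.
    public static long countSteps(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            for (long j = 1; j < n; j *= 2) {   // j doubles: ~log2(n) iterations
                steps++;
            }
        }
        return steps;
    }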
Sample Test Times; N = 50,000

FUNCTION               |  RUNNING TIME  |  EXAMPLE
Log N (Log Time)       |  15.6          |  Binary Search
N (Linear Time)        |  50,000        |  Linked List or Tree Traversal, Sequential Search
N Log N                |  780,482       |  Quicksort, Mergesort
N^2 (Quadratic Time)   |  2.5 * 10^9    |  Selection Sort, matrices, 2 nested loops
a^N (Exponential)      |  3.8 * 10^21   |  recursion, Fibonacci, permutations
42
REVIEW 3 EXAMPLES IN THE
HANDOUT:
Lambert p.40-41 Examples 1.7, 1.8 &
1.9
43
PROJECTS:
BIG-O Exercises 1 through 5
Workbook problems 12 through 20
Multiple Choice Problems
44
TEST IS THE DAY
AFTER THE LABS
ARE DUE
45