
Algorithms
Al-Khwarizmi, Arab mathematician,
8th-9th century, from whose name the word "algorithm" derives
Wrote a book: al-Kitab… from which
the word Algebra comes
Oldest algorithm: the Euclidean
algorithm for finding the greatest
common divisor of two numbers (a sketch follows below)
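A minimal sketch of the Euclidean algorithm, given in Python purely for illustration (the course demos use Mathematica and LEDA, not Python):

    def gcd(a, b):
        # Repeatedly replace (a, b) by (b, a mod b); when the remainder
        # reaches 0, the other number is the greatest common divisor.
        while b != 0:
            a, b = b, a % b
        return a

    # Example: gcd(252, 105) returns 21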
Ten Big Ideas in Algorithms
1. Orders of Magnitude and Big Oh notation
2. Complexity: Polynomial versus Exponential
3. Iteration versus Recursion
4. Recurrence Relations
5. Divide-and-Conquer
6. Graph Traversal: Depth-First, Breadth-First
7. Greedy Algorithm
8. Dynamic Programming
9. Lower Bounds
10. NP-completeness
1. Orders of Magnitude
• How to compare algorithm A with algorithm B in
an implementation-independent manner?
• Asymptotic analysis: compare them when n goes
to infinity (becomes very large)
• [An aside: massive data sets; the Internet and
search engines; Bioinformatics]
• a(n) = running time of algorithm A, b(n) = running time of algorithm B
• A is better than B if
lim a(n)/b(n) = 0 as n → ∞
(b "grows faster" than a)
• Examples in Mathematica
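The course shows these examples in Mathematica; the following is only a rough Python sketch of the same comparison, with hypothetical running-time functions a and b:

    def a(n): return 100 * n   # hypothetical running time of algorithm A
    def b(n): return n * n     # hypothetical running time of algorithm B

    for n in [10, 100, 1000, 10000]:
        print(n, a(n) / b(n))  # the ratio shrinks toward 0, so A is asymptotically better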
Big Oh Notation
• Exact running-time functions have too many
terms, and not all of them are relevant
• Big Oh notation helps keep the big picture.
• Example:
a(n) = 3n² + 1 and b(n) = 5n² + 3n + 100
grow roughly at the same rate
• lim a(n)/b(n) = constant
• a(n) = O(n²), b(n) = O(n²)
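Writing the limit out makes the claim concrete:

    lim (n → ∞) a(n)/b(n) = lim (n → ∞) (3n² + 1)/(5n² + 3n + 100) = 3/5,

a nonzero constant, so the two functions grow at the same rate and both are O(n²).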
2. Complexity:
Polynomial versus Exponential
• Polynomial: order of growth of O(n), O(n²), O(n³), etc. is tractable.
• Exponential: order of growth of O(2ⁿ), O(nⁿ) or larger is intractable and leads to impractical algorithms
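For a sense of scale: at n = 50, an O(n²) algorithm takes on the order of 2,500 steps, while an O(2ⁿ) algorithm needs roughly 10¹⁵ steps.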
3. Iteration versus Recursion
• Find Min of n elements: iterate over the array, at each step do a comparison.
Time = n
• Sort n elements by Selection Sort: iterate over the array; at each iteration, Find Min and put it in its right position (sketched below).
Time = n + (n-1) + (n-2) + … + 3 + 2 + 1
• Find a closed form for the summation:
Time = n(n+1)/2 = O(n²)
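A minimal Python sketch of Selection Sort as described (illustrative only, not the course's LEDA implementation):

    def selection_sort(a):
        # At step i, find the minimum of a[i:] and swap it into position i.
        n = len(a)
        for i in range(n):
            m = i
            for j in range(i + 1, n):   # the "Find Min" scan over the remaining elements
                if a[j] < a[m]:
                    m = j
            a[i], a[m] = a[m], a[i]
        return a

    # Example: selection_sort([5, 2, 9, 1]) returns [1, 2, 5, 9]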
Recursion
• Binary Search: split the list in two, and search in the half where the item must be (sketched below).
• Merge Sort: divide the array into two equal parts; sort each part recursively; merge them.
• Quick Sort: use the first element as a pivot to partition the array into two parts, the elements smaller than the pivot and the elements larger; sort each part recursively.
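A small recursive Binary Search sketch in Python (illustrative only; a is assumed to be sorted):

    def binary_search(a, x, lo=0, hi=None):
        # Return an index of x in the sorted list a, or -1 if x is absent.
        if hi is None:
            hi = len(a)
        if lo >= hi:
            return -1
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid
        elif a[mid] < x:
            return binary_search(a, x, mid + 1, hi)   # search the right half
        else:
            return binary_search(a, x, lo, mid)       # search the left half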
4. Recurrence Relations
• Time for Merge Sort:
F(n) = 2 F(n/2) + n
• Solve the recurrence relation:
F(n) = O(n log n)
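Unrolling the recurrence (assuming n is a power of 2 and F(1) is a constant) shows where the n log n comes from:

    F(n) = 2 F(n/2) + n
         = 4 F(n/4) + n + n
         = 8 F(n/8) + n + n + n
         = …
         = n·F(1) + n·log₂ n

so F(n) = O(n log n).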
5. Divide and Conquer
• Paradigm for algorithm design: split the problem into two or
more parts, solve each part recursively, combine the solutions
• Binary Search, Merge Sort and Quick Sort
illustrate it (Merge Sort is sketched after this list).
• Leads to improved algorithms for many problems:
o Median finding
o Matrix Multiplication
o Computational Geometry algorithms (convex hulls, etc.)
• Leads to divide and conquer recurrence relations.
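As one concrete instance of the paradigm, a short Merge Sort sketch in Python (illustrative only):

    def merge_sort(a):
        # Divide: split into halves; conquer: sort each recursively; combine: merge.
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])
        right = merge_sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):   # merge the two sorted halves
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged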
6. Graph Traversal:
Depth-First and Breadth-First
• Techniques for designing very efficient
(linear time) algorithms
• Apply to a variety of graph problems:
  • Connectivity
  • Finding cycles in graphs
  • Strongly connected components in directed graphs
  • Biconnected components in graphs
  • Planarity testing
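A minimal depth-first search sketch in Python (adjacency-list graph; illustrative only). It visits each vertex and edge once, which is where the linear running time comes from; the set returned from a start vertex is its connected component:

    def dfs(graph, start, visited=None):
        # graph: dict mapping each vertex to a list of its neighbours
        if visited is None:
            visited = set()
        visited.add(start)
        for v in graph[start]:
            if v not in visited:
                dfs(graph, v, visited)
        return visited

    # Example: dfs({'a': ['b'], 'b': ['a', 'c'], 'c': ['b'], 'd': []}, 'a')
    # returns {'a', 'b', 'c'}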
7. Greedy Algorithm
• Proceed iteratively. At each step, choose an
element which maximizes some “profit” or
minimizes some “cost”.
• Minimum spanning tree in a graph (minimum-cost
connecting network): at each step, pick the
least expensive edge that does not create a cycle in
the graph (a sketch follows this list).
• Shortest paths in a graph from given vertex a: at
each step, choose a new vertex which is closest to
an already constructed set of vertices.
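A compact Python sketch of the greedy minimum spanning tree rule, in the style of Kruskal's algorithm, with a simple union-find structure to detect cycles (illustrative only):

    def min_spanning_tree(n, edges):
        # n: number of vertices 0..n-1; edges: list of (cost, u, v) tuples.
        parent = list(range(n))

        def find(x):                      # representative of x's component
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        tree = []
        for cost, u, v in sorted(edges):  # cheapest edges first (the greedy choice)
            ru, rv = find(u), find(v)
            if ru != rv:                  # the edge joins two components: no cycle
                parent[ru] = rv
                tree.append((u, v, cost))
        return tree

    # Example: min_spanning_tree(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)])
    # returns [(0, 1, 1), (1, 2, 2), (2, 3, 4)]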
8. Dynamic Programming
• A scheme for combining the results of several
smaller subproblems to obtain the result for the current one.
• Often leads to polynomial time algorithms
• All pairs shortest paths: compute the
shortest path between all pairs of nodes in a
graph.
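A short Python sketch of the all-pairs shortest path computation in the style of the Floyd-Warshall dynamic program (dist is an n×n matrix of direct edge costs, with float('inf') where there is no edge and 0 on the diagonal; illustrative only):

    def all_pairs_shortest_paths(dist):
        # d[i][j] holds the best known cost of a path from i to j.
        n = len(dist)
        d = [row[:] for row in dist]      # copy so the input is not modified
        for k in range(n):                # allow vertex k as an intermediate stop
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]
        return d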
9. Lower Bounds
• When do you stop improving an algorithm? How to
prove that what you found is the best possible
algorithm?
• Lower bounds: techniques for proving an algorithm
is the best possible.
• Work under certain assumptions about the model of
computation
• Classical bounds: finding the minimum, sorting.
• Information Theoretical Lower Bound
• Adversary arguments
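For example, the information-theoretic argument for sorting: a comparison-based sorting algorithm must distinguish between the n! possible orderings of the input, and each comparison has only two outcomes, so at least log₂(n!) ≈ n log₂ n comparisons are needed in the worst case.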
10. NP-Completeness
• Despite all efforts, hundreds of very practical
problems have no known efficient (polynomial-time)
algorithm (which does not mean that one may not
be found)
• Nobody has been able to prove that no polynomial
time algorithm exists for any of these problems.
• But one can show that if a polynomial time
algorithm exists for one of these problems, then it
exists for all of them.
• A problem is NP-complete if it is as hard (NP-hard) as any of the problems in this class, and if a
solution to the problem can be verified in
polynomial time (it is in NP)
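Verification really is the easy part. A small Python sketch of a polynomial-time verifier for the Hamiltonian Cycle problem (illustrative only):

    def is_hamiltonian_cycle(graph, cycle):
        # graph: dict vertex -> set of neighbours; cycle: proposed ordering of the vertices.
        if sorted(cycle) != sorted(graph):          # must visit every vertex exactly once
            return False
        n = len(cycle)
        # every consecutive pair (including the wrap-around) must be an edge
        return all(cycle[(i + 1) % n] in graph[cycle[i]] for i in range(n))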
Some NP-complete Problems
• Traveling Salesman Problem
• Hamiltonian Cycle in a graph
• Maximum Clique in a graph
• Satisfiability for boolean formulas and circuits
• The game of Push-Push
Overview of the course
• We cover the 10 big ideas
• Lots of examples of algorithms
• Start with some algebra and calculus
• Need data structures and recursion
• Use LEDA software for demos (all algorithms are implemented and visualized)
• Abstract, high-level class
• Need to be comfortable with abstract thinking
• A little bit of programming – mostly theory