Optimization


Given a measure of goodness (of fit)

Find optimal parameters (e.g., correspondences) that maximize the goodness measure (or minimize a badness measure)

Optimization techniques

Direct (closed-form)

Search (generate-test)

Heuristic search (e.g., Hill Climbing)

Genetic Algorithm
Direct Optimization


The slope of a function at the maximum or minimum is 0

Function is neither growing nor shrinking

True at “local” maxima and minima also!
Find where the slope is zero and you find extrema!
Optimization Example

Assume a circle of radius R, centered at (0,0).

The squared radial distance from the circle to any point (x, y) can be measured as R² − x² − y². Where this measure is minimized, the best circle has been found.

But the minimum of this function is always at R = 0! (The value there is −(x² + y²).)

We need to minimize an error value that can’t go below zero, so square it again: (R² − x² − y²)²

Now the derivative is 4R(R² − x² − y²), which has zeros at R = 0 and R = ±√(x² + y²). The one that makes sense is R = √(x² + y²).
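As a sanity check of that derivative, here is a small symbolic computation (a sketch using sympy; not part of the original slides):

```python
# Verify d/dR (R^2 - x^2 - y^2)^2 = 4R(R^2 - x^2 - y^2) and its positive root.
import sympy as sp

R, x, y = sp.symbols('R x y', positive=True)
error = (R**2 - x**2 - y**2)**2

derivative = sp.diff(error, R)
print(sp.factor(derivative))    # 4*R*(R**2 - x**2 - y**2), up to sign grouping
print(sp.solve(derivative, R))  # [sqrt(x**2 + y**2)], since R is positive here
```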
Optimization: many points

If there are many points, it isn’t so easy. The expression to optimize becomes a sum over all points (xᵢ, yᵢ):

∑ᵢ (R² − xᵢ² − yᵢ²)²

This makes the derivative:

∑ᵢ 4R(R² − xᵢ² − yᵢ²)

Setting it to zero, the root that makes sense satisfies:

0 = NR² − ∑ᵢ (xᵢ² + yᵢ²), where N is the number of points

R² = (1/N) ∑ᵢ (xᵢ² + yᵢ²), i.e., the average squared distance of all the points from the origin
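This closed form is simple to compute directly; a minimal sketch (the function name and sample points are my own, chosen to lie near a circle of radius 5):

```python
import math

def fit_radius(points):
    """Closed-form best radius for a circle centered at the origin:
    R^2 = (1/N) * sum(x_i^2 + y_i^2)."""
    mean_sq = sum(x * x + y * y for x, y in points) / len(points)
    return math.sqrt(mean_sq)

# Made-up points scattered near a circle of radius 5, for illustration
pts = [(5.1, 0.0), (0.0, 4.9), (-3.5, 3.6), (-4.8, -1.0), (3.0, -4.0)]
print(fit_radius(pts))  # about 4.99
```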
As it gets more general, it gets harder!



Let the center vary: now we have 3 parameters

R, xcenter, ycenter

Minimize (R² − (x − xcenter)² − (y − ycenter)²)² over all possible (R, xcenter, ycenter) triples

Let it be an axis-aligned ellipse: now we have 4 parameters

xcenter, ycenter, xradius, yradius

Minimize ((x − xcenter)²/xradius² + (y − ycenter)²/yradius² − 1)²
Allow even more general shapes (e.g., cubic splines)

Eventually, the closed form is much too complicated!

Use search techniques (heuristics) instead
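Before moving to search, here are the two objectives above written out as functions (a sketch; the names and parameter ordering are my own):

```python
# Sketches of the two error functions on this slide (names are my own)

def circle_error(params, points):
    """Sum over points of (R^2 - (x - xc)^2 - (y - yc)^2)^2."""
    R, xc, yc = params
    return sum((R * R - (x - xc) ** 2 - (y - yc) ** 2) ** 2 for x, y in points)

def ellipse_error(params, points):
    """Sum over points of ((x - xc)^2/rx^2 + (y - yc)^2/ry^2 - 1)^2."""
    xc, yc, rx, ry = params
    return sum(((x - xc) ** 2 / (rx * rx) + (y - yc) ** 2 / (ry * ry) - 1) ** 2
               for x, y in points)
```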
Iterative Optimization: Hill Climbing

This is a form of trial and error

Single parameter example (circle centered at 0)

Given the goodness function at R, try the same function at R+1 and R−1; update R to whichever of the three values is best. Repeat until R itself has the best value (see the sketch after this slide).

This method is guaranteed to find a minimum (though only a local minimum).

Can take a long time if you start far from the optimum

Cannot jump to another region (e.g., circle with arbitrary center fit to 2 points: center between the points vs. circle between the points)
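A minimal sketch of this procedure for the origin-centered circle (step size, names, and sample points are my own; a real implementation would shrink the step as it converges):

```python
def badness(R, points):
    """Squared error for a circle of radius R centered at the origin."""
    return sum((R * R - x * x - y * y) ** 2 for x, y in points)

def hill_climb_radius(points, R=1.0, step=0.1):
    """Compare R-step, R, R+step; move to the best; stop when R itself wins."""
    while True:
        best = min((R - step, R, R + step), key=lambda r: badness(r, points))
        if best == R:
            return R          # no neighbor is better: a (local) minimum
        R = best

pts = [(5.1, 0.0), (0.0, 4.9), (3.0, -4.0)]
print(hill_climb_radius(pts))  # climbs from 1.0 toward R near 5
```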
Simulated Annealing

(Algorithm 7.11)

The algorithm is randomized: take a step if a random number is less than a value based on both the objective function and the temperature.

When the temperature is high, the chance of moving toward a higher value of the objective function J(x) is greater.

Note the higher dimensionality: “perturb the parameter vector” rather than “look at the next and previous value”.
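A minimal sketch of this kind of acceptance rule (my own simplified version, not Algorithm 7.11 verbatim; the cooling schedule, step size, and names are assumptions):

```python
import math, random

def circle_error(params, points):
    # Same objective as in the earlier sketch
    R, xc, yc = params
    return sum((R * R - (x - xc) ** 2 - (y - yc) ** 2) ** 2 for x, y in points)

def anneal(objective, x0, step=0.5, T0=10.0, cooling=0.95, iters=2000):
    """Perturb the whole parameter vector; always accept improvements, and
    accept worse moves with probability exp(-increase / T) as T cools."""
    x, fx, T = list(x0), objective(x0), T0
    for _ in range(iters):
        candidate = [xi + random.uniform(-step, step) for xi in x]
        fc = objective(candidate)
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = candidate, fc    # the uphill chance shrinks as T falls
        T *= cooling
    return x, fx

pts = [(5.1, 0.0), (0.0, 4.9), (-4.8, -1.0)]
best, err = anneal(lambda p: circle_error(p, pts), [1.0, 0.0, 0.0])
print(best, err)
```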
Genetic Algorithm

A quicker, but randomized, search for an optimal parameter vector

Operations


Crossover (2 parents -> 2 children)

Mutation (one bit)
Basic structure

Create population

Perform crossover & mutation (on fittest)

Keep only fittest children
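A minimal sketch of this structure, using real-valued genes (R, xcenter, ycenter) and the circle error as (negative) fitness (population size, rates, and names are my own assumptions):

```python
import random

def fitness(ind, points):
    """Negative circle-fit error: higher is better."""
    R, xc, yc = ind
    return -sum((R * R - (x - xc) ** 2 - (y - yc) ** 2) ** 2 for x, y in points)

def crossover(a, b):
    """2 parents -> 2 children: swap a tail, cutting only between parameters."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(ind, rate=0.1, scale=0.5):
    """Occasionally nudge a gene (a continuous analogue of a one-bit flip)."""
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in ind]

def ga(points, pop_size=40, generations=100):
    # Create the initial population
    pop = [[random.uniform(0, 10) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, points), reverse=True)
        parents = pop[:pop_size // 2]         # only the fittest reproduce
        children = []
        while len(children) < pop_size:
            c1, c2 = crossover(*random.sample(parents, 2))
            children += [mutate(c1), mutate(c2)]
        pop = children[:pop_size]             # keep only the children
    return max(pop, key=lambda ind: fitness(ind, points))

pts = [(5.1, 0.0), (0.0, 4.9), (-4.8, -1.0)]
print(ga(pts))  # roughly R near 5, center near (0, 0) on a good run
```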
Genetic Algorithm: Why does it work?

Children carry parts of their parents’ data

Only “good” parents can reproduce

Children are at least as “good” as their parents? No, but “worse” children don’t last long

A large population allows many “current points” in the search

Can consider several regions (watersheds) at once
Genetic Algorithm: Issues & Pitfalls


Representation

Children (after crossover) should be similar to their parents, not random

Binary representation of numbers isn’t good: what happens when you cross over in the middle of a number?

Need “reasonable” breakpoints for crossover (e.g., between R, xcenter, and ycenter, but not within them; see the sketch after this slide)
“Cover”

Population should be large enough to “cover” the range of
possibilities

Information shouldn’t be lost too soon

Mutation helps with this issue
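A small sketch of the breakpoint idea (my own illustration): cutting between whole parameters keeps children meaningful, while a bit-level cut inside one number can produce a value resembling neither parent.

```python
import random

def parameter_crossover(a, b):
    """Cut only at parameter boundaries of (R, xcenter, ycenter)."""
    cut = random.randrange(1, 3)   # between R|xcenter or xcenter|ycenter
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def bit_crossover_inside(n1, n2, cut=4):
    """For contrast: cut inside a single 8-bit number."""
    mask = (1 << cut) - 1          # low `cut` bits come from the second parent
    return (n1 & ~mask) | (n2 & mask)

# Parents 128 and 127 are adjacent values, but a mid-number cut yields 143,
# a child that resembles neither parent:
print(bit_crossover_inside(0b10000000, 0b01111111))  # 143
```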
Experimenting With Genetic Algorithms

Be sure you have a reasonable “goodness” criterion

Choose a good representation (including methods for
crossover and mutation)

Generate a sufficiently random, large enough population

Run the algorithm “long enough”

Find the “winners” among the population

Variations: multiple populations, keeping vs. not
keeping parents, “immigration / emigration”, mutation
rate, etc.
Ellen L. Walker