3-4 Part II: Informed Search


Search Part II
• Search Strategies
• Heuristic Functions
• Local Search Algorithms
• Summary
Introduction
Informed search strategies:
 Use problem-specific knowledge beyond the problem definition
 Can find solutions more efficiently than uninformed
search strategies
Greedy Best-First Search
 Expand the node with the lowest value of the evaluation
function f(n)
 f(n) estimates the distance from node n to the goal
Simplest case: f(n) = h(n), the estimated cost of the
cheapest path from node n to the goal.
h(n) is the HEURISTIC FUNCTION.
Greedy Best-First Search
 Has the same queue-based structure as uniform-cost search,
but orders nodes by h(n) instead of g(n)
 Follows the most promising path (see the sketch below)
 Non-optimal
 Incomplete
 Space and time complexity is O(b^m), where m is the maximum depth of the search space
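A minimal Python sketch of greedy best-first search, assuming the state space is given as an adjacency dictionary and h as a lookup table; the graph, heuristic values, and function name are illustrative, not from the lecture.

```python
import heapq

def greedy_best_first(graph, h, start, goal):
    """Expand the node with the lowest heuristic value h(n) first."""
    frontier = [(h[start], start)]          # priority queue ordered by h(n)
    came_from = {start: None}
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:
            path = []                       # reconstruct the path by walking back
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for neighbor in graph[node]:
            if neighbor not in came_from:   # no re-expansion of reached nodes
                came_from[neighbor] = node
                heapq.heappush(frontier, (h[neighbor], neighbor))
    return None                             # may fail even when a solution exists

# Example with a made-up heuristic (e.g. straight-line distance):
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
h = {'A': 3, 'B': 2, 'C': 1, 'D': 0}
print(greedy_best_first(graph, h, 'A', 'D'))   # ['A', 'C', 'D']
```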
A* Search
Evaluation function:
f(n) = g(n) + h(n)
where g(n) is the path cost from the root to node n, and
h(n) is the estimated cost of the cheapest path from node n to the goal.
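A compact Python sketch of A* under the same assumptions as the greedy example (weighted adjacency dictionary, heuristic lookup table); the example graph, costs, and heuristic values are made up for illustration.

```python
import heapq

def a_star(graph, h, start, goal):
    """A*: expand the node with the lowest f(n) = g(n) + h(n)."""
    frontier = [(h[start], 0, start)]        # entries are (f, g, node)
    best_g = {start: 0}
    came_from = {start: None}
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1], g
        for neighbor, cost in graph[node]:
            new_g = g + cost
            if new_g < best_g.get(neighbor, float('inf')):   # found a cheaper route
                best_g[neighbor] = new_g
                came_from[neighbor] = node
                heapq.heappush(frontier, (new_g + h[neighbor], new_g, neighbor))
    return None, float('inf')

# Example with edge costs and an admissible heuristic (illustrative values):
graph = {'A': [('B', 1), ('C', 4)], 'B': [('D', 5)], 'C': [('D', 1)], 'D': []}
h = {'A': 4, 'B': 5, 'C': 1, 'D': 0}
print(a_star(graph, h, 'A', 'D'))   # (['A', 'C', 'D'], 5)
```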
A* Search; optimal
• Optimal if the heuristic is admissible: it never
overestimates the cost to reach a goal.
• With a consistent heuristic, f is monotonic (non-decreasing
along any path), which also makes A* optimal for graph search.
A* Search; optimal
• Assume the optimal solution cost is C*
• Suppose a suboptimal goal G2 appears on the fringe:
f(G2) = g(G2) + h(G2) = g(G2) > C*   (h(G2) = 0 at a goal)
• Let n be a node on an optimal path:
f(n) = g(n) + h(n) <= C*
Therefore f(n) <= C* < f(G2), so A* expands n before G2.
A* Search; complete
• A* is complete.
A* builds search "bands" of increasing f(n):
it expands every node with f(n) < C*,
until eventually it reaches the "goal contour".
• Optimally efficient: no other optimal algorithm is guaranteed to expand fewer nodes.
• Even so, in most cases the number of nodes within the goal contour grows exponentially with the solution length.
[Figure: contours of equal f-cost, e.g. the f = 400 and f = 420 contours]
Memory-bounded search
To reduce the memory requirements of A*,
we can use iterative deepening: IDA*.
The cutoff is the f-cost (g + h) rather than the depth.
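A rough IDA* sketch, assuming the same weighted-graph and heuristic representation as the A* example above; the cutoff (bound) starts at h(start) and grows to the smallest f-value that exceeded the previous cutoff.

```python
import math

def ida_star(graph, h, start, goal):
    """Iterative deepening A*: depth-first searches with an increasing f-cost cutoff."""
    def dfs(node, g, bound, path):
        f = g + h[node]
        if f > bound:
            return f, None                 # prune; report smallest f above the cutoff
        if node == goal:
            return f, list(path)
        next_bound = math.inf
        for neighbor, cost in graph[node]:
            if neighbor not in path:       # avoid cycles along the current path
                path.append(neighbor)
                t, found = dfs(neighbor, g + cost, bound, path)
                path.pop()
                if found is not None:
                    return t, found
                next_bound = min(next_bound, t)
        return next_bound, None

    bound = h[start]
    while True:
        bound, solution = dfs(start, 0, bound, [start])
        if solution is not None or bound == math.inf:
            return solution

graph = {'A': [('B', 1), ('C', 4)], 'B': [('D', 5)], 'C': [('D', 1)], 'D': []}
h = {'A': 4, 'B': 5, 'C': 1, 'D': 0}
print(ida_star(graph, h, 'A', 'D'))   # ['A', 'C', 'D']
```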
Learning to Search
Agents can learn to search…
Learning to Search
The idea is to search at the meta-level: each state
in the meta-level space is a search tree of the underlying problem.
The goal is to learn, from experience with different search
strategies, to avoid exploring useless parts of the tree.
Informed Search and Exploration
• Search Strategies
• Heuristic Functions
• Local Search Algorithms
• Summary
8-Puzzle
Common heuristic candidates:
f1: the number of misplaced tiles
f2: the sum of the Manhattan distances from each tile to its
goal position.
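A small Python sketch of both heuristics, assuming a state is a 9-tuple read row by row with 0 for the blank (a representation chosen here, not given in the slides).

```python
GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)

def misplaced_tiles(state, goal=GOAL):
    """f1: number of tiles not in their goal position (the blank is not counted)."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def manhattan_distance(state, goal=GOAL):
    """f2: sum over tiles of |row - goal row| + |column - goal column|."""
    pos = {tile: divmod(i, 3) for i, tile in enumerate(goal)}
    total = 0
    for i, tile in enumerate(state):
        if tile != 0:
            r, c = divmod(i, 3)
            gr, gc = pos[tile]
            total += abs(r - gr) + abs(c - gc)
    return total

state = (7, 2, 4, 5, 0, 6, 8, 3, 1)
print(misplaced_tiles(state))      # 8
print(manhattan_distance(state))   # 18
```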
Effective Branching Factor
N: total number of nodes generated by A*
d: solution depth
b*: the branching factor that a uniform tree of depth d
would need in order to contain N + 1 nodes:
N + 1 = 1 + b* + (b*)^2 + … + (b*)^d
Effective Branching Factor
Example: N = 6, d = 2  →  b* = 2     (1 + 2 + 2^2 = 7 = N + 1)
Example: N = 3, d = 2  →  b* ≈ 1.3   (1 + 1.3 + 1.3^2 ≈ 4 = N + 1)
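Since the equation has no simple closed form for b*, a short numerical sketch (bisection; the helper names are mine) can recover b* from N and d:

```python
def effective_branching_factor(N, d, tol=1e-6):
    """Solve N + 1 = 1 + b + b^2 + ... + b^d for b by bisection."""
    def nodes(b):
        return sum(b ** i for i in range(d + 1))
    lo, hi = 1.0, float(N + 1)          # b* always lies between 1 and N + 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if nodes(mid) < N + 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(effective_branching_factor(6, 2), 3))   # 2.0
print(round(effective_branching_factor(3, 2), 3))   # about 1.303
```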
Effective Branching Factor
Good heuristics have a b* close to 1.
Experiments:
• Generate thousands of random problems
• Vary the solution length
• Solve with f1 and f2.
Result: b*(f1) >= b*(f2), i.e. the Manhattan-distance heuristic f2 gives a smaller effective branching factor.
Informed Search and Exploration
• Search Strategies
• Heuristic Functions
• Local Search Algorithms
• Summary
Local Search Algorithms
 If the path to the goal does not matter, we can use local search.
 Local search algorithms use a single current state
 and move only to neighbors of that state.
Advantages:
 They use very little memory
 They often find reasonable solutions in large or infinite state spaces
Local Search Algorithms
 It helps to picture the state-space landscape.
 The task is to find the global maximum or minimum.
A complete local search always finds a goal if one exists;
an optimal local search always finds the global maximum
or minimum.
Hill Climbing
 Moves in the direction of increasing value.
 Terminates when it reaches a “peak”.
 Does not look beyond the immediate neighbors of the current state.
Hill Climbing
Can get stuck for several reasons:
a. Local maxima
b. Ridges
c. Plateaus
Variants (see the sketch below):
 stochastic hill climbing: choose at random among the uphill moves
 first-choice hill climbing: generate successors randomly until one is better than the current state
 random-restart hill climbing: restart from random initial states and keep the best result
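A minimal sketch of steepest-ascent hill climbing plus the random-restart variant, on a toy one-dimensional problem chosen only for illustration (all names and values are mine).

```python
import random

def hill_climbing(initial, neighbors, value):
    """Steepest ascent: move to the best neighbor until no neighbor improves."""
    current = initial
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current                 # a peak: local maximum, plateau, or ridge
        current = best

def random_restart_hill_climbing(random_state, neighbors, value, restarts=20):
    """Restart from random states and keep the best peak found."""
    return max((hill_climbing(random_state(), neighbors, value)
                for _ in range(restarts)), key=value)

# Toy example: maximize -(x - 3)^2 over integers, stepping by +/-1.
value = lambda x: -(x - 3) ** 2
neighbors = lambda x: [x - 1, x + 1]
print(hill_climbing(0, neighbors, value))                                             # 3
print(random_restart_hill_climbing(lambda: random.randint(-10, 10), neighbors, value))  # 3
```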
Simulated Annealing
o Instead of always picking the best move, pick a move at random
o If it improves the current state, take the step
o Otherwise accept the move with some probability
o The probability decreases with the "badness" of the step
o and also decreases as the temperature goes down
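A simulated-annealing sketch following the description above, with an assumed geometric cooling schedule and the same toy problem; t0, cooling, and steps are arbitrary illustrative parameters.

```python
import math
import random

def simulated_annealing(initial, neighbor, value, t0=10.0, cooling=0.95, steps=1000):
    """Pick a random move; always accept improvements, and accept bad moves with a
    probability that shrinks with the badness of the move and with the temperature."""
    current = initial
    t = t0
    for _ in range(steps):
        candidate = neighbor(current)
        delta = value(candidate) - value(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
        t *= cooling                       # geometric cooling schedule
    return current

# Toy example: maximize -(x - 3)^2 with random +/-1 steps.
value = lambda x: -(x - 3) ** 2
neighbor = lambda x: x + random.choice([-1, 1])
print(simulated_annealing(0, neighbor, value))   # usually 3 or close to it
```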
Local beam search
i. Keep track of the best k states
ii. Generate all successors of these k states
iii. If a goal is reached, stop.
iv. If not, select the best k successors and repeat.
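A local beam search sketch under the same toy-problem assumptions: k states are kept, and the best k of all their successors survive each round.

```python
import heapq
import random

def local_beam_search(k, random_state, neighbors, value, is_goal, max_iters=100):
    """Keep the best k states; expand all of them and keep the best k successors."""
    states = [random_state() for _ in range(k)]
    for _ in range(max_iters):
        for s in states:
            if is_goal(s):
                return s
        successors = [n for s in states for n in neighbors(s)]
        if not successors:
            break
        states = heapq.nlargest(k, successors, key=value)   # best k successors
    return max(states, key=value)

# Toy example: reach x = 3 by maximizing -(x - 3)^2.
value = lambda x: -(x - 3) ** 2
neighbors = lambda x: [x - 1, x + 1]
print(local_beam_search(3, lambda: random.randint(-10, 10), neighbors, value,
                        is_goal=lambda x: x == 3))   # 3
```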
Evolutionary Methods
Methods inspired by biological evolution.
Main ideas:
• Maintain a population of solutions
• Generate new solutions (offspring)
• Assign a score to each solution
• Retain the best solutions
Genetic Algorithms
The fundamental unit is the chromosome.
A chromosome is a bit string
representing a solution to our problem, e.g.:
00010011110011111
A solution is good if it has a high fitness value
(the counterpart of the heuristic function).
Parameters
 Θ: a fitness threshold above which we stop (and take the best solution).
 L: the number of solutions in the current population.
 Pco: the fraction of the population replaced by crossover.
 Pmut: the mutation rate.
Algorithm
Initialize the population with L random solutions
Repeat
  For each solution h, compute its fitness
  Rank all solutions by their fitness value
  Generate a new population using the genetic operators (see the sketch below)
Until max_fitness > Θ
Return the solution with the highest fitness
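A compact Python sketch of this loop on the toy "one-max" fitness (number of 1 bits). For brevity it selects parents from the better-ranked half rather than by roulette wheel, and mutates each offspring with probability Pmut; every parameter value here is illustrative, not from the lecture.

```python
import random

def genetic_algorithm(fitness, length=17, L=20, p_co=0.6, p_mut=0.05, theta=16, max_gens=200):
    """Random population, then repeated ranking, replication, crossover, and mutation
    until the best fitness exceeds theta (or max_gens is reached)."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(L)]
    for _ in range(max_gens):
        if max(fitness(h) for h in pop) > theta:
            break
        ranked = sorted(pop, key=fitness, reverse=True)
        new_pop = ranked[:int((1 - p_co) * L)]           # replication of the fittest
        while len(new_pop) < L:                          # fill the rest with offspring
            p1, p2 = random.sample(ranked[:L // 2], 2)   # parents from the better half
            point = random.randrange(1, length)          # single-point crossover
            child = p1[:point] + p2[point:]
            if random.random() < p_mut:                  # mutation: flip one random bit
                i = random.randrange(length)
                child[i] = 1 - child[i]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy fitness: count of 1 bits in the chromosome.
print(genetic_algorithm(fitness=sum))
```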
Genetic Operators
Crossover:
The most common form is single-point crossover:
  Initial strings:  11101001000   00001010101
  Crossover mask:   11111000000
  Offspring:        11101010101   00001001000
Genetic Operators
Mutation:
Produce a small random change in a string.
Under a uniform distribution, we select one bit and flip its value:
  00001010101  →  01001010101
Replication:
A chromosome is simply copied into the next population, unchanged.
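A sketch of mask-based crossover and single-bit mutation that reproduces the offspring shown above; the function names are mine, not from the lecture.

```python
import random

def crossover(parent1, parent2, mask):
    """Mask-based crossover: take bits from parent1 where the mask is 1, else from parent2.
    A mask like 11111000000 gives single-point crossover."""
    child1 = ''.join(a if m == '1' else b for a, b, m in zip(parent1, parent2, mask))
    child2 = ''.join(b if m == '1' else a for a, b, m in zip(parent1, parent2, mask))
    return child1, child2

def mutate(chromosome):
    """Flip one bit chosen uniformly at random."""
    i = random.randrange(len(chromosome))
    flipped = '1' if chromosome[i] == '0' else '0'
    return chromosome[:i] + flipped + chromosome[i + 1:]

print(crossover('11101001000', '00001010101', '11111000000'))
# ('11101010101', '00001001000')  -- the offspring from the slide
print(mutate('00001010101'))       # e.g. '01001010101'
```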
New Population
1. Replicate.- Select (1 − Pco)·L members of P and add them to Ps.
   The probability of selecting a member is
   P(hi) = Fitness(hi) / Σj Fitness(hj)
2. Crossover.- Select (Pco·L)/2 pairs of hypotheses from P according to P(hi).
   For each pair (h1, h2), produce two offspring by applying the crossover operator.
   Add all offspring to Ps.
New Population
3. Mutate.- Choose Pmut·L members of Ps with uniform probability
   and invert one randomly chosen bit in each.
4. Update P with Ps.
Fitness Function and Selection
Selection.
The typical approach is "fitness-proportionate selection",
also called "roulette-wheel selection":
P(hi) = Fitness(hi) / Σj Fitness(hj)
Another option is rank selection.
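A roulette-wheel selection sketch using the formula above; the example population and fitness function are illustrative.

```python
import random

def roulette_wheel_select(population, fitness):
    """Fitness-proportionate selection: P(h_i) = Fitness(h_i) / sum_j Fitness(h_j)."""
    weights = [fitness(h) for h in population]
    return random.choices(population, weights=weights, k=1)[0]

# Toy example: bit strings scored by their number of 1s.
population = ['1100', '1110', '0001', '1111']
print(roulette_wheel_select(population, fitness=lambda h: h.count('1')))
```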
John H. Holland
Born in Indiana in 1929.
Author of "Adaptation in Natural and Artificial Systems" (1975),
which provided the basis of genetic algorithms.
Recipient of a MacArthur Fellowship.
J.R. Koza
Author of "Genetic Programming I, II, III, and IV".
Professor at Stanford University.
Member of several editorial boards.
Informed Search and Exploration
• Search Strategies
• Heuristic Functions
• Local Search Algorithms
• Summary
Summary
• Heuristics help to reduce search costs
• We looked at greedy best-first search and A*
• We can compare heuristic performance using the effective branching factor
• We covered local search algorithms: hill climbing, simulated annealing, and local beam search
• We looked at evolutionary methods