04/16 - Montana State University
Genetic Algorithms
ML 9
Kristie Simpson
CS536: Advanced Artificial
Intelligence
Montana State University
Overview
Learning based on simulated evolution.
Begins with initial population of hypotheses.
Population uses genetic operations to create
the next generation.
Most fit hypotheses survive.
9.1 Motivation
Evolution is a successful, robust method for adaptation in biological systems.
Hypotheses can contain complex
interacting parts, where the impact of
each part may be difficult to model.
Easily parallelized.
Easily understood.
Terminology
Population – collection of hypotheses.
Fitness – numerical measure of the strength
of a hypothesis.
Crossover (recombination) – two or more
hypotheses combine to create one or more
offspring.
Mutation – random modifications to individual
hypotheses.
9.2 Genetic Algorithms
Table 9.1 pg. 251
Population of initial hypotheses.
Fitness of each hypothesis is determined.
Create new generation.
– Most fit hypotheses selected.
– Crossover combines hypotheses to create offspring.
– Mutation modifies individual hypotheses.
Update population.
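The loop of Table 9.1 can be sketched in Python. Population size, crossover/mutation rates, and the fitness function passed in are illustrative assumptions, not values from the table:

```python
import random

def genetic_algorithm(fitness, length, pop_size=50, p_crossover=0.6,
                      p_mutation=0.001, generations=100):
    """Sketch of the GA loop: select, crossover, mutate, update."""
    # Initial population of random bit-string hypotheses.
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(h) for h in pop]
        total = sum(scores)

        def select():
            # Fitness-proportionate selection (roulette wheel).
            r = random.uniform(0, total)
            acc = 0.0
            for h, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return h
            return pop[-1]

        # Survivors pass through unchanged.
        next_pop = [select()[:] for _ in range(int((1 - p_crossover) * pop_size))]
        # Crossover pairs of parents to create offspring.
        while len(next_pop) < pop_size:
            a, b = select(), select()
            point = random.randrange(1, length)
            next_pop.append(a[:point] + b[point:])
        # Mutate individual bits with small probability.
        for h in next_pop:
            for i in range(length):
                if random.random() < p_mutation:
                    h[i] = 1 - h[i]
        pop = next_pop
    return max(pop, key=fitness)
```

For example, `genetic_algorithm(sum, 10)` evolves 10-bit strings toward all 1s (the "OneMax" toy problem).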
Where have we seen this before?
ACO 2.4.6 (pg. 55-57) Evolutionary
Computation (Other Metaheuristics)
ACO 3.7.1 (pg. 93-99) Lamarckian vs.
Darwinian (ACO plus Local Search)
ML 3.6.2 (pg. 65-66) Occam’s razor
(Inductive Bias)
9.2.1 Representing Hypotheses
Hypotheses in GAs are often represented by
bit-strings.
IF Wind = Strong THEN PlayTennis = yes

Outlook: 111   Wind: 10   PlayTennis: 10

111 – don’t care (Sunny, Overcast, or Rain); 10 – Wind is Strong; 10 – PlayTennis is yes.
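The encoding can be made concrete in Python: one bit per attribute value, 1 meaning the value is allowed. The attribute/value orderings below follow the PlayTennis example; treat them as illustrative assumptions:

```python
# One bit per value; a 1 means that value satisfies the rule.
ATTRIBUTES = {
    "Outlook": ["Sunny", "Overcast", "Rain"],
    "Wind": ["Strong", "Weak"],
    "PlayTennis": ["yes", "no"],
}

def encode(constraints):
    """Map {attribute: set of allowed values} to a bit-string.
    An attribute absent from `constraints` is a don't-care (all 1s)."""
    bits = ""
    for attr, values in ATTRIBUTES.items():
        allowed = constraints.get(attr, set(values))
        bits += "".join("1" if v in allowed else "0" for v in values)
    return bits

# IF Wind = Strong THEN PlayTennis = yes
print(encode({"Wind": {"Strong"}, "PlayTennis": {"yes"}}))  # 1111010
```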
9.2.2 Genetic Operators
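The standard operators on bit-strings can be sketched as follows; single-point and two-point crossover and point mutation are the textbook forms, and the default mutation rate is an assumption:

```python
import random

def single_point_crossover(parent1, parent2):
    """Swap the tails of two equal-length bit-strings at a random point."""
    point = random.randrange(1, len(parent1))
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def two_point_crossover(parent1, parent2):
    """Swap the middle segment between two random crossover points."""
    i, j = sorted(random.sample(range(1, len(parent1)), 2))
    return (parent1[:i] + parent2[i:j] + parent1[j:],
            parent2[:i] + parent1[i:j] + parent2[j:])

def point_mutation(hypothesis, p=0.01):
    """Flip each bit independently with probability p."""
    return "".join(b if random.random() >= p else ("1" if b == "0" else "0")
                   for b in hypothesis)
```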
9.2.3 Fitness Function and Selection
fitness proportionate selection:
http://en.wikipedia.org/wiki/Fitness_proportionate_selection
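Fitness-proportionate (roulette-wheel) selection picks hypothesis h_i with probability f(h_i) / Σ_j f(h_j). A minimal sketch using the standard-library `random.choices`:

```python
import random

def roulette_select(population, fitness):
    """Fitness-proportionate selection:
    Pr(h_i) = f(h_i) / sum_j f(h_j)."""
    scores = [fitness(h) for h in population]
    return random.choices(population, weights=scores, k=1)[0]
```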
9.3 An Illustrative Example
GABIL uses a GA to learn boolean concepts
represented by a disjunctive set of
propositional rules.
Hypotheses represented by bit-strings which
grow with the number of rules.
Variable-length bit-strings require a modification to the crossover operator.
GAs for the TSP
http://www.ads.tuwien.ac.at/raidl/tspga/TSPGA.html
9.4 Hypothesis Space Search
GAs do not move smoothly from hypothesis to hypothesis (as Backpropagation does). Instead, they move much more abruptly and are less likely to fall into local minima.
Problem: crowding - highly fit individuals take over
population.
Solution: alter the selection function (tournament, rank, fitness sharing, subspecies).
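Two of the alternative selection schemes named above can be sketched briefly; the tournament probability p=0.75 is an illustrative assumption:

```python
import random

def tournament_select(population, fitness, p=0.75):
    """Pick two individuals at random; with probability p return the
    fitter one, otherwise the less fit one."""
    a, b = random.sample(population, 2)
    fitter, weaker = (a, b) if fitness(a) >= fitness(b) else (b, a)
    return fitter if random.random() < p else weaker

def rank_select(population, fitness):
    """Select with probability proportional to rank, not raw fitness,
    which damps the takeover by a few highly fit individuals."""
    ranked = sorted(population, key=fitness)  # worst first, rank 1..n
    weights = range(1, len(ranked) + 1)
    return random.choices(ranked, weights=weights, k=1)[0]
```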
9.4.1 Population Evolution and the
Schema Theorem
Mathematically characterize evolution.
Schemas – patterns that describe sets of bit strings: strings over 0, 1, and *, where * matches either bit.
Evolution depends on selection, recombination, and
mutation.
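The schema theorem itself can be stated in its standard (Holland) form, which is the version Mitchell presents; it bounds the expected number of instances of schema s in the next generation:

```latex
E[m(s, t+1)] \;\ge\; \frac{\hat{u}(s,t)}{\bar{f}(t)}\, m(s,t)
\left(1 - p_c \frac{d(s)}{l-1}\right) (1 - p_m)^{o(s)}
```

Here m(s, t) is the number of instances of s at generation t, \bar{f}(t) the mean population fitness, \hat{u}(s,t) the mean fitness of instances of s, p_c and p_m the crossover and per-bit mutation probabilities, d(s) the defining length, o(s) the order of the schema, and l the bit-string length. Short, low-order, above-average schemas therefore tend to grow.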
9.5 Genetic Programming
Extends genetic algorithms to the evolution of
complete computer programs.
Population consists of computer programs rather
than bit-strings.
Population of hypotheses typically represented by
parse trees.
Fitness determined by executing the program on
training data.
Crossover performed by swapping subtrees.
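Subtree crossover can be sketched with parse trees as nested Python lists of the form `[operator, child, child]`; this list representation is an assumption for illustration (GP systems such as Koza's use Lisp S-expressions):

```python
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) pairs for every node in the parse tree."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    """Return a copy of tree with the node at path replaced by new."""
    if not path:
        return new
    copy = list(tree)
    copy[path[0]] = replace(tree[path[0]], path[1:], new)
    return copy

def gp_crossover(parent1, parent2):
    """Swap a randomly chosen subtree of each parent."""
    p1, s1 = random.choice(list(subtrees(parent1)))
    p2, s2 = random.choice(list(subtrees(parent2)))
    return replace(parent1, p1, s2), replace(parent2, p2, s1)
```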
9.6 Models of Evolution and Learning
What is the relationship between learning during the lifetime of a single individual and the longer-time-frame, species-level learning afforded by evolution?
9.6.1 Lamarckian Evolution
Lamarckian evolution – experiences of a single organism directly affect the genetic makeup of its offspring.
Scientific evidence contradicts this model.
9.6.2 Baldwin Effect
Evolution favors individuals with the
capability to learn.
Individuals who learn rely less strongly on
their genetic code.
Individual learning supports more rapid
evolutionary progress, thereby increasing the
chance that the species will evolve genetic,
non-learned traits.
9.7 Parallelizing Genetic Algorithms
GAs naturally suited to parallel
implementation.
Coarse grain – subdivide population into
groups of individuals (demes).
– Migration – individuals from one deme are copied/transferred to other demes.
Fine grain – one processor per individual in
the population.
– Recombination occurs among neighboring individuals.
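Coarse-grain migration can be sketched as below; the ring topology (each deme sends its best to the next) and the bit-sum fitness are illustrative assumptions:

```python
import random

def migrate(demes, k=1):
    """Copy the k fittest individuals of each deme into the next deme
    (ring topology). Fitness = sum of bits, an illustrative assumption."""
    fittest = [sorted(d, key=sum, reverse=True)[:k] for d in demes]
    for i, migrants in enumerate(fittest):
        demes[(i + 1) % len(demes)].extend([m[:] for m in migrants])
    return demes
```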