Transcript US10

Eindhoven University of Technology
April 16th, 2003
Evolutionary Systems
Paul CRISTEA
“Politehnica” University of Bucharest
Spl. Independentei 313, 77206 Bucharest, Romania,
Phone: +40 -21- 411 44 37, Fax: +40 -21- 410 44 14
e-mail: [email protected]
Lecture Outline
Evolutionary Systems
1. INTRODUCTION
2. EVOLUTIONARY SYSTEMS
3. GENETIC ALGORITHMS
4. EVOLUTIONARY INTELLIGENT
AGENT CONCEPT
5. EVOLUTIONARY INTELLIGENT
AGENT MODEL
6. ONTOLOGY AND ARCHITECTURE
7. CONCLUSIONS
Biological Evolution
Current estimates:
• the universe began 15 billion years ago;
• the earth was formed 5000 million years ago;
• the first living organism appeared 3500 million years ago.
The ancestral cell - simple bag of chemicals enclosed
in a membrane. It contained a program of instructions encoded
on a DNA molecule. The program consisted of sub-programs
called genes, which directed various chemical reactions inside
the cell: reactions to import food, reactions to convert food into
energy, reactions to maintain the membrane, and so on.
Most significantly, some genes directed chemical reactions that
enabled the cell to replicate itself.
As the ancestral cell replicated itself, the genes gradually
changed, creating different species of progeny which were
adapted to different environments.
Evolution
Evolution is seen as based on the trial-and-error process of
variation and natural selection of systems at all levels of
complexity.
Artificial selection -- specific features are retained or eliminated
depending on a goal or intention.
(e.g., the objective of a cattle breeder who would like to have
cows that produce more milk).
Natural selection -- from Darwinian theory of biological
evolution. Implicit goal of natural selection is maintenance or
reproduction of a configuration at some level of abstraction.
The selection is natural in the sense that there is no actor or
purposive system making the selection.
The selection is purely automatic or spontaneous, without plan
or design involved.
Evolution typically leads to greater complexity.
Selection or self-organization?
Criticisms of the Darwinian view of evolution:
(1) There are designs or plans guiding evolution
(not discussed here),
(2) Natural selection must be complemented by self-organization
in order to explain evolution
(Jantsch, 1979; Kauffman, 1993; Swenson, 1997).
The specific interpretation of Darwinism sees evolution as the
result of selection by the environment acting on a population
of organisms competing for resources.
The winners of the competition -- those most fit to gain the
resources necessary for survival and reproduction -- are
selected; the others are eliminated.
Beyond the Darwinian view
This view of evolution entails two strong restrictions:
1. it assumes that there is a multitude ("population")
of configurations undergoing selection;
2. it assumes that selection is carried out by their
common environment.
It cannot explain the evolution of a "population of one".
In the current, more general interpretation, there is no need
for competition between simultaneously present configurations:
A configuration can be selected or eliminated independently
of the presence of other configurations: a single system can
pass through a sequence of configurations, some of which
are retained while others are eliminated.
The only "competition" is one between subsequent states of the
same system, but the selection is still "natural".
Self-organization
Selection does not presuppose the existence of an environment
external to the configuration undergoing selection. The selection is
inherent in the configuration itself.
E.g., configurations can be intrinsically stable or unstable:
a crystal in a vacuum will retain its structure, whereas a cloud
of gas molecules will not.
Self-organization -- the asymmetric transition from varying to
stable.
Natural selection encompasses both external, Darwinian selection,
and internal, self-organizing selection.
Evolutionary Algorithms
• Evolutionary algorithms are good coarse search
techniques that can search enormous problem spaces.
• A population of possible solutions is scored on how well
they solve some problem.
• The fitter a solution is, the larger the part it plays in
parenting the next generation.
• New solutions are bred by combining components of the
parents and applying mutation to introduce variability (new
aspects to the solutions).
• Evolution is slow and prone to loss of diversity, even when
forced with a higher mutation rate.
• The greedier the EA variant, the faster the population will
converge, but the less likely it is to converge to the true
optimum.
Genetic Algorithms
A genetic algorithm (GA) is a computer model of the evolution
of a population of artificial individuals.
Each individual (k = 1,..., n; n is the population size) is
characterized by a chromosome (genotype) Sk, which
determines the individual fitness (phenotype) f(Sk).
The chromosome (genotype) is a string of symbols,
Sk = (Sk1, Sk2,...,SkN),
where N is the string length.
The symbols Ski are interpreted as genes of the chromosome Sk.
The evolution process consists of successive generations
t = 0, 1, … described by their population {Sk(t)}.
GA Steps
• Step 0. Generate a (random) initial population {Sk(0)}.
• Step 1. Evaluate the fitness f(Sk) of each individual Sk in the
population {Sk(t)}.
• Step 2. Select the individuals Sk according to their fitness f(Sk)
and apply to selected chromosomes the genetic operators:
• recombinations,
• point mutations,
to generate the offspring population {Sk(t+1)}.
• Step 3. Repeat steps 1 and 2 for t = 0, 1, 2, ..., until some
convergence criterion (the maximum fitness in the population ceases
to increase, or t reaches a preset value) is satisfied.
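Purely as an illustration, these steps might be coded as follows (all names and default values are mine, not the lecture's, and the Step 3 stopping test is simplified to a fixed number of generations):

```python
import random

def run_ga(fitness, n=50, N=32, generations=100,
           crossover_rate=0.7, mutation_rate=0.01):
    # Step 0: random initial population {Sk(0)} of binary chromosomes
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(n)]
    for t in range(generations):
        # Step 1: evaluate the fitness f(Sk) of each individual
        scores = [fitness(s) for s in pop]
        # Step 2: fitness-proportionate selection plus genetic operators
        offspring = []
        while len(offspring) < n:
            p1, p2 = random.choices(pop, weights=scores, k=2)
            if random.random() < crossover_rate:   # recombination
                m = random.randrange(1, N)         # one-point crossover
                child = p1[:m] + p2[m:]
            else:
                child = p1[:]
            # point mutations: flip each bit with a small probability
            child = [b ^ (random.random() < mutation_rate) for b in child]
            offspring.append(child)
        pop = offspring                            # generation t + 1
    return max(pop, key=fitness)

# usage: maximize the number of 1-bits (the "one-max" toy problem)
best = run_ga(fitness=sum)
```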
Genetic Algorithm Flow Diagram
General Features of GA 1
GAs are optimization techniques that possess an implicit
parallelism: different effective partial gene combinations
(called "schemata") are searched in parallel,
simultaneously for all combinations.
Note: the smaller a combination is, the quicker it can be
found.
The GA scheme is very similar to that of the quasispecies model.
Main difference: recombinations are not included in the quasispecies
model, whereas in GAs it is precisely recombination that plays the
main role in finding new good solutions (mutation intensity is
usually very small in GAs).
General Features of GA 2
• In principle, GAs are general algorithms, but the genetic operators
(population crossover and mutation) are application-specific;
• The GA itself is extremely simple. The power of the algorithm
comes from the fact that it does two basic things:
1) it continuously improves and,
2) it explores solutions which may provide additional improvements.
• Both operations are encompassed in the genetic operators of
population crossover and mutation, which manipulate the genes of
the individuals to produce the continuous improvement and
experimentation properties of the GA.
Specificity vs Versatility
• The genes of an individual -- the genotype -- are used to
determine how it behaves (i.e., how well it solves the problem) --
the phenotype.
• The genetic operators manipulate the genes, thus they must be tied
to the representation of the genes.
• Genetic operators are thus specific to the problem domain.
• Significant research has been done attempting to determine
universal genetic operators, based on universal gene representations.
• Unfortunately, these attempts have not been successful and it has
been shown that problem-specific encodings typically outperform
universal encodings [DeJong and Spears, 1993], [Radcliff and
George, 1993].
Crossover
One-point crossover (analogous to the biological one)
For the parents S1 = (S11, S12,...,S1N)
and S2 = (S21, S22,..., S2N),
the children are (S11,..., S1m, S2,m+1,...,S2N)
and (S21,..., S2m, S1,m+1,...,S1N);
i.e., a head and a tail of an offspring chromosome are taken from
different parents.
Two-point and several point crossovers can be used similarly.
Crossovers are sometimes supplemented by inversions,
which consist in reversing the symbol order in a part of a
chromosome -- this can help find the best combinations of
symbols in the chromosome strings.
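A minimal sketch of these two operators (function names are illustrative):

```python
import random

def one_point_crossover(s1, s2):
    # the cut point m splits each chromosome into a head and a tail;
    # each child takes its head and its tail from different parents
    m = random.randrange(1, len(s1))
    return s1[:m] + s2[m:], s2[:m] + s1[m:]

def inversion(s, i, j):
    # reverse the symbol order in the part s[i:j] of the chromosome
    return s[:i] + s[i:j][::-1] + s[j:]
```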
Uniform recombination
The symbols of the chromosome of the first offspring are
taken from either of the parents (S1 or S2) randomly at each
symbol position, whereas the second offspring takes the
remaining symbols.
E.g., two children of S1 and S2 can have the chromosomes:
(S11, S22, S13, S14,...,S2N) and
(S21, S12, S23, S24,...,S1N).
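The same recombination as an illustrative sketch (names are mine):

```python
import random

def uniform_crossover(s1, s2):
    # at each position the first child takes the symbol from a randomly
    # chosen parent; the second child takes the remaining symbol
    c1, c2 = [], []
    for a, b in zip(s1, s2):
        if random.random() < 0.5:
            c1.append(a); c2.append(b)
        else:
            c1.append(b); c2.append(a)
    return c1, c2
```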
GA Schemes
There are a number of particular GA schemes, which differ in:
• methods of selection,
• recombination,
• chromosome representation, etc.
A standard GA works on a binary string chromosome
(symbols Ski take the values 0 or 1) of fixed length (N = const)
and applies fitness-proportionate selection, one-point
crossovers, and one-flip mutations.
Fitness-proportionate selection: the parents Sk of the
individuals in the new population are selected with
probabilities proportional to their fitness f(Sk).
Ranking selection: a certain number of the best individuals of
the population {Sk(t)} are used as parents of the new
generation.
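Both selection schemes, sketched for illustration only (names assumed):

```python
import random

def fitness_proportionate_selection(pop, f, k):
    # parents Sk are drawn with probabilities proportional to f(Sk)
    return random.choices(pop, weights=[f(s) for s in pop], k=k)

def ranking_selection(pop, f, k):
    # the k best individuals of {Sk(t)} become parents of the new generation
    return sorted(pop, key=f, reverse=True)[:k]
```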
Faster Evolution
• Elitist approach: good solutions (elites) are copied unchanged into
the next generation (see the sketch after this list);
• Islands: physical, castes, in time (reincarnation);
• Local search: Using a local heuristic;
• Avoid greedy algorithms for better but slower
results;
• Using computer clusters.
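An illustrative sketch of the elitist approach (breed is a hypothetical callback producing offspring; nothing here is from the lecture):

```python
def next_generation_with_elites(pop, f, breed, n_elites=2):
    # copy the best individuals (the elites) unchanged into the next
    # generation, then fill up with newly bred offspring
    elites = sorted(pop, key=f, reverse=True)[:n_elites]
    return elites + breed(pop, len(pop) - n_elites)
```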
Estimating Fitness
• Fitness evaluation takes most of the computing time;
• Reduce the number of true fitness evaluations in favour of
quick fitness estimates;
• Need to keep track of how reliable the fitness estimate is;
• A solution with too low a reliability needs to be truly
evaluated.
The estimate combines the parents' fitness values (the slide's formula
is not reproduced in the transcript), where f is the fitness, R the
reliability of the child (0-1);
• S1, S2 - the similarity between the child and parent 1, 2;
• R1, R2 - the reliability of parent 1, 2.
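The slide's estimation formula is lost with the graphics, so the sketch below is only a guess at its shape -- a similarity- and reliability-weighted average -- and must not be read as the lecture's actual formula:

```python
def estimate_child_fitness(f1, f2, S1, S2, R1, R2):
    # GUESSED formula -- the slide's equation is not in the transcript.
    # Weight each parent's fitness by the child's similarity to that
    # parent (S) and by that parent's own reliability (R).
    w1, w2 = S1 * R1, S2 * R2
    f = (w1 * f1 + w2 * f2) / (w1 + w2)
    R = max(w1, w2)  # assumed reliability of the estimate, in (0, 1)
    return f, R
```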
Partial Fitness Estimation
After Tim Hendtlass, Swinburne University of Technology, Melbourne, Australia.
(six slides of figures, "Partial Fitness Estimation 1-6", not reproduced
in the transcript)
Hybrid Systems
• There is always a limit to how much you can speed up
evolution;
• Generations of potential solutions need to pass more
than their genetic material to succeeding generations.
• This can be by a historic (Akashic) record.
• Previously explored points, good or bad, are stored
for use by later generations.
Modelling the actual surface
(figure: the actual surface function, with the explored points marked)
Use of history
(figures: maps of the points visited without history, vs. the larger
domain investigated using history to avoid re-exploring old territory)
Statistics of results in using memory
Comparison of Results
(result figures not reproduced in the transcript)
Summary of GA
GAs are computer program models, based on a general
understanding of genetic evolution.
GAs are universal heuristic optimization methods, which
require little knowledge about a problem to be solved.
They are also effective methods of combinatorial
optimization.
Classic references on GA
1. Holland, J.H., Adaptation in Natural and Artificial
Systems, Ann Arbor, MI: The University of Michigan
Press. 2nd edn. Boston, MA: MIT Press (1992).
2. Goldberg, D.E., Genetic Algorithms in Search,
Optimization, and Machine Learning, Addison-Wesley
(1989).
3. Mitchell, M., An Introduction to Genetic Algorithms, MIT
Press, Cambridge, MA (1996).
Evolutionary Systems
Capability to:
Evolve by changing the gene pool of a population
from generation to generation by such processes as:
• Mutation, Genetic Drift, Gene Flow,
• Crossing-Over,
• Selection.
Adapt the behavior to the environment
Address real-world problems
involving:
• Chaos,
• Randomness,
• Complex nonlinear dynamics
Evolutionary Systems - Life Game
The adaptive challenge is determined by the population, the
environment and the interactions between and within them.
Reductionist models stress either the role of the population or that
of the environment, and usually take into account only the evolution
through selection, while ignoring learning and competition.
In such studies, simple reactive agents have been considered,
with the behavior described by a sensorimotor map, i.e.,
a table of behavioral rules of the form:
IF <environment feature Ei is sensed >
THEN <do behavior Bj >.
This approach has the advantage of keeping the model simple
enough for directly deriving quantitative results about the efficiency
of accomplishing the adaptive task at the level of the population,
but it cannot be used to investigate the effects of the more complex
cognitive capabilities of the agents.
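For illustration, such a sensorimotor map can be just a lookup table (the feature and behavior names are invented for the example):

```python
# a table of IF <environment feature Ei> THEN <behavior Bj> rules
SENSORIMOTOR_MAP = {
    "food_ahead":     "move_forward",
    "obstacle_ahead": "turn_left",
    "scent_detected": "follow_gradient",
}

def react(sensed_feature):
    # purely reactive: the behavior depends only on what is sensed now
    return SENSORIMOTOR_MAP.get(sensed_feature, "wander_randomly")
```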
Example of Simple
Reactive Systems
The Ant Farm
Roberto Aguirre Maturano
Goal and Model
Goal: Emulate ants' ability to coordinate in the task of
food-gathering, by means of short-span individual reactions
to environment events.
Model:
1. An individual ant brain does not have enough capacity
to remember food or nest locations.
2. Ants react to the environment by secreting scents, which
remain on the ground as odor traces.
3. Odor traces have a limited persistence, vanishing as
time passes.
4. When an ant finds some odor trace of interest, it
reinforces it by adding some extra amount of scent.
5. Ants cannot 'clean' odor traces; they can only reinforce or
ignore them.
Model Description
Ants secrete two kinds of odor marks:
Brown - to mark the nest.
Green - after finding food.
Explorer behavior
When leaving the nest, ants move randomly in search of food, leaving an
odor trace of decreasing intensity as they go farther from the nest.
Deliverer behavior
When ants find food, they mark its location using a food trace, analogous to
the one used when they find the nest.
Ants carrying food follow the food trace from higher to lower intensity; the
nest trace is followed from lower to higher intensity.
Tracker behavior
Ants carrying no food follow food traces from lower to higher intensity.
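A toy sketch of the trace-following rule common to these behaviors (function and argument names are assumed, not from the model):

```python
def next_cell(neighbors, trace, uphill):
    # neighbors: candidate locations; trace[loc]: scent intensity there.
    # uphill=True  -> move toward HIGHER intensity (a deliverer climbing
    #                 the nest trace, or a tracker climbing a food trace);
    # uphill=False -> move toward LOWER intensity (a deliverer walking
    #                 down the food trace).
    marked = [loc for loc in neighbors if trace.get(loc, 0) > 0]
    if not marked:
        return None              # no trace sensed: keep exploring randomly
    return (max if uphill else min)(marked, key=lambda loc: trace[loc])
```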
Intelligent Agents
Capability to:
• Learn,
• Communicate,
• Establish complex, yet flexible organizational structures
Operate in dynamic and uncertain
environments
Robust and scalable software systems
Agent-based computation allows improved:
• Modeling,
• Design,
• Implementation.
Evolutionary Intelligent Agents
(diagram: Intelligent Agents + Evolutionary Computation combine into
Evolutionary Intelligent Agents)
Bring together the two main forces of adaptation:
• learning - occurring at the level of each agent and
at the time scale of agent life,
• evolution - taking place at the level of the population and
unfolding at the time scale
of successive generations.
EIA
EIAs are Autonomous Agents provided with a genotype that controls
their capability to carry out various tasks, i.e., their phenotype.
EIAs can adapt efficiently to their environment by using
both learning and evolution synergetically.
EIAs can address the problem of adaptation to nonstationary
environments, i.e., to real-life complex and non-predictable
environments such as today's worldwide computer networks or
user-friendly learning/teaching systems.
Current applications of the concept:
• Multiresolutional Conceptual Learning [A. Meystel, 2000],
• EIA based Information Retrieval [F.B. Pereira and E. Costa, 1999,
2000],
• EIA based Personalized Web Learning/Teaching
[A. Cristea, T. Okamoto, P. Cristea, 2000 ],
• Genetic Estimation of Competitive Agents Behavior
[A.M.Florea, 2000],
• Intelligent Signal and Image Processing [P.Cristea, 2000].
E I A CONCEPT
The behavior of an EIA is not a mere automatic response to
stimuli from the environment, but is governed by its knowledge
about the world.
World representation
An agent holds subjective, partial information about the environment,
at two levels of world representation:
• sensorial level - depicted in a sensorial map constructed with
tactile and visual inputs,
• cognitive level - depicted in a cognitive map,
based on the information in the sensorial map,
modified and enriched through
- some heuristic processing and
- the information received by communicating
with other agents in the same team.
The agent decides what actions to undertake based on the subjective
information in the cognitive map and on previous knowledge
expressed in behavior rules.
It sends the movement requests to the environment and updates its
knowledge base using the results of these requests.
Cognitive resources
(diagram: two agents, j and k, each with a sensorial reality built from
sensory input and actions on the external reality, and a cognitive
reality built by perception from the sensorial level; "linguistic"
communication links the agents' cognitive realities, while "telepathic"
communication links their sensorial realities)
Learning
Learning occurs mainly at the level of individuals that modify their
current knowledge by using the outcome of their own experience.
Learning can also have a cooperative dimension,
the agents communicating through a certain language.
The successful representation of the environment or the successful
behavioral rules can thus be shared within the population.
The decision to accept received knowledge remains with each
individual; new knowledge is appropriated only
if it fits the existing knowledge of that individual, or
if the agent rates its own current knowledge as unsatisfactory
(i.e. incomplete, uncertain or contradictory).
Evolution
Evolution occurs at the scale of the population and involves
genetic mechanisms that act over successive generations.
Both reactive and cognitive features of the agents can be
genetically controlled.
An agent’s genotype is expressed in its phenotype -- the entirety
of its capabilities. No interactions within the genome are considered;
every gene encodes a unique feature in the phenotype. Some genes
are of binary type, controlling the existence/nonexistence dichotomy
of some capabilities. Other genes quantitatively specify the value of
some parameters that determine the intensity of agent features.
The reproduction is asexual, meaning that all agents have similar
roles in reproduction. However, along with single-parent duplication,
i.e., cloning, perturbed/enriched by low probability small random
mutations, crossing-over -- a two-parent operator -- is also
considered.
The cognitive resources of an agent can be genetically transmitted,
i.e. inherited from its parent(s):
• essential data,
• basic rules,
• mappings.
The cognitive resources are continuously evolving during the life
of the agent, both by accumulation of sensory input and by
learning/refining processes at various levels.
Baldwin effect
Some of these acquired cognitive resources can also be genetically
transmitted, under certain circumstances.
J. M. Baldwin, A new factor in evolution, American Naturalist, 30, 1896, 441-451.
The sensorial and the cognitive maps of the parent(s) can be inherited
by the offspring.
Learning in a population
(figure: advance of a population in the feature space under the effect
of learning -- the initial population (1) moves toward better fitness,
becoming the trained population (2))
Evolution in a population
(figure: advance of a population in the feature space under the effect
of evolution -- the initial population (1) produces offspring (2);
selection "of the fittest" over the initial population plus offspring
yields the evolved population (3), at better fitness)
Learning & Evolution
(figure: advance of a population in the feature space under the
combined effect of learning and evolution -- initial population (1),
trained population (2), evolved population (3), at increasing fitness)
E I A MODEL
A prototype of the EIA system has been implemented
for study purposes,
to experimentally investigate the EIA concept.
The model is quite simple,
but illustrates the basic features of an EIA system.
A sensorimotor type of agent has been considered,
evolving in a two-dimensional world
and performing several simple tasks.
According to the concept, the EIAs have
not only a reactive behavior,
but also cognitive features.
Model Description 1
The system comprises one or more agent populations - teams.
Agents from different teams interact only by acting in the same
environment.
Agents from the same team also interact directly, e.g., through
message exchange, genetic interactions, etc.
All the agents move synchronously and make at most one movement
at each step.
The world is a rectangular lattice with strong boundary conditions.
Any location is considered adjacent to its eight surrounding
neighboring locations.
Model Description 2
An agent may move into a neighboring location, if accessible.
Walls and domain margins are permanently inaccessible locations.
Two agents cannot be in the same location at the same time.
If two agents attempt to occupy the same location, there is a collision
and only one of the agents succeeds, according to the agents’ push
strength.
Some of the grid nodes contain a certain amount of resources - seeds.
Each population has some special locations assigned on the grid - nests.
The task of an agent is to pick up the resources and carry them to the
nests.
This specific task is a pre-programmed objective of the agent.
Model Description 3
The fitness of an individual agent is quantified by its energy.
The agent starts with an initial energy.
There is an energy cost associated to each action and
an energy bonus at the completion of a task.
The existence, behavior and reproduction of an agent
depend on its energy:
• IF < the energy falls below a threshold >
THEN < the agent can be destroyed >
• IF < the energy rises above a threshold >
THEN < the agent can replicate and new agents are created >.
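A sketch of this energy-driven life cycle (p_destroy and p_replicate stand for the destruction/replication probabilities defined later in the lecture; the code itself is illustrative):

```python
import random

def life_cycle_step(energy, E_d, E_r, p_destroy, p_replicate):
    # IF the energy falls below the threshold E_d,
    # THEN the agent can be destroyed (with some probability)
    if energy < E_d and random.random() < p_destroy(energy):
        return "destroy"
    # IF the energy rises above the threshold E_r,
    # THEN the agent can replicate and a new agent is created
    if energy > E_r and random.random() < p_replicate(energy):
        return "replicate"
    return "live on"
```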
ONTOLOGY AND ARCHITECTURE
An agent is described by:
State attributes -- can change at every step with the
state of the agent
Permanent attributes -- specified when the agent is created and
changed only by genetic operations
State & Morphology
State attributes
Position – the location of the agent in the grid that forms the world,
Orientation – one of the eight neighboring locations,
Load – the amount of resources carried by the agent.
Permanent attributes
• Actuator attributes -- determine directly the agent action results
Speed – number of movements an agent can make in a given time interval,
Capacity – the maximum amount of resources an agent can acquire,
Push strength – determines the agent that wins in a collision
• Sensor attributes -- determine the agent’s sensorial capabilities
Visual Range -- sets the depth of the visual field
• Behavior attributes -- internal attributes of the agent, without direct influence
on the environment, not visible to the environment and other agents.
Memory Size -- limits the amount of information retained by an agent,
Weighting parameters -- for target selection from multiple
potential targets.
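Purely as an illustration, this attribute split could be expressed as two record types (the field names follow the slide; the types and default values are assumptions):

```python
from dataclasses import dataclass

@dataclass
class AgentState:
    # state attributes -- can change at every step
    position: tuple = (0, 0)   # location in the grid
    orientation: int = 0       # one of the eight neighboring directions
    load: int = 0              # amount of resources currently carried

@dataclass(frozen=True)
class PermanentAttributes:
    # fixed when the agent is created; changed only by genetic operations
    speed: int = 1             # movements per time interval
    capacity: int = 5          # max resources the agent can acquire
    push_strength: int = 1     # decides which agent wins a collision
    visual_range: int = 5      # depth of the visual field
    memory_size: int = 64      # limit on retained information
    weights: tuple = ()        # weighting parameters for target selection
```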
Visual field
(figure: the visual field for Visual Range = 5, for two different
orientations of an agent)
Fields
The locations in the grid are of four different types:
• Walls - not accessible to the agents, used to create a maze configuration
in which the agents evolve and search their targets: resources and nests.
The borders of the grid are also marked as walls.
• Nests - where the agents of a team have to deliver resources.
No agent picks up resources from a nest.
• Spaces - contain a non-negative amount of non-renewable resources.
An agent passing through a space location consumes the resources
and increases its Carried Seeds value until the amount of resources
in that location becomes zero or the agent reaches its Capacity.
• Generators - model renewable resources.
An agent entering a generator location consumes the available resources
like in a space location, but after a certain delay the amount of resources
in that location is incremented with a preset step, until a preset maximum
resource amount is reached. If the delay is set to zero the resource amount
is constant, i.e., non-exhaustible.
Architecture
The architecture of the EIA system comprises two components:
• a Server - managing the environment;
• Clients - which communicate with the server over an IP network.
Several clients can connect simultaneously to the server, modeling several agent
populations acting together in the same environment.
The clients can be different applications running different agent control algorithms,
as long as they respect the communication protocol.
The server implements the world model. It manages the environment in which the
agents are acting and controls the state of the agents.
The information about the world stored by the server is objective, complete and up-to-date.
The agents send movement requests to the server, which:
• analyses all the requests,
• estimates the possible interactions between the agents,
• determines the resulting configuration of the world.
The feedback from the server provides the agents with tactile (contact) item
identification capabilities.
The server also establishes the visual information received by each agent in
accordance with its sensorial attributes and dispatches it to the corresponding agent.
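The lecture does not specify the wire protocol; one plausible shape for the request/feedback exchange, as an assumption-laden sketch only (message fields are mine), is:

```python
import json
import socket

def send_move_request(sock: socket.socket, agent_id: int, direction: int):
    # the agent asks the server to move it toward one of the eight
    # neighboring locations; the server arbitrates all requests and
    # replies with the outcome plus tactile and visual feedback
    request = {"type": "move", "agent": agent_id, "dir": direction}
    sock.sendall((json.dumps(request) + "\n").encode())
    reply = json.loads(sock.makefile().readline())
    return reply  # hypothetical: {"ok": ..., "tactile": ..., "visual": ...}
```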
Block representation
(diagram: Architecture of an EIA system -- the Environment runs on a
server machine; several agent populations, each with its own agents,
run on separate client machines)
Functional aspects
A single client machine hosts an entire agent population (team), to facilitate the
implementation of population-level features such as establishing a certain
level of agent collaboration or implementing genetic interactions between
the agents.
The agents remain quasi-autonomous, their actions being decided at the individual
agent level, not at the population level.
The client application comprises two modules:
• one implementing the intelligent agents and
• another implementing a population manager.
An agent decides what movements to make based on
• the information in its cognitive map,
• the behavior rules.
The agent sends movement requests to the environment and updates its
knowledge base using the results of these requests.
The communication between the agents in the same team takes place by
exchanging information at the level of the cognitive map.
Viability - reproducibility
The population manager acts as a middle layer between
the agents in the population and the environment.
• Computes the energy value for the agents in the population,
rewarding or taxing them according to their actions.
• Destroys the low energy agents and replicates the high energy ones.
• Performs the evolutionary operations, implementing the genetic interaction and
applying mutations to individuals of the same population.
(figure: probability of agent destruction and replication vs. agent
energy -- the destruction probability decreases from 1 to 0 as the
energy rises toward Ed, the replication probability rises toward 1
above Er, and Einitial lies between the two thresholds)
Energy management
The client process computes the energy according to the results
received from the server.
The energy parameters have the same value for all the agents in the
team and are set by the user when initializing the client.
The current energy of an agent E is a positive value.
Each agent starts with an energy Einitial.
The energy decreases by a fixed amount Estep
for each step made by the agent.
There is an additional energy cost for a lost conflict (collision).
When the agent succeeds in delivering resources to a nest of its
team, it receives a fixed amount Ebonus
for each resource unit (seed) it delivers.
Viability - reproducibility 2
If the energy falls below a threshold Ed , the agent may be destroyed
with the probability:
Pd = 1 - E/Ed,  for E <= Ed
Pd = 0,         otherwise
If the energy is higher than another threshold Er , the agent may
replicate.
After replication, a new child agent is created with the energy Einitial.
The energy of the parent agent decreases by the same amount Einitial.
The parent can replicate again as long as its energy remains above Er .
The probability for replication has been chosen:
1  exp  aE  E r , for E  E r
Pr  
0, otherwise
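Translated directly into code (a, Ed and Er as defined above; function names are mine):

```python
import math

def p_destroy(E, E_d):
    # Pd = 1 - E/Ed  for E <= Ed, and 0 otherwise
    return 1.0 - E / E_d if E <= E_d else 0.0

def p_replicate(E, E_r, a):
    # Pr = 1 - exp(-a(E - Er))  for E >= Er, and 0 otherwise
    return 1.0 - math.exp(-a * (E - E_r)) if E >= E_r else 0.0
```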
Implementation details
The genotype is encoded in a bit string.
The genotype includes the permanent attributes specific to an agent population.
During each simulation step, there is a low probability that a mutation occurs to
an agent chosen randomly in the population.
A mutation flips randomly one of the bits of the encoded genotype.
A crossover operation can occur between two agents from the same population,
if they happen to be placed in adjacent locations.
A double-point crossover operator over all the attributes encoded in the
genotype is used.
The probabilities for crossover and mutation are user-modifiable parameters.
When an agent replicates, it creates a new agent having a copy of its genotype,
except for possible mutations.
The agent knowledge may be genetically transmitted or not: the new agent can
either start with blank maps or inherit the maps from its parent.
The user selects the desired behavior for the whole population before the
simulation begins.
Genetic transmission of acquired features leads to the Baldwin effect.
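A sketch of the two genetic operators on the bit-string genotype (function names are mine):

```python
import random

def mutate(genotype):
    # flip one randomly chosen bit of the encoded genotype
    g = genotype[:]
    i = random.randrange(len(g))
    g[i] ^= 1
    return g

def double_point_crossover(g1, g2):
    # swap the segment between two cut points, over all the attributes
    # encoded in the genotype
    i, j = sorted(random.sample(range(len(g1)), 2))
    return (g1[:i] + g2[i:j] + g1[j:],
            g2[:i] + g1[i:j] + g2[j:])
```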
Information Retrieval
IR basic stages:
• Formulating queries;
• Finding documents;
• Determining relevance
Library vs Internet IR
Traditional IR systems:
• Static and centralized collections of directly accessible
documents;
• Concerned only with Formulating queries & Determining
relevance
Finding documents on the Web:
• Millions of documents, distributed on many independent
servers;
• Dynamic nature of the environment, updating of information;
• Structured as a graph where documents are connected by
hyperlinks.
AltaVista and Yahoo use indexing databases that efficiently store the
representation of a large number of documents.
Experiment 1: Ideal Query
Experiment 2: Incomplete (weak) Query
Experiment 3: Incomplete (intermediate) Query
After Francisco Pereira and Ernesto Costa, 2000
(result figures not reproduced in the transcript)
CONCLUSIONS
The lecture presents Evolutionary Systems
and focuses on the new concept of
Evolutionary Intelligent Agents (EIA).
CONCLUSIONS 2
The EIA concept brings together features of Intelligent Agents and the
Evolutionary / Genetic Algorithms and Genetic Programming
approaches.
There are already strong enough reasons to believe that this new
idea allows addressing highly complex real-life problems - ones
involving chaotic disturbances, randomness, and complex nonlinear
dynamics - that traditional algorithms have been unable to handle.
The EIAs have the potential to use the two main forces of
adaptation: learning and evolution.
CONCLUSIONS 3
There are already several successful applications of EIA to
problems like:
• Multiresolutional Conceptual Learning,
• EIA based Web Information Retrieval,
• EIA based Personalized Web English Language Teaching,
• Intelligent Signal and Image Processing.