Intelligent Agent Technology and Application
Course overview and what is an intelligent agent
©Intelligent Agent Technology and Application, 2006, Ai Lab NJU
What is intelligent agent
Fields that inspired the agent field:
– Artificial Intelligence: agent intelligence and micro-agents
– Software Engineering: the agent as an abstracted entity
– Game Theory and Economics: agent negotiation
– Distributed Systems and Computer Networks: agent architecture, MAS, coordination
There are two kinds of definitions of agent:
– Often quite narrow
– Extremely general
General definitions
American Heritage Dictionary
– "... One that acts or has the power or authority to act ... or represent another"
Russell and Norvig
– "An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through effectors." (sketched in code after these definitions)
Pattie Maes
– "Autonomous agents are computational systems that inhabit some complex dynamic environment, sense and act autonomously in this environment, and by doing so realize a set of goals or tasks for which they are designed."
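To make the sensor/effector view concrete, here is a minimal Python sketch of the interface that definition suggests. The names Environment, Agent, get_percept, apply_action and choose_action are illustrative placeholders, not any standard API.

    # A minimal sketch of the sense-act view above.
    class Environment:
        def get_percept(self):            # what the agent's sensors observe
            raise NotImplementedError
        def apply_action(self, action):   # how an effector changes the world
            raise NotImplementedError

    class Agent:
        def choose_action(self, percept): # map a percept to an action
            raise NotImplementedError

    def run(agent, env, steps=10):
        # The sense-decide-act loop implied by the definition.
        for _ in range(steps):
            percept = env.get_percept()
            action = agent.choose_action(percept)
            env.apply_action(action)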
Agent: more specific definitions
Smith, Cypher and Spohrer
– "Let us define an agent as a persistent software entity dedicated to a specific purpose. 'Persistent' distinguishes agents from subroutines; agents have their own ideas about how to accomplish tasks, their own agendas. 'Special purpose' distinguishes them from multifunction applications; agents are typically much smaller."
Hayes-Roth
– "Intelligent agents continuously perform three functions: perception of dynamic conditions in the environment; action to affect conditions in the environment; and reasoning to interpret perceptions, solve problems, draw inferences, and determine actions."
Agent: industrial definitions
IBM
– "Intelligent agents are software entities that carry out some set of operations on behalf of a user or another program with some degree of independence or autonomy, and in doing so, employ some knowledge or representations of the user's goals or desires."
Agent: weak notions
Wooldridge and Jennings
– "An Agent is a piece of hardware or (more commonly) software-based computer system that enjoys the following properties:
Autonomy: agents operate without the direct intervention of humans or others, and have some kind of control over their actions and internal state;
Pro-activeness: agents do not simply act in response to their environment, they are able to exhibit goal-directed behavior by taking the initiative;
Reactivity: agents perceive their environment and respond in a timely fashion to changes that occur in it;
Social Ability: agents interact with other agents (and possibly humans) via some kind of agent-communication language."
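As a sketch only (none of these names come from Wooldridge and Jennings), the following skeleton shows where each of the four weak-notion properties would appear in an agent's code; an environment object with a get_percept method is assumed, as in the earlier sketch.

    # Hypothetical skeleton locating the four weak-notion properties in code.
    class WeakAgent:
        def __init__(self, goals):
            self.goals = list(goals)    # state under the agent's own control (autonomy)
            self.inbox = []             # messages from other agents (social ability)

        def step(self, environment):
            percept = environment.get_percept()   # assumed environment interface
            reaction = self.react(percept)
            if reaction is not None:              # reactivity: respond to changes first
                return reaction
            return self.pursue_goal()             # pro-activeness: take the initiative

        def react(self, percept):
            return None                           # placeholder: no reactive rules yet

        def pursue_goal(self):
            return "work-towards: " + self.goals[0]

        def receive(self, message):
            # Social ability: others may send requests, but this agent alone
            # decides how (and whether) to act on them.
            self.inbox.append(message)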
Agent: strong notions
Wooldridge and Jennings
– In addition to the weak notion:
Mobility: the ability of an agent to move around a network;
Veracity: an agent will not knowingly communicate false information;
Benevolence: agents do not have conflicting goals, and every agent will always try to do what is asked of it;
Rationality: an agent will act in order to achieve its goals and will not act in such a way as to prevent its goals being achieved.
Summary of agent definitions
An agent acts on behalf of a user or another entity.
An agent has the weak agent characteristics (autonomy, pro-activeness, reactivity, social ability).
An agent may have the strong agent characteristics (mobility, veracity, benevolence, rationality).
Dear child gets many names…
Many synonyms of the term "intelligent agent":
– Robots
– Software agents or softbots
– Knowbots
– Taskbots
– Userbots
– ……
Autonomy is the key feature of agent
Examples
– Thermostat: a control / regulator; indeed, any control system
– Software daemons: a print server, an HTTP server; most software daemons
(Figure: the basic agent loop: the agent receives sensor input from the environment and acts back on the environment through its actions.)
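A thermostat makes this minimal sense of autonomy concrete: it senses and acts without human intervention. The sketch below is illustrative; the 20-degree setpoint and the simulated readings are invented.

    # A thermostat as a (minimally autonomous) agent.
    def thermostat_step(room_temperature, setpoint=20.0):
        if room_temperature < setpoint:
            return "heating on"
        return "heating off"

    # Sense-act loop; the environment is simulated here by a list of readings.
    for temp in [18.5, 19.0, 21.2, 20.4]:
        print(temp, "->", thermostat_step(temp))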
Type of environment
An agent will not have complete control over its environment; it has at best partial control, in that it can influence it.
– (Contrast with scientific computing or MIS in traditional computing.)
Classification of environment properties [Russell 1995, p49]; a sketch of these five properties as a data record follows the list:
– Accessible vs. inaccessible
– Deterministic vs. non-deterministic
– Episodic vs. non-episodic
– Static vs. dynamic
– Discrete vs. continuous
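A small record type for these five dimensions, as a sketch only (the class is illustrative, not a standard library type). The instance shown encodes the "most complex general class" of environments mentioned later in the lecture.

    # Sketch: Russell-style environment properties as a simple record.
    from dataclasses import dataclass

    @dataclass
    class EnvironmentProperties:
        accessible: bool      # vs. inaccessible
        deterministic: bool   # vs. non-deterministic
        episodic: bool        # vs. non-episodic
        static: bool          # vs. dynamic
        discrete: bool        # vs. continuous

    # The most complex general class of environments:
    open_world = EnvironmentProperties(
        accessible=False, deterministic=False,
        episodic=False, static=False, discrete=False)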
Accessible vs. inaccessible
– An accessible environment is one in which the agent can obtain complete, accurate, up-to-date information about the environment's state (also called fully observable vs. partially observable).
– Accessible: the sensors give the complete state of the environment.
– In an accessible environment, the agent need not keep track of the world through its internal state.
Deterministic vs. non-deterministic
– A deterministic environment is one in which any action has a single guaranteed effect; there is no uncertainty about the state that will result from performing an action.
– That is, the next state of the environment is completely determined by the current state and the action selected by the agent.
– Non-deterministic: a probabilistic model may be available.
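The contrast can be made concrete with two toy transition models. This is a sketch only: the states, the action name and the probability values are made up for illustration.

    import random

    # Deterministic: the next state is the single guaranteed outcome of (state, action).
    transition = {("s0", "a"): "s1"}

    # Non-deterministic: a probabilistic model gives a distribution over successors.
    stochastic_transition = {("s0", "a"): [("s1", 0.8), ("s0", 0.2)]}

    def deterministic_step(state, action):
        return transition[(state, action)]          # exactly one successor

    def stochastic_step(state, action):
        outcomes = stochastic_transition[(state, action)]
        states, probs = zip(*outcomes)
        return random.choices(states, weights=probs, k=1)[0]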
Episodic vs. non-episodic
– In an episodic environment, the performance of an agent depends on a number of discrete episodes, with no link between the performance of the agent in different scenarios; it need not reason about the interaction between the current and future episodes (e.g. each game of chess can be treated as one episode).
– In an episodic environment, the agent does not need to remember the past and does not have to think ahead to the next episode.
Static vs. dynamic
– A static environment is one that can be assumed to remain unchanged except by the performance of actions by the agent.
– A dynamic environment is one that has other processes operating on it, and which hence changes in ways beyond the agent's control.
Discrete vs. continuous
– An environment is discrete if there are a fixed, finite number of actions and percepts in it.
Why classify environments
The type of environment largely determines the design of the agent.
Classifying environments can help guide the agent design process (like system analysis in software engineering).
The most complex general class of environments
– are inaccessible, non-deterministic, non-episodic, dynamic, and continuous.
Discuss about environment: Gripper
Gripper is a standard example of a probabilistic planning model.
– The robot has three possible actions: paint (P), dry (W) and pickup (U).
– A state has four binary features: block painted, gripper dry, holding block, gripper clean.
– Initial state:
– Goal state:
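A sketch of how the Gripper state space and actions might be encoded. The four binary features follow the slide; the effect model and the probability values are invented for illustration and are not the numbers of any standard Gripper formulation.

    from dataclasses import dataclass, replace
    import random

    @dataclass(frozen=True)
    class GripperState:
        block_painted: bool
        gripper_dry: bool
        holding_block: bool
        gripper_clean: bool

    def paint(state):
        # Assumed effect: painting usually succeeds and may dirty the gripper.
        if random.random() < 0.9:
            return replace(state, block_painted=True, gripper_clean=False)
        return state                      # the action may simply fail

    def dry(state):
        return replace(state, gripper_dry=True)

    def pickup(state):
        # Assumed effect: pickup is more reliable with a dry gripper.
        success_prob = 0.95 if state.gripper_dry else 0.5
        if random.random() < success_prob:
            return replace(state, holding_block=True)
        return state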
Intelligent agent vs. agent
An intelligent agent is one that is capable of flexible autonomous action in order to meet its design objectives, where flexibility means three things:
– Pro-activeness: the ability to exhibit goal-directed behavior by taking the initiative.
– Reactivity: the ability to perceive the environment and respond in a timely fashion to changes that occur in it.
– Social ability: the ability to interact with other agents (including humans).
Pro-activeness
– In a functional system, pre-conditions and post-conditions can be used to realize goal-directed behavior.
– But in a non-functional (dynamic) system, the goal must remain valid at least until the action completes.
– An agent blindly executing a procedure, without regard to whether the assumptions underpinning the procedure are still valid, is a poor strategy, because:
Observation may be incomplete
The environment is non-deterministic
Other agents can affect the environment
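The point about not blindly executing a procedure can be pictured as a small execution loop that re-checks its assumptions before every step. This is only a sketch; goal_still_valid, preconditions_hold, do and the plan itself are hypothetical placeholders.

    # Sketch: goal-directed execution that does not run blindly.
    def execute(plan, goal_still_valid, preconditions_hold, do):
        for step in plan:
            if not goal_still_valid():
                return "goal dropped"      # the goal no longer makes sense
            if not preconditions_hold(step):
                return "replan needed"     # assumptions behind this step failed
            do(step)                       # only now commit to the action
        return "goal achieved"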
Reactivity
– An agent must be responsive to events that occur in its environment.
– Building a system that achieves an effective balance between goal-directed and reactive behavior is hard.
Social ability
– Must negotiate and cooperate with others.
Agent vs. object
Object
– Objects are defined as computational entities that encapsulate some state, are able to perform actions, or methods, on this state, and communicate by message passing. That is, objects:
are computational entities;
encapsulate some internal state;
are able to perform actions, or methods, to change this state;
communicate by message passing.
Agent and object
Differences between agent and object
– An object can be thought of as exhibiting autonomy over its state: it has control over it. But an object does not exhibit control over its behavior.
– Other objects can invoke an object's public methods at will; an agent can only request another agent to perform an action, and that agent decides for itself whether to do so.
– "Objects do it for free, agents do it for money."
– (Can agents be implemented using object-oriented technology?)……Think about it.
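A small sketch of the "objects do it for free, agents do it for money" contrast; the printer example and all names are invented for illustration.

    # Object: any caller can invoke the public method and the action just happens.
    class PrinterObject:
        def print_document(self, doc):
            return "printed " + doc

    # Agent: callers can only *request*; the agent decides based on its own agenda.
    class PrinterAgent:
        def __init__(self):
            self.busy_with = None              # its own agenda

        def request_print(self, doc, requester):
            if self.busy_with is not None:
                return "refused: busy with " + self.busy_with
            self.busy_with = doc
            return "accepted request from " + requester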
Agent and object
– The standard object model has nothing whatsoever to say about how to build systems that integrate reactive, pro-active and social behavior.
– Each agent has its own thread of control; in the standard object model, there is a single thread of control in the system.
– (An agent is similar to an active object.)
– In summary:
Agents embody a stronger notion of autonomy than objects.
Agents are capable of flexible behavior.
A multi-agent system is inherently multi-threaded.
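A rough illustration of the "active object" idea, assuming nothing beyond Python's standard threading and queue modules: each agent owns its thread of control and a message queue, in contrast to a single-threaded object system. The agent and message names are invented.

    import threading, queue, time

    class ActiveAgent(threading.Thread):
        def __init__(self, name):
            super().__init__(daemon=True)
            self.name = name
            self.inbox = queue.Queue()        # message passing between agents
            self.running = True

        def run(self):                        # the agent's own thread of control
            while self.running:
                try:
                    msg = self.inbox.get(timeout=0.1)
                    print(self.name, "handles", msg)
                except queue.Empty:
                    pass                      # free to act pro-actively here

    a = ActiveAgent("a1")
    a.start()
    a.inbox.put("hello")
    time.sleep(0.3)
    a.running = False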
Agent and expert system
Expert system
– An expert system is one that is capable of solving problems or giving advice in some knowledge-rich domain.
The most important distinction
– An expert system is disembodied, rather than being situated in an environment.
– It does not interact directly with any environment; it gives feedback or advice to a third party.
– It is not required to interact with other agents.
Example of agents
(Figure: a peer-to-peer multi-agent system: several Agent (Peer) nodes connected to one another, each serving Mobile Customers.)
Distributed Artificial Intelligence (DAI)
DAI is a sub-field of AI.
DAI is concerned with problem solving where agents solve (sub-)tasks (the macro level).
Main areas of DAI
– Distributed problem solving (DPS): centralized control and distributed data (massively parallel processing).
– Multi-agent systems (MAS): distributed control and distributed data (coordination is crucial).
Some history
DAI is concerned with……
– Agent granularity (agent size)
– Agent heterogeneity (agent type)
– Methods of distributing control (among agents)
– Communication possibilities
– Coarse agent granularity and high-level communication (in contrast to distributed computing)
(Figure: Venn diagram relating Artificial Intelligence, Distributed Computing and Distributed AI, with Distributed Problem Solving and Multi-Agent Systems as sub-areas of DAI.)
DAI is not concerned with……
– Parallel computer architectures, parallel programming languages or distributed operating systems.
– No semaphores, monitors or threads etc.
Instead, coordination of concurrent processes is handled at the problem-solving and representational level, with the higher semantics of communication (the speech-act level).
Motivation behind MAS
To solve problems too large for a centralized agent
– E.g. web crawling
To allow interconnection and interoperation of multiple legacy systems
– E.g. financial systems
To provide a solution to inherently distributed systems
To provide a solution where expertise is distributed
To provide conceptual clarity and simplicity of design
Benefits of MAS
Faster problem solving
Decreasing communication
– Higher semantics of communication (speech-act level)
Flexibility
Increasing reliability
Heterogeneity degrees in MAS
Low
– Identical agents, different resources
Medium
– Different agent expertise
High
– Share only an interaction protocol (e.g. FIPA or KQML)
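To give a feel for what "sharing only an interaction protocol" means, here is a sketch of a KQML-style exchange. The performatives ask-one and tell and the parameter names are taken from KQML, but the agent names, the content expressions and the use of plain Python dictionaries are illustrative assumptions.

    # Sketch of a KQML-style message exchange as plain dictionaries.
    ask = {
        "performative": "ask-one",
        "sender": "buyer-agent",
        "receiver": "seller-agent",
        "reply-with": "q1",
        "language": "Prolog",          # language of the content expression
        "ontology": "bookstore",       # shared vocabulary the agents agree on
        "content": "price(isbn123, P)",
    }

    reply = {
        "performative": "tell",
        "sender": "seller-agent",
        "receiver": "buyer-agent",
        "in-reply-to": "q1",
        "content": "price(isbn123, 25)",
    }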
Cooperative and self-interested MAS
Cooperative
– Agents designed by interdependent designers
– Agents act for the increased good of the system (i.e. the MAS)
– Concerned with increasing the system's performance, not the individual agents'
Self-interested
– Agents designed by independent designers
– Agents have their own agenda and motivation
– Concerned with the benefit of each agent ('individualistic')
– The latter is more realistic in an Internet setting?
Our categories of MAS
Cooperation
– All agents have a common objective.
Competitive
– Each agent has a different objective, and the objectives are contradictory.
Semi-competitive
– Each agent has a different objective, and the objectives conflict, but the total system has one explicit (or implicit) objective.
The first is now known as TEAMWORK.
Distributed AI perspectives
(Figure: a mind map of Distributed AI perspectives, including agent theory; agent architectures (deliberative, reactive, hybrid); languages; specific approaches; group and coherent behavior; cooperation; coordination; negotiation; planning; design and analysis methods; testbeds; design tools; and applications.)
Our Thinking in MAS
Single-agent benefit vs. collective benefit
No need for central control
Social intelligence vs. single-agent intelligence
Self-organizing systems
– Self-forming, self-evolving
Intelligence is emergent, not innate
…..
Conclusions of lecture
Agents have general, weak and strong definitions.
Classification of the environment.
Differences between agents and intelligent agents, agents and objects, agents and expert systems.
Multi-agent systems address the macro-level issues of agent systems.
Coursework
1. Give other examples of agents (not necessarily
intelligent) that you know of. For each, define as
precisely as possible:
– (a). the environment that the agent occupies, the
states that this environment can be in, and the
type of environment.
– (b). The action repertoire available to the agent,
and any pre-conditions associated with these
actions;
– (c). The goal, or design objectives of the agent –
what it is intended to achieve.
Coursework
2. If a traffic light (together with its control system) is considered as an intelligent agent, which of the agent properties does it exhibit? Illustrate your answer with examples.
Coursework
3. Please determine each environment's type:

                    Chess   Poker   Minesweeper   E-shopping
    Accessible?       ?       ?         ?             ?
    Deterministic?    ?       ?         ?             ?
    Episodic?         ?       ?         ?             ?
    Static?           ?       ?         ?             ?
    Discrete?         ?       ?         ?             ?