Computational Trust and Reputation Models - LES PUC-Rio

Computational Trust and
Reputation Models
Andrew Diniz da Costa
[email protected]
Presentation Outline
• Part 1: Introduction
– Motivation
– Some definitions
• Part 2: Computational trust and reputation models
– eBay/OnSale
– SPORAS & HISTOS
– Fire Model
– Governance Framework
• Part 3: ART-Testbed
– Overview
What we are talking about ...
Advantages of trust and reputation mechanisms
• Agents can obtain data from other agents.
• Shared experience.
• Helps agents decide whom to trust.
Problems of trust and reputation mechanisms
• Not all kinds of environments are suitable for applying these
mechanisms.
• Exclusion must be a punishment.
• What is trust?
• What is reputation?
Trust
• Some statements we like:
• “Trust begins where knowledge ends: trust provides a basis for
dealing with uncertain, complex, and threatening images of the
future.” [Luhmann, 1979]
• “There are no obvious units in which trust can be
measured.” [Dasgupta, 2000]
Reputation
• Some definitions:
• “The estimation of the consistency over time of an attribute of an
entity” [Herbig et al.]
• “Information that individuals receive about the behaviour of
their partners from third parties and that they use to decide
how to behave themselves” [Buskens, Coleman...]
• “The opinion others have of us”
What is a good trust model?
• A good trust model should be [Fullam et al, 05]:
• Accurate
– provides good predictions
• Adaptive
– evolves according to the behaviour of others
• Multi-dimensional
– considers different agent characteristics
• Efficient
– computes trust in reasonable time and at reasonable cost
Why use a trust model in a MAS?
• Trust models allow:
– Identifying and isolating untrustworthy agents
– Evaluating an interaction’s utility
– Deciding whether and with whom to interact
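
A minimal sketch of the last point, deciding with whom to interact from trust values; the agent names and the 0.5 threshold are illustrative assumptions:

TRUST_THRESHOLD = 0.5  # assumed cut-off; a real agent would tune this

def choose_partners(trust_values):
    """Return the agents considered trustworthy enough to interact with,
    most trusted first. trust_values maps agent name -> trust in [-1, +1]."""
    trusted = [(agent, t) for agent, t in trust_values.items() if t >= TRUST_THRESHOLD]
    trusted.sort(key=lambda pair: pair[1], reverse=True)
    return [agent for agent, _ in trusted]

# Example: isolate the untrustworthy agent C and rank A above B.
print(choose_partners({"A": 0.9, "B": 0.6, "C": -0.4}))  # ['A', 'B']
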
Presentation Outline
• Part 1: Introduction
– Motivation
– Some definitions
• Part 2: Computational trust and reputation models
– eBay/OnSale
– SPORAS & HISTOS
– Fire Model
– Governance Framework
• Part 3: ART-Testbed
– Overview
eBay model
• Context: e-commerce
– Model oriented to support trust between buyer and seller
– The buyer has no physical access to the product of interest
– The seller or the buyer may decide not to complete the transaction
– Centralized: all information remains on eBay servers
• Buyers and sellers evaluate each other after transactions
• The evaluation is not mandatory and will never be removed
• Each eBay member has a “reputation” (feedback score) that is the
sum of the numerical evaluations, as sketched below.
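
A minimal sketch of the feedback score described above; the +1/0/-1 encoding of positive, neutral and negative evaluations is the usual eBay convention, and the example data is invented:

def feedback_score(ratings):
    """Sum of the numerical evaluations: +1 positive, 0 neutral, -1 negative."""
    return sum(ratings)

# Example: three positives, one neutral and one negative give a score of 2.
print(feedback_score([+1, +1, +1, 0, -1]))  # 2
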
SPORAS & HISTOS
• Context: e-commerce, similar to eBay
• An individual may have a very high reputation in one
domain, while she has a low reputation in another.
• Two models are proposed:
– Sporas: works even with few evaluations (ratings)
– Histos: assumes abundance of evaluations
• Ratings given by users with a high reputation are weighted
more
• Reputation values are not allowed to increase ad infinitum
SPORAS & HISTOS
• SPORAS
– Reputations are in [0, 3000]. Newcomers start at 0. Ratings are in
[0.1, 1]
– Reputations never drop below 0, even in the case of very bad
behaviour
– After each rating the reputation is updated (see the sketch below)
• HISTOS
– Aim: compute a personalized reputation value for each
member
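
A sketch of a SPORAS-style update in the spirit of [Zacharia et al., 99]; the damping function and the constants below are illustrative assumptions, not values taken from the paper:

import math

D = 3000.0      # maximum reputation value (reputations live in [0, D])
THETA = 10.0    # memory: effective number of ratings taken into account
SIGMA = 300.0   # slope of the damping function

def damping(r):
    # Slows reputation growth as r approaches the maximum D.
    return 1.0 - 1.0 / (1.0 + math.exp(-(r - D) / SIGMA))

def sporas_update(r, rater_rep, rating):
    """One update: r and rater_rep are in [0, D], rating is in [0.1, 1]."""
    expected = r / D  # rating expected from someone with reputation r
    r_new = r + (1.0 / THETA) * damping(r) * rater_rep * (rating - expected)
    return max(r_new, 0.0)  # reputations never drop below 0

# A newcomer (reputation 0) rated 0.9 by a highly reputed rater:
print(sporas_update(0.0, 2500.0, 0.9))
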
Fire Model
• Three types of reputation
– Interaction trust
– Witness reputation
– Certified reputation
* Huynh, T. D., Jennings, N. R. and Shadbolt, N. (2004) FIRE: an integrated trust and reputation model for open multi-agent
systems. In: 16th European Conference on Artificial Intelligence, 2004, Valencia, Spain.
Fire Model
• Interaction trust
– resulting from past direct interactions (see the sketch after the diagram)
– values lie in [-1, +1]
– -1 means absolutely negative
– +1 means absolutely positive
– 0 means neutral or uncertain
(Diagram: Agent A sends a request to Agent B, which provides the service; from this direct interaction, A records an interaction trust value for B covering terms such as price and quality.)
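
A minimal sketch of interaction trust as a recency-weighted average of past direct-interaction ratings; the exponential decay is an assumed simplification rather than FIRE's own rating-weight function:

def interaction_trust(ratings, decay=0.9):
    """ratings: direct-interaction ratings of a partner in [-1, +1], oldest first."""
    if not ratings:
        return 0.0  # no experience: neutral / uncertain
    weights = [decay ** (len(ratings) - 1 - i) for i in range(len(ratings))]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# Agent A's past experiences with Agent B, oldest first:
print(interaction_trust([0.2, 0.6, 0.9]))  # recent good behaviour weighs more
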
Fire Model
• Witness reputation
– reports from witnesses about an agent’s behaviour
(Diagram: Agent A sends witness requests about Agent B to Agents C, D and E; Agent D knows Agent B.)
Fire Model
• Certified reputation
– references about the agent’s behaviour, provided by other agents
and presented by the agent itself (a sketch combining the three
components follows the diagram)
(Diagram: agents exchange evaluations of one another, e.g. the evaluations of B and of D made by Agent A and the evaluations of A made by Agents B and D, with example ratings of 0.5 and -0.5; an agent can later present the evaluations it has received as certified references.)
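
A sketch of combining the three components into one trust value, as referenced above; the simple weighted mean and the weights below are assumptions for illustration (FIRE itself also weights individual ratings, e.g. by recency and reliability):

WEIGHTS = {"interaction": 2.0, "witness": 1.0, "certified": 0.5}  # assumed weights

def overall_trust(components):
    """Weighted mean of the available components, each in [-1, +1]."""
    used = {k: v for k, v in components.items() if k in WEIGHTS}
    if not used:
        return 0.0  # no information at all: neutral / uncertain
    total_weight = sum(WEIGHTS[k] for k in used)
    return sum(WEIGHTS[k] * v for k, v in used.items()) / total_weight

# Agent A's view of Agent B from the three sources:
print(overall_trust({"interaction": 0.8, "witness": 0.4, "certified": -0.5}))  # 0.5
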
Governance Framework
- GUEDES, José; SILVA, V. T.; LUCENA, Carlos José Pereira de. A Reputation Model Based on Testimonies. In: Kolp, M., Garcia, A., Ghose, A., Bresciani, P., Henderson-Sellers, B., Mouratidis, H. (Org.). Agent-Oriented Information Systems. Springer-Verlag, 2008, LNAI, p. 37-52.
- DURAN, Fernanda; SILVA, V. T.; LUCENA, Carlos José Pereira de. Using Testimonies to Enforce the Behavior of Agents. In: Sichman, J., Noriega, P., Padget, J., Ossowski, S. (Org.). Coordination, Organizations, Institutions and Norms in Agent Systems III. Springer-Verlag, 2008, LNAI, p. 218-231.
Governance Framework – Reputation System
• Three different kinds of reputations were defined:
– role reputation, norm reputation and global reputation.
• Role reputations only consider norms that were violated
while playing a specified role or lies that were told while
playing the role.
• Norm reputations focus on the violation of a norm and on
the lies told while considering a norm.
• The global reputation of an agent considers all violated
norms and all told lies.
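
A minimal sketch of how the three kinds of reputation could be derived from recorded violations and lies; the data layout and the linear penalty are illustrative assumptions, not the model defined in the cited papers:

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Infraction:
    role: str   # role the agent was playing, e.g. "seller"
    norm: str   # norm involved, e.g. "deliver-on-time"
    lie: bool   # True if the record is a lie, False if a norm violation

def reputation(infractions: List[Infraction], role: Optional[str] = None,
               norm: Optional[str] = None, penalty: float = 0.1) -> float:
    """Reputation in [0, 1]: 1 minus a penalty per relevant violation or lie.
    role=None and norm=None gives the global reputation; fixing role gives a
    role reputation; fixing norm gives a norm reputation."""
    relevant = [i for i in infractions
                if (role is None or i.role == role)
                and (norm is None or i.norm == norm)]
    return max(0.0, 1.0 - penalty * len(relevant))

history = [Infraction("seller", "deliver-on-time", lie=False),
           Infraction("buyer", "pay-on-time", lie=True)]
print(reputation(history))                      # global reputation
print(reputation(history, role="seller"))       # role reputation
print(reputation(history, norm="pay-on-time"))  # norm reputation
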
Presentation Outline
• Part 1: Introduction
– Motivation
– Some definitions
• Part 2: Computational trust and reputation models
– eBay/OnSale
– SPORAS & HISTOS
– Fire Model
– Governance Framework
• Part 3: ART-Testbed
– Overview
Domain
Reputation Transaction Protocol
Opinion Transaction Protocol
Simulator
Competition
• 17 agents (1 didn’t execute) from 13 different institutions
• Two phases
– Preliminary
– Final
• Preliminary phase (May 10-11)
– 8 agents from the different institutions
– 15 agents provided by the competition (5 “bad”, 5 “neutral”, 5 “bad”
dummies)
– 100 rounds
• Final phase (May 16-17)
– 5 best agents from the preliminary phase
– 15 agents provided by the competition (5 “bad”, 5 “neutral”, 5 “bad”
dummies)
– 200 rounds
Preliminary Phase
Final Phase
1) Electronics & Computer Science, University of Southampton
2) Department of Math & Computer Science, The University of Tulsa
3) Department of Computer Engineering, Bogazici University
4) Agents Research Lab, University of Girona
5) Pontifícia Universidade Católica do Rio de Janeiro
Conclusion
• The ART-Testbed has been useful; however:
– What is reputation?
– Unrealistic domain
• Researchers have been working on industrial domains in which to
apply trust and reputation.
• The area is growing
• Well-known researchers are working in this area.
References
• [AbdulRahman, 97] A. Abdul-Rahman. The PGP trust model. EDI-Forum: the Journal of Electronic Commerce, 10(3):27–31, 1997.
• [Barber, 83] B. Barber, The Logic and Limits of Trust, The meanings of trust: Technical competence and fiduciary responsibility, Rutgers University Press, Rutgers, NJ, United States of America, 1983, p. 7-25.
• [Carbo et al., 03] J. Carbo, J. M. Molina and J. Dávila Muro, Trust Management Through Fuzzy Reputation, International Journal of Cooperative Information Systems, 2003, vol. 12(1), p. 135-155.
• [Casare & Sichman, 05] S. J. Casare and J. S. Sichman, Towards a functional ontology of reputation, Proceedings of AAMAS'05, 2005.
• [Castelfranchi, 00] C. Castelfranchi, Engineering Social Order, Proceedings of ESAW'00, 2000.
• [Castelfranchi & Falcone, 98] C. Castelfranchi and R. Falcone, Principles of trust for MAS: Cognitive anatomy, social importance and quantification, Proceedings of ICMAS'98, p. 72-79, 1998.
• [Conte & Paolucci, 02] R. Conte and M. Paolucci, Reputation in Artificial Societies: Social Beliefs for Social Order, Kluwer Academic Publishers, G. Weiss (ed.), Dordrecht, The Netherlands, 2002.
• [Dellarocas, 00] C. Dellarocas, Immunizing online reputation reporting systems against unfair ratings and discriminatory behavior, Proceedings of the ACM Conference on Electronic Commerce (EC'00), p. 150-157, October, ACM Press, New York, NY, United States of America, 2000.
• [Dellarocas, 01] C. Dellarocas, Analyzing the economic efficiency of eBay-like online reputation reporting mechanisms, Proceedings of the ACM Conference on Electronic Commerce (EC'01), p. 171-179, October, ACM Press, New York, NY, United States of America, 2001.
• [Demolombe & Lorini, 08] R. Demolombe and E. Lorini, Trust and norms in the context of computer security: a logical formalization, Proceedings of DEON'08, LNAI, 2008.
References
• [Fullam et al, 05] K. Fullam, T. Klos, G. Muller, J. Sabater-Mir, A. Schlosser, Z. Topol, S. Barber, J. Rosenschein, L. Vercouter and M. Voss, A Specification of the Agent Reputation and Trust (ART) Testbed: Experimentation and Competition for Trust in Agent Societies, Proceedings of AAMAS'05, 2005.
• [Herzig et al, 08] A. Herzig, E. Lorini, J. F. Hubner, J. Ben-Naim, C. Castelfranchi, R. Demolombe, D. Longin and L. Vercouter, Prolegomena for a logic of trust and reputation, submitted to NorMAS'08.
• [Luhmann, 79] N. Luhmann, Trust and Power, John Wiley & Sons, 1979.
• [McKnight & Chervany, 02] D. H. McKnight and N. L. Chervany, What trust means in e-commerce customer relationships: an interdisciplinary conceptual typology, International Journal of Electronic Commerce, 2002.
• [Mui et al., 02] L. Mui, M. Mohtashemi and A. Halberstadt, Notions of Reputation in Multi-agent Systems: A Review, Proceedings of AAMAS'02, p. 280-287, C. Castelfranchi and W. L. Johnson (eds), Bologna, Italy, July, ACM Press, New York, NY, United States of America, 2002.
• [Muller & Vercouter, 05] G. Muller and L. Vercouter, Decentralized Monitoring of Agent Communication with a Reputation Model, Trusting Agents for Trusting Electronic Societies, LNCS 3577, 2005.
• [Pearl, 88] J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann, San Francisco, 1988.
• [Rehák et al., 05] M. Rehák, M. Pěchouček, P. Benda and L. Foltýn, Trust in Coalition Environment: Fuzzy Number Approach, Proceedings of the Workshop on Trust in Agent Societies at AAMAS'05, p. 132-144, C. Castelfranchi, S. Barber, J. Sabater and M. P. Singh (eds), Utrecht, The Netherlands, July 2005.
• [Sabater, 04] Evaluating the ReGreT system, Applied Artificial Intelligence, 18(9-10):797-813, 2004.
• [Sabater & Sierra, 05] Review on computational trust and reputation models, Artificial Intelligence Review, 24(1):33-60, 2005.
References
• [Sabater-Mir & Paolucci, 06] Repage: REPutation and imAGE among limited autonomous partners, JASSS - Journal of Artificial Societies and Social Simulation, 9(2), 2006.
• [Schillo & Funk, 99] M. Schillo and P. Funk, Learning from and about other agents in terms of social metaphors, Agents Learning About, From and With Other Agents, 1999.
• [Sen & Sajja, 02] S. Sen and N. Sajja, Robustness of reputation-based trust: Boolean case, Proceedings of AAMAS'02, p. 288-293, Bologna, Italy, M. Gini, T. Ishida, C. Castelfranchi and W. L. Johnson (eds), ACM Press, New York, NY, United States of America, vol. 1, 2002.
• [Shapiro, 87] S. P. Shapiro, The social control of impersonal trust, American Journal of Sociology, 1987, vol. 93, p. 623-658.
• [Steiner, 03] D. Steiner, Survey: How do Users Feel About eBay's Feedback System?, January 2003, http://www.auctionbytes.com/cab/abu/y203/m01/abu0087/s02.
• [Zacharia et al., 99] G. Zacharia, A. Moukas and P. Maes, Collaborative Reputation Mechanisms in Electronic Marketplaces, Proceedings of the Hawaii International Conference on System Sciences (HICSS-32), vol. 08, p. 8026, IEEE Computer Society, Washington, DC, United States of America, 1999.
Questions...
Andrew Diniz da Costa
[email protected]