Managing the Assured Information Sharing Lifecycle

Framework for Managing the
Assured Information Sharing Lifecycle
2008 MURI project with UMBC, Purdue, U. Texas Dallas,
U. Illinois, U. Texas San Antonio, and U. Michigan
Objectives:
• Create a new framework for assured information sharing, recognizing that sharable information has a lifecycle of production, release, advertising, discovery, acquisition and use
• Develop techniques grounded in this model to promote information sharing while maintaining appropriate security, privacy and accountability
• Evaluate, adapt and improve the AIS concepts and algorithms in relevant demonstration systems and test beds
See http://aisl.umbc.edu/ for papers and more information
February 2009
AIS Lifecycle Approach
[Figure: the information value chain. Information has a lifecycle (release, advertise, discover, acquire, use) involving a web of producers and consumers. All aspects of the lifecycle are shaped by distributed information sharing policies. Integration and access may involve negotiating policy-defined obligations, and use creates new information that may be shared. A minimal sketch of these stages follows below.]
• Design a service oriented architecture to support the assured information sharing lifecycle
• Create new policy models & languages to express and enforce AIS rules & constraints
• Develop new data mining techniques and algorithms to track provenance, increase quality and preserve privacy
• Model underlying organizational social networks to estimate trust and information novelty
• Design incentive structures to motivate sharing in organizations and coalitions
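To make the lifecycle concrete, here is a minimal Python sketch of the stages named above with a single policy hook consulted at every transition; the stage ordering is read from the value-chain figure, and the function and policy names are illustrative assumptions rather than part of the AISL design.

    # Illustrative only: lifecycle stages from the slide, with a policy hook
    # checked at every transition (stage order assumed from the figure).
    from enum import Enum, auto

    class Stage(Enum):
        PRODUCE = auto()
        RELEASE = auto()
        ADVERTISE = auto()
        DISCOVER = auto()
        ACQUIRE = auto()
        USE = auto()

    # Assumed transitions; "use creates new information that may be shared",
    # so USE loops back to PRODUCE.
    TRANSITIONS = {
        Stage.PRODUCE: {Stage.RELEASE},
        Stage.RELEASE: {Stage.ADVERTISE},
        Stage.ADVERTISE: {Stage.DISCOVER},
        Stage.DISCOVER: {Stage.ACQUIRE},
        Stage.ACQUIRE: {Stage.USE},
        Stage.USE: {Stage.PRODUCE},
    }

    def advance(current, nxt, policy_ok):
        """Move an information item to the next stage if the caller-supplied
        sharing policy permits it; policy_ok stands in for the distributed
        policy engine the slide calls for."""
        if nxt not in TRANSITIONS[current]:
            raise ValueError(f"{current.name} -> {nxt.name} is not in the lifecycle")
        if not policy_ok(current, nxt):
            raise PermissionError("sharing policy denies this transition")
        return nxt

    # Walk one item through the chain under a permissive stand-in policy.
    stage = Stage.PRODUCE
    for step in (Stage.RELEASE, Stage.ADVERTISE, Stage.DISCOVER, Stage.ACQUIRE, Stage.USE):
        stage = advance(stage, step, policy_ok=lambda a, b: True)
    print(stage)  # Stage.USE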
Selected AISL Recent Results
① Progress on models, architectures, languages and
mechanisms for trustworthiness-centric assured
information sharing (UTSA, Purdue)
② Techniques for resolving conflicting facts extracted
from different resources (UIUC)
③ Study of information sharing motivation and quality in
online forums (Michigan)
④ Modeling incentives & trust in info. sharing (UTD)
⑤ Learning statistically sound trust metrics (UTD)
⑥ Inferring access policies from logs (UMBC)
⑦ Policies for privacy in mobile information systems
(UMBC, Purdue)
Trustworthiness-centric AIS Framework
• Objective: create a trustworthiness-centric
assured information sharing framework
• Approach: design models, architectures,
languages and mechanisms to realize it
• Key challenges:
- Trustworthiness and risk management for end-user decision making
- Usage management that extends access control
- Attack management, including trustworthiness of infrastructure services
- Identity management extending the current generation
- Provenance management for managing trustworthiness of data, software, and requests
Trustworthiness-centric Assured Information Sharing Framework
[Figure: the framework's management components. Usage management (of authorized activities); attack management (of unauthorized activities); risk management; trustworthiness management; identity management (of people, organizations, and devices); provenance management (of data, software, and requests). Note: “trustworthiness … risk” in general.]
Progress on Trustworthiness-centric AIS
• Initial framework will be published as:
S. Xu, R. Sandhu & E. Bertino, Trustworthiness-centric
Assured Information Sharing, (invited paper), 3rd IFIP Int.
Conf. on Trust Management, 2009
• Design for identity & provenance mgmt underway
• Group-centric info sharing model extends the traditional dissemination model with new, intuitive metaphors: the secure meeting room and the subscription service
• Developed a family of security models for the semantics of basic group operations (join, leave, add, remove) and proved security properties about them (one variant is sketched below)
• Results published in recent conference papers
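The published models cover a whole family of join/leave/add/remove semantics; the toy Python sketch below shows just one illustrative variant (strict semantics, where a member sees only objects added after joining and loses access on leave or removal), not the semantics proved in the papers.

    # One illustrative variant only (not the published family of models):
    # "strict" semantics where a user may access an object iff the user joined
    # before the object was added and neither has since left / been removed.
    import itertools

    _clock = itertools.count()

    class Group:
        def __init__(self):
            self.members = {}   # user -> join time
            self.objects = {}   # object -> add time

        def join(self, user):   self.members[user] = next(_clock)
        def leave(self, user):  self.members.pop(user, None)
        def add(self, obj):     self.objects[obj] = next(_clock)
        def remove(self, obj):  self.objects.pop(obj, None)

        def authorized(self, user, obj):
            return (user in self.members and obj in self.objects
                    and self.members[user] < self.objects[obj])

    g = Group()
    g.join("alice")
    g.add("doc1")    # added after alice joined, so visible to alice
    g.join("bob")    # bob joined after doc1 was added, so not visible to bob
    print(g.authorized("alice", "doc1"))  # True
    print(g.authorized("bob", "doc1"))    # False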
Truth Discovery with Multiple Conflicting Information Providers [TKDE'08]
• Problem: multiple information providers may supply conflicting facts about the same object, e.g., different author names for a book. Which is the true fact?
• Heuristic Rule 1: false facts on different web sites are less likely to be the same or similar, since false facts are often introduced by random factors
• Heuristic Rule 2: a web site that provides mostly true facts for many objects will likely provide true facts for other objects (a minimal sketch of the resulting iteration follows below)
[Figure: a tripartite graph linking web sites (w1-w4) to the facts they provide (f1-f5) and the objects (o1, o2) those facts describe.]
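A minimal Python sketch of the fixpoint computation the two heuristics suggest: site trustworthiness is the average confidence of the facts it provides, and a fact's confidence combines the trustworthiness of the sites asserting it. It omits the fact-similarity handling and score dampening of the full TKDE'08 algorithm, and the sites, facts, and objects are made up for illustration.

    # Illustrative fixpoint sketch of the two heuristics; the full TKDE'08
    # algorithm additionally handles similarity between facts and dampens the
    # scores so they do not saturate. Data are made up.
    import math

    # Which facts each web site asserts (different facts about the same object conflict).
    claims = {
        "w1": {"f1"}, "w2": {"f1", "f3"}, "w3": {"f2", "f4"}, "w4": {"f4", "f5"},
    }
    facts = {f for fs in claims.values() for f in fs}

    trust = {w: 0.5 for w in claims}   # site trustworthiness, uniform start
    conf = {f: 0.5 for f in facts}     # fact confidence, uniform start

    for _ in range(5):
        # Heuristic Rule 2: a site providing mostly confident facts is trustworthy.
        trust = {w: sum(conf[f] for f in fs) / len(fs) for w, fs in claims.items()}
        # Heuristic Rule 1: independently introduced false facts rarely coincide,
        # so a fact's confidence grows with the combined support of its sites.
        for f in facts:
            supporters = [w for w, fs in claims.items() if f in fs]
            conf[f] = 1 - math.prod(1 - trust[w] for w in supporters)

    print(sorted(conf.items()))  # facts backed by more (and better) sites score higher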
Truth-Discovery: Framework Extension
• Multi-version of truth
– Truth may change with time, e.g., a player may win first but then lose
– Truth is a relative, dynamically changing judgment, e.g., Democrats vs. Republicans may have different views
• Method: Veracity-Stream
– Incremental updates with recent data in data streams
– Dynamic information network mining for veracity analysis in multiple data streams
• Current testing data sets
– Google News: a dynamic news feed that provides functions and facilities to search and browse 4,500 news sources, updated continuously
Motivation & quality in information sharing
• Analyzed Naver's Knowledge iN online Q&A forum: 2.6M questions, 4.6M answers, and interviews with 26 top answerers
• Motivations to contribute include: altruism,
learning, competition (via point system) and
as a hobby
• Users who contribute more often and less
intermittently contribute higher quality
information
• Users prefer to answer unanswered
questions and to respond to incorrect
answers
• See “Questions in, Knowledge iN? A Study of Naver's Question
Answering Community”, Nam, Ackerman, Adamic, CHI 2009
Incentives & Trust in Assured Information Sharing
• Goal: Create means of encouraging desirable
behavior within an environment which lacks or
cannot support a central governing agent
• Approach: Combining intelligence through a loose
alliance
– Bridges gaps due to sovereign boundaries
– Maximizes yield of resources
– Discovery of new information through correlation, analysis
of the ‘big picture’
– Information exchanged privately between two participants
• Drawbacks to sharing include misinformation and
freeloading
Our Model
• Players assumed to be rational
• The game of information trading
– Strategies: be truthful, lie, refuse to participate
– One game played for each possible pair of players, all games
played simultaneously in a single round; game repeated ‘infinitely’
– Players may verify the information they received with some cost
• When to verify becomes an aspect of the game
– Always verifying works poorly in light of honest equilibrium behavior, but never verifying may yield the game to lying opponents
• Add EigenTrust to the game
– A distributed trust metric where each player asks others for their opinion of a third (a minimal sketch follows below)
– Based on known perfect information
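For reference, a minimal, centralized Python sketch of the EigenTrust computation itself: normalize local trust values and power-iterate to the principal eigenvector. The actual protocol distributes this iteration across peers and mixes in pre-trusted peers, both omitted here, and the local trust values are made-up examples.

    # Centralized sketch of the EigenTrust computation; the real protocol
    # distributes this power iteration across peers and mixes in pre-trusted
    # peers, both omitted here. Local trust values are made up.
    import numpy as np

    # local_trust[i][j]: how much peer i trusts peer j from direct interactions.
    local_trust = np.array([
        [0.0, 4.0, 1.0],
        [2.0, 0.0, 3.0],
        [5.0, 1.0, 0.0],
    ])

    # Row-normalize to get C, the matrix of normalized local trust values.
    C = local_trust / local_trust.sum(axis=1, keepdims=True)

    # Global trust is the fixpoint of t = C^T t (the principal eigenvector),
    # approximated by power iteration from a uniform start.
    t = np.full(C.shape[0], 1.0 / C.shape[0])
    for _ in range(50):
        t = C.T @ t

    print(t)  # each peer's global trust score; the entries sum to 1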
Simulation Results
• We set δmin = 3, δmax = 7, CV = 2
• The lie threshold is set to 6.9
• Honest behavior wins 97% of the time when all behaviors are present
• Experiments show that without the LivingAgent behavior, honest behavior cannot flourish
“Incentive and Trust Issues in Assured Information Sharing”,
Ryan Layfield, Murat Kantarcioglu, and Bhavani Thuraisingham,
International Conference on Collaborative Computing, 2008
Learning statistically sound trust scores
• Goal: build a statistically sound trust-based scoring system for effective access control by applying credit scoring techniques
• Approach: find appropriate predictive variables by applying concepts and methodologies used in credit scoring systems; incorporate a utility function into the scoring system to set up score-related access policies (a toy scorecard is sketched below)
[Figure: Trust-Based Access Control Processes, five phases spanning the access request, the trust calculator (driven by trust policies), the access privilege decision, and interaction follow-up.]
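A toy Python sketch of a credit-scoring-style trust score with a utility-based access cutoff appears below; the predictive variables, weights, and utility values are illustrative assumptions, not the project's model.

    # Toy credit-scoring-style trust score; variable names, weights and utility
    # values are illustrative assumptions, not the project's model.
    import math

    def trust_score(past_violation_rate, successful_interactions, tenure_years):
        # Scorecard-style linear combination passed through a logistic link,
        # as in credit scoring.
        z = (-2.5 * past_violation_rate
             + 1.5 * min(successful_interactions / 100.0, 1.0)
             + 0.8 * min(tenure_years / 5.0, 1.0)
             + 0.2)
        return 1.0 / (1.0 + math.exp(-z))

    def grant(score, benefit_of_grant=1.0, cost_of_abuse=3.0):
        # Utility-based cutoff: grant when the expected utility is positive,
        # i.e. score * benefit exceeds (1 - score) * cost.
        return score * benefit_of_grant > (1.0 - score) * cost_of_abuse

    s = trust_score(past_violation_rate=0.05, successful_interactions=60, tenure_years=2)
    print(round(s, 3), grant(s))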
Inferring RBAC Policies
• Problem: a system whose access policy is known is more vulnerable to attacks and insider threats; attackers may infer likely policies from access observations, partial knowledge of subject attributes, and background knowledge
• Objective: Strengthen policies
against discovery
• Approach: Explore techniques to
propose policy theories via machine
learning such as ILP
• Results: promising initial results for simple Role-Based Access Control policies (a toy illustration of the inference threat follows below)
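The toy Python sketch below illustrates the inference threat in the crudest possible way, proposing candidate attribute-to-permission rules from an access log by simple co-occurrence counting; the project's approach uses machine learning such as ILP, and the log and attributes here are made up.

    # Toy illustration of the attacker's-eye inference the slide warns about:
    # propose candidate attribute-to-permission rules from observed accesses
    # plus partial knowledge of subject attributes. The project explores
    # machine learning such as ILP; this frequency-based stand-in and its data
    # are made up.
    from collections import defaultdict

    # Observed successful accesses: (subject, resource, action).
    log = [
        ("alice", "payroll", "read"), ("bob", "payroll", "read"),
        ("alice", "payroll", "write"), ("carol", "wiki", "read"),
    ]
    # Partial background knowledge of subject attributes (e.g., department).
    attrs = {"alice": "finance", "bob": "finance", "carol": "engineering"}

    # Count which subjects with a given attribute exercised each permission.
    support = defaultdict(set)
    for subj, res, act in log:
        support[(attrs[subj], (res, act))].add(subj)

    # Propose a rule when at least two distinct subjects sharing an attribute
    # exercised the same permission.
    for (attr, (res, act)), subjects in support.items():
        if len(subjects) >= 2:
            print(f"candidate rule: role[{attr}] may {act} {res}")
    # -> candidate rule: role[finance] may read payroll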
Privacy policies for mobile computing
• Problem: mobile devices collect and integrate sensitive private data about their users, which the users may want to share selectively with others
• Objective: Develop a policy-based system for
information sharing with an interface enabling
end users to write & adapt privacy policies
• Approach: prototype a component for iConnect on an iPhone and evaluate it in a university environment
• Example policy rules: share my exact location with my family; share my current activity with my close friends, … (a minimal representation is sketched below)
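A minimal Python sketch of how such rules might be represented and checked is shown below; the rule structure, group membership, and names are illustrative assumptions, with only the two example rules taken from the slide.

    # Minimal representation and check for the slide's example rules; the rule
    # structure, group membership and names are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Rule:
        data_type: str       # e.g. "exact_location", "current_activity"
        allowed_group: str   # e.g. "family", "close_friends"

    POLICY = [
        Rule("exact_location", "family"),
        Rule("current_activity", "close_friends"),
    ]

    GROUPS = {
        "family": {"mom", "dad"},
        "close_friends": {"jo", "sam"},
    }

    def may_share(data_type, requester):
        """True if some rule allows sharing this data type with the requester."""
        return any(data_type == r.data_type and requester in GROUPS[r.allowed_group]
                   for r in POLICY)

    print(may_share("exact_location", "mom"))   # True
    print(may_share("exact_location", "sam"))   # False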