Privacy and Anonymity
CS 236
Advanced Computer Security
Peter Reiher
May 6, 2008
Groups for This Week
1. Golita Benoodi, Darrell Carbajal, Sean MacIntyre
2. Andrew Castner, Chien-Chia Chen, Chia-Wei Chang
3. Hootan Nikbakht, Ionnis Pefkianakis, Peter Peterson
4. Yu Yuan Chen, Michael Cohen, Zhen Huang
5. Jih-Chung Fan, Vishwa Goudar, Michael Hall
6. Abishek Jain, Nikolay Laptev, Chen-Kuei Lee
7. Chieh-Ning Lien, Jason Liu, Kuo-Yen Lo
8. Min-Hsieh Tsai, Peter Wu, Faraz Zahabian
Outline
• Privacy vs. security?
• Data privacy issues
• Network privacy issues
• Some privacy solutions
What Is Privacy?
• The ability to keep certain information secret
• Usually one’s own information
• But also information that is “in your custody”
• Includes ongoing information about what you’re doing
Privacy and Computers
• Much sensitive information currently kept on computers
  – Which are increasingly networked
• Often stored in large databases
  – Huge repositories of privacy time bombs
• We don’t know where our information is
Privacy and Our Network Operations
• Lots of stuff goes on over the Internet
– Banking and other commerce
– Health care
– Romance and sex
– Family issues
– Personal identity information
• We used to regard this stuff as private
– Is it private any more?
Threats to Computer Privacy
• Cleartext transmission of data
• Poor security allows remote users to access our data
• Sites we visit can save information on us
– Multiple sites can combine information
• Governmental snooping
• Location privacy
• Insider threats in various places
Privacy Vs. Security
• Best friends?
• Or deadly rivals?
• Does good security kill all privacy?
• Does good privacy invalidate security?
• Are they orthogonal?
• Or can they just get along?
Conflicts Between Privacy and Security
• Many security issues are based on strong authentication
  – Which means being very sure who someone is
• If we’re very sure who’s doing what, we can track everything
• Many privacy approaches based on obscuring what you’re doing
Some Specific Privacy Problems
• Poorly secured databases that are remotely accessible
  – Or are stored on hackable computers
• Data mining by companies we interact with
• Eavesdropping on network communications by governments
• Insiders improperly accessing information
• Cell phone/mobile computer-based location tracking
Data Privacy Issues
• My data is stored somewhere
  – Can I control who can use it/see it?
• Can I even know who’s got it?
• How do I protect a set of private data?
  – While still allowing some use?
• Will data mining divulge data “through the back door”?
Personal Data
• Who owns data about you?
• What if it’s real personal data?
  – Social Security number, DoB, your DNA record?
• What if it’s data someone gathered about you?
  – Your Google history or shopping records
  – Does it matter how they got it?
Controlling Personal Data
• Currently impossible
• Once someone has your data, they do what they want with it
• Is there some way to change that?
• E.g., to wrap data in a way that prevents its misuse?
• Could TPM help?
Tracking Your Data
• How could I find out who knows my passport number?
• Can I do anything that prevents my data from going certain places?
  – With cooperation of those who have it?
  – Without their cooperation?
  – Via legal means?
Protecting Data Sets
• If my company (legitimately) has a bunch of personal data,
• What can I/should I do to protect it?
  – Given that I probably also need to use it?
• If I fail, how do I know that?
  – And what remedies do I have?
Options for Protecting Data
• Careful system design
• Limited access to the database
– Networked or otherwise
• Full logging and careful auditing
• Using only encrypted data
– Must it be decrypted?
  – If so, how to protect the data and the keys?
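As a sketch of the last option, records can be kept encrypted at rest and decrypted only at the moment of use. This minimal Python example uses the cryptography package’s Fernet recipe; the record contents are made up, and the hard problem the slide raises, protecting the key itself, is deliberately left open.

    # Minimal sketch: store only ciphertext; decrypt just long enough to
    # use a record. Assumes the "cryptography" package; fields are made up.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # protecting THIS key is the open problem
    vault = Fernet(key)

    stored = vault.encrypt(b"name=J. Doe; diagnosis=...")  # encrypt before storage
    record = vault.decrypt(stored)                         # decrypt only on use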
Data Mining and Privacy
• Data mining allows users to extract models from databases
  – Based on aggregated information
• Often data mining is allowed when direct extraction isn’t
• Unless handled carefully, attackers can use mining to deduce record values
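To see how aggregates can leak a record, here is a toy differencing attack in Python (the table and names are invented): two individually harmless SUM queries differ by exactly one person, so subtracting them reveals that person’s value.

    # Two aggregate queries that differ by one record leak that record.
    salaries = {"alice": 52_000, "bob": 61_000, "carol": 58_500}  # hypothetical

    total_all = sum(salaries.values())                          # SUM over everyone
    total_minus_bob = sum(v for k, v in salaries.items() if k != "bob")

    print(total_all - total_minus_bob)   # 61000 -- Bob's exact salary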
Insider Threats and Privacy
• Often insiders need access to private data
  – Under some circumstances
• But they might abuse that access
• How can we determine when they misbehave?
• What can we do?
Network Privacy
• Mostly issues of preserving privacy of data flowing through the network
• Start with encryption
  – With good encryption, data values not readable
• So what’s the problem?
Traffic Analysis Problems
• Sometimes desirable to hide that you’re talking to someone else
• That can be deduced even if the data itself cannot
• How can you hide that?
– In the Internet of today?
Location Privacy
• Mobile devices often communicate while on the move
• Often providing information about their location
  – Perhaps detailed information
  – Maybe just hints
• This can be used to track our movements
Implications of Location Privacy Problems
• Anyone with access to location data can know where we go
• Allowing government surveillance
• Or a private detective following your moves
• Or a maniac stalker figuring out where to ambush you . . .
Some Privacy Solutions
• The Scott McNealy solution
– “Get over it.”
• Anonymizers
• Onion routing
• Privacy-preserving data mining
• Preserving location privacy
• Handling insider threats via optimistic security
Anonymizers
• Network sites that accept requests of various kinds from outsiders
• Then submit those requests
  – Under their own or fake identity
• Responses returned to the original requestor
• A NAT box is a poor man’s anonymizer
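The forwarding idea fits in a few lines of Python. This toy sketch (standard library only) fetches a URL on a client’s behalf, so the destination sees the anonymizer’s address rather than the client’s; a real anonymizer would also scrub identifying headers, cookies, and the like.

    # Toy anonymizer: the destination server sees this host's IP,
    # not the original client's. Header/cookie scrubbing omitted.
    from urllib.request import urlopen

    def fetch_for_client(url: str) -> bytes:
        with urlopen(url) as resp:       # outbound request uses our identity
            return resp.read()           # relay the response to the client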
The Problem With Anonymizers
• The entity running the anonymizer knows who’s who
• Either can use that information himself
• Or can be fooled/compelled/hacked to divulge it to others
• Generally not a reliable source of real anonymity
Onion Routing
• Meant to handle the issue of people knowing who you’re talking to
• Basic idea is to conceal sources and destinations
• By sending lots of crypto-protected packets between lots of places
• Each packet goes through multiple hops
A Little More Detail
• A group of nodes agree to be onion routers
• Users obtain crypto keys for those nodes
• Plan is that many users send many packets through the onion routers
  – Concealing who’s really talking
Sending an Onion-Routed Packet
• Encrypt the packet using the destination’s key
• Wrap that with another packet to another router
  – Encrypted with that router’s key
• Iterate a bunch of times
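A minimal sketch of that layering in Python, assuming the sender shares a symmetric Fernet key (from the cryptography package) with every hop; real onion routing uses public-key cryptography and per-hop routing headers, which are omitted here.

    # Build an onion: innermost layer for the last hop (the destination),
    # outermost layer for the first router on the path.
    from cryptography.fernet import Fernet

    def wrap_onion(message: bytes, path_keys: list[bytes]) -> bytes:
        onion = message
        for key in reversed(path_keys):      # destination's layer goes on first
            onion = Fernet(key).encrypt(onion)
        return onion

    def peel_layer(onion: bytes, key: bytes) -> bytes:
        return Fernet(key).decrypt(onion)    # each router strips one layer

    keys = [Fernet.generate_key() for _ in range(3)]   # three-hop path
    packet = wrap_onion(b"hello", keys)
    for k in keys:                           # routers peel in path order
        packet = peel_layer(packet, k)
    assert packet == b"hello"                # only the last hop sees plaintext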
In Diagram Form
[Diagram: a packet travels from the Source through several onion routers to the Destination]
What’s Really in the Packet
[Diagram: the packet as nested encrypted layers, one per router, with the message innermost]
Delivering the Message
[Diagram: each onion router peels one layer and forwards the packet until it reaches the destination]
What’s Been Achieved?
• No improper party read the message
• Nobody knows who sent the message
  – Except the receiver
• Nobody knows who received the message
  – Except the sender
• Assuming you got it all right
Issues for Onion Routing
• Proper use of keys
• Traffic analysis
• Overheads
– Multiple hops
– Multiple encryptions
Privacy-Preserving Data Mining
• Allow users access to aggregate statistics
• But don’t allow them to deduce individual statistics
• How to stop that?
Approaches to Privacy for Data Mining
• Perturbation
  – Add noise to sensitive value
• Blocking
  – Don’t let aggregate query see sensitive value
• Sampling
  – Randomly sample only part of data
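Here is a minimal sketch of the perturbation idea in Python: answer aggregate queries with random noise added, so individual values can’t be pinned down. Laplace noise, as used in differential privacy, is one common choice; the epsilon and sensitivity parameters below are illustrative, not from the slides.

    # Perturbation sketch: answer SUM queries with Laplace noise added.
    import random

    def noisy_sum(values, epsilon=0.5, sensitivity=1.0):
        scale = sensitivity / epsilon
        # Difference of two exponentials yields a Laplace(0, scale) sample
        # (the stdlib random module has no laplace() of its own).
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return sum(values) + noise

    salaries = [52_000, 61_000, 58_500, 70_250]          # hypothetical data
    print(noisy_sum(salaries, epsilon=0.5, sensitivity=100_000))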
Preserving Location Privacy
• Can we prevent people from knowing where we are?
• Given that we carry mobile communications devices
• And that we might want location-specific services ourselves
Location-Tracking Services
• Services that get reports on our mobile device’s position
  – Probably sent from that device
• Often useful
  – But sometimes we don’t want them turned on
• So, turn them off then
But . . .
• What if we turn it off just before entering a “sensitive area”?
• And turn it back on right after we leave?
• Might someone deduce that we spent the time in that area?
• Very probably
Handling Location Inferencing
• Need to obscure that a user probably entered a particular area
• Can reduce update rate
  – Reducing certainty of travel
• Or bundle together areas
  – Increasing uncertainty of which was entered
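One simple way to bundle areas is spatial cloaking: snap every position report to the corner of a coarse grid cell, so all locations inside the cell are indistinguishable. A minimal Python sketch, with an arbitrary illustrative cell size:

    # Cloak a position by rounding it down to its grid cell's corner.
    def cloak(lat: float, lon: float, cell_deg: float = 0.05):
        # ~0.05 degrees is roughly 5 km of latitude; purely illustrative.
        return (lat - lat % cell_deg, lon - lon % cell_deg)

    print(cloak(34.0689, -118.4452))   # UCLA -> approximately (34.05, -118.45)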
Privacy in the Face of Insider Threats
• A real problem
• Recent UCLA medical center case of a worker who peeked at celebrity records
  – They had the right to look at records
  – But only for the appropriate reasons
• Generally, how do we preserve privacy when an insider needs some access?
One Approach
• A better access control model
• Don’t give insider full access
• Only allow access when he really needs it
• But how do you tell?
• Could use optimistic access control
Optimistic Access Control
• Don’t normally allow access to protected stuff
  – Even if the worker sometimes needs it
• On need, require the worker to request exceptional action
  – And keep a record that it happened
• Audit log of exceptional actions later
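A minimal sketch of the mechanism in Python: ordinary reads are refused, but a read that states an exceptional reason always succeeds and is always logged for later audit. The record and user names are hypothetical.

    # Optimistic access control: deny by default, grant on a stated
    # exceptional need, and log every grant for later auditing.
    import datetime

    AUDIT_LOG = []

    def read_record(user, record_id, records, exceptional_reason=None):
        if exceptional_reason is None:
            raise PermissionError("no ordinary access; state an exceptional need")
        AUDIT_LOG.append({
            "when": datetime.datetime.now(datetime.timezone.utc),
            "who": user,
            "record": record_id,
            "reason": exceptional_reason,    # an auditor judges this later
        })
        return records[record_id]

    records = {"patient_Y": "chart data..."}
    print(read_record("nurse_X", "patient_Y", records,
                      exceptional_reason="treating patient Y"))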
A Sample Scenario
• Nurse X needs to access medical records of patient Y
• Nurse X doesn’t have ordinary access
• She requests exceptional access
• Which is granted
– And a record is made of her request
Checking the Scenario
• An auditor (human or otherwise) examines all requests for sensitive access
• If “not reasonable,” reports to an authority
• In this case, nurse X acted properly
– Since patient Y was being treated
How About Misbehavior?
• Nurse X requests medical information on celebrity Z
• The request is logged
• The authority finds it bogus
– Good question: How?
• Nurse X’s ass is fired
How Does This Help?
• Every access to a sensitive record is logged
  – Well, it could have been, anyway
• Worker is alerted to the sensitive nature of his actions
• There’s a built-in mechanism for validating access
Will It Be Too Cumbersome?
• Depends
• On frequency of requests
• And auditor’s ability to identify illegitimate accesses
• Also requires reasonable remediation method
• And strong authentication