Introduction to Biometrics
Dr. Bhavani Thuraisingham
The University of Texas at Dallas
Lecture #19
Biometrics and Privacy - I
October 31, 2005
Outline
 Overview of Privacy
 Biometrics and Privacy
Some Privacy Concerns
 Medical and Healthcare
- Employers, marketers, or others knowing of private
medical concerns
 Security
- Allowing access to an individual's travel and spending data
- Allowing access to web surfing behavior
 Marketing, Sales, and Finance
- Allowing access to an individual's purchases
 Biometrics
- Biometric technologies used to violate privacy
Data Mining as a Threat to Privacy
 Data mining gives us “facts” that are not obvious to human analysts
of the data
 Can general trends across individuals be determined without
revealing information about individuals?
 Possible threats:
- Combine collections of data and infer information that is private
(see the sketch below)
 Disease information inferred from prescription data
 Military action inferred from pizza deliveries to the Pentagon
 Need to protect the associations and correlations between the data
that are sensitive or private
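
To make the linking threat concrete, the sketch below (all records and
field names are hypothetical) joins an "anonymized" prescription release
with a public record list on shared quasi-identifiers, re-identifying
individuals and their diseases:

    # Minimal sketch of the linking threat; all data and field names
    # are hypothetical.

    prescriptions = [  # "anonymized" release: names removed, quasi-identifiers kept
        {"zip": "75080", "birth_year": 1970, "drug": "tamoxifen"},
        {"zip": "75252", "birth_year": 1985, "drug": "insulin"},
    ]

    public_records = [  # publicly available list that includes names
        {"name": "Alice", "zip": "75080", "birth_year": 1970},
        {"name": "Bob", "zip": "75252", "birth_year": 1985},
    ]

    # Combine the collections: join on (zip, birth_year) to re-identify patients
    for rx in prescriptions:
        for person in public_records:
            if (person["zip"], person["birth_year"]) == (rx["zip"], rx["birth_year"]):
                print(person["name"], "is likely taking", rx["drug"])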
Some Privacy Problems and Potential Solutions
 Problem: Privacy violations that result due to data mining
- Potential solution: Privacy-preserving data mining
 Problem: Privacy violations that result due to inference
- Inference is the process of deducing sensitive information from
the legitimate responses received to user queries
- Potential solution: Privacy constraint processing (see the
sketch after this list)
 Problem: Privacy violations due to un-encrypted data
- Potential solution: Encryption at different levels
 Problem: Privacy violation due to poor system design
- Potential solution: Develop a methodology for designing
privacy-enhanced systems
 Problem: Privacy violation due to biometric systems
- Potential solution: Privacy-sympathetic biometrics
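
A minimal sketch of the privacy constraint processing named above (the
constraint format and schema are illustrative assumptions, not a
prescribed design): a release routine suppresses any attribute a
constraint marks private.

    # Minimal sketch of privacy constraint processing (hypothetical
    # constraint format and schema).

    # Each constraint marks an attribute as private for a given relation.
    constraints = {("patients", "disease"), ("patients", "ssn")}

    def release(relation, rows):
        """Apply privacy constraints during the release operation:
        drop any attribute marked private for this relation."""
        private = {attr for rel, attr in constraints if rel == relation}
        return [{k: v for k, v in row.items() if k not in private} for row in rows]

    rows = [{"name": "John", "disease": "influenza", "ssn": "123-45-6789"}]
    print(release("patients", rows))  # -> [{'name': 'John'}]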
Privacy Preserving Data Mining
 Prevent useful results from mining
- Introduce “cover stories” to give “false” results
- Only make a sample of data available so that an adversary is
unable to come up with useful rules and predictive functions
 Randomization (see the sketch after this list)
- Introduce random values into the data and/or results
- Challenge is to introduce random values without significantly
affecting the data mining results
- Give range of values for results instead of exact values
 Secure Multi-party Computation
- Each party knows its own inputs; encryption techniques used to
compute final results
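
A minimal sketch of the randomization approach, assuming zero-mean
Gaussian noise with an illustrative scale: individual values are
perturbed before release, yet the aggregate mean can still be estimated.

    import random

    # Minimal sketch of randomization for privacy-preserving data mining.
    # The noise distribution and scale are illustrative choices.

    true_values = [52, 47, 61, 38, 55, 49]  # e.g., ages (hypothetical data)

    # Each individual adds zero-mean noise before releasing a value,
    # so no released value reveals the true one.
    released = [v + random.gauss(0, 10) for v in true_values]

    # The miner can still estimate aggregates: the noise averages out.
    estimate = sum(released) / len(released)
    actual = sum(true_values) / len(true_values)
    print(f"estimated mean = {estimate:.1f}, actual mean = {actual:.1f}")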
Privacy Controller
 [Architecture diagram] The privacy controller places a layer of
constraint enforcement around the DBMS:
- User Interface Manager
- Constraint Manager, which manages the privacy constraints
- Query Processor: applies constraints during query and release
operations
- Update Processor: applies constraints during the update operation
- Database Design Tool: applies constraints during the database
design operation
- DBMS and the underlying Database
Semantic Model for Privacy Control
 [Semantic network diagram; dark lines/boxes contain private
information]
- Patient John --Has disease--> Cancer, Influenza
- Patient John --address--> John's address
- Patient John --Travels frequently--> England
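
A minimal sketch of such a semantic model (edge labels and privacy
flags are illustrative): associations are stored as labeled edges, some
flagged private, and only the non-private ones are released.

    # Minimal sketch of a semantic privacy-control model (illustrative
    # edge labels and privacy flags; not the lecture's exact diagram).

    # (subject, relation, object, is_private)
    edges = [
        ("Patient John", "has disease", "Cancer", True),
        ("Patient John", "has disease", "Influenza", True),
        ("Patient John", "address", "John's address", True),
        ("Patient John", "travels frequently", "England", False),
    ]

    def public_view(graph):
        """Release only associations not marked private (dark in the diagram)."""
        return [(s, r, o) for s, r, o, private in graph if not private]

    print(public_view(edges))  # only the non-private association(s)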
Platform for Privacy Preferences (P3P):
What is it?
 P3P is an emerging industry standard that enables
web sites to express their privacy practices in a
standard format
 The format of the policies can be automatically
retrieved and understood by user agents
 It is a product of the W3C (World Wide Web Consortium),
www.w3c.org
 Main difference between privacy and security:
- User is informed of the privacy policies
- User is not informed of the security policies
Platform for Privacy Preferences (P3P):
Key Points
 When a user enters a web site, the privacy policies
of the web site are conveyed to the user
 If the privacy policies are different from user
preferences, the user is notified
 User can then decide how to proceed
 User/Client maintains the privacy controller
- That is, the privacy controller determines whether
an untrusted web site can give out public
information to a third party from which the third
party could infer private information
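
A minimal sketch of this user-agent check (the dict-based policy and
preference format is a simplified stand-in for real P3P XML documents):

    # Minimal sketch of a P3P-style user agent check. The dict-based
    # policy/preference format is a simplified stand-in for P3P XML.

    site_policy = {          # conveyed by the web site on entry
        "name": "not-shared",
        "purchases": "shared-with-third-party",
    }

    user_preferences = {     # maintained by the user/client
        "name": "not-shared",
        "purchases": "not-shared",
    }

    # Notify the user of every practice that differs from their preferences.
    conflicts = {k: v for k, v in site_policy.items()
                 if user_preferences.get(k) != v}
    if conflicts:
        print("Site policy differs from your preferences:", conflicts)
        # ...the user can then decide how to proceed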
Platform for Privacy Preferences (P3P):
Organizations
 Several major corporations are working on P3P
standards including:
- Microsoft
- IBM
- HP
- NEC
- Nokia
- NCR
 Web sites have also implemented P3P
 The Semantic Web group has adopted P3P
Platform for Privacy Preferences (P3P):
Specifications
 Initial version of P3P used RDF to specify policies
 Recent version has migrated to XML
 P3P Policies use XML with namespaces for
encoding policies
 Example: Catalog shopping
- Your name will not be given to a third party, but
your purchases will be given to a third party
<POLICIES xmlns="http://www.w3.org/2002/01/P3Pv1">
<POLICY name="..."> ... </POLICY>
</POLICIES>
-
Platform for Privacy Preferences (P3P):
Specifications (Concluded)
 P3P has its own statements and data types
expressed in XML
 P3P schemas utilize XML schemas
 XML is a prerequisite to understanding P3P
 The P3P specification released in January 2005 uses
a catalog shopping example to explain concepts
 P3P is an international standard and an ongoing
project
P3P and Legal Issues
 P3P does not replace laws
 P3P works together with the law
 What happens if web sites do not honor their
P3P policies?
- Then appropriate legal actions will have to be
taken
 XML is the technology to specify P3P policies
 Policy experts will have to specify the policies
 Technologists will have to develop the
specifications
 Legal experts will have to take actions if the
policies are violated
Challenges and Discussion
 Technology alone is not sufficient for privacy
 We need technologists, policy experts, legal experts,
and social scientists to work on privacy
 Some well-known people have said "Forget about
privacy"
 Should we pursue working on Privacy?
- Interesting research problems
- Interdisciplinary research
- Something is better than nothing
- Try to prevent privacy violations
- If violations occur then prosecute
 Privacy is a major concern for Biometrics
Biometrics and Privacy
 How are Biometrics and Privacy Related?
 What are the major privacy concerns associated with Biometrics
Usage?
 What types of biometric deployments require stronger protections
against privacy invasiveness?
 What biometric technologies are more susceptible to privacy-
invasive usage?
 What types of protections are necessary to ensure that biometrics
are not used in a privacy-invasive fashion?
Relationship: Biometrics and Privacy
 Biometrics technology can be used without individual knowledge or
consent to link personal information from various sources, creating
individual profiles
 These profiles may be used for privacy invasive purposes such as
tracking movement
 Biometric systems capable of being used in a privacy-
compromising way are called privacy-invasive systems
 Privacy-neutral means that the technology can neither be used to
protect information nor to undermine privacy
 Privacy-sympathetic deployments include special designs to ensure
that biometric data cannot be used in a privacy-invasive fashion
 Privacy protection is about using biometric authentication to protect
other personal information (e.g., bank accounts)
HIPAA and Biometrics
 HIPAA (the Health Insurance Portability and Accountability Act)
refers to biometrics
 Biometric data could be a potential identifier and as a result cause
privacy concerns; it must be disassociated from medical
information
 Biometrics can be used for authentication and ensuring security
 HIPAA and P3P relationships
- Implementing HIPAA rules in P3P
Privacy Concerns Associated with Biometric
Deployments
 Informational privacy
- Unauthorized collection, storage, and usage of biometric
information
 Personal Privacy
- Discomfort of people when encountering biometrics technology
 Privacy sympathetic qualities of biometrics technology
- E.g., not storing raw data
Informational Privacy
 Usage of biometric data is not usually the problem; the potential
linkage, aggregation, and misuse of personal information
associated with biometric data is the problem
 Unauthorized use of biometric technology
- Conducting criminal forensic searches on driver's license
databases
- Using biometric data as a unique identifier
- Is biometric data personal information? A debate in the
industry
 Unauthorized collection of biometric data
- E.g., Surveillance
 Unnecessary collection of biometric data
 Unauthorized disclosure
- Sharing biometric data
Personal Privacy
 Many biometric technologies are offensive to certain individuals,
especially when they are first introduced
- Smartcards, Surveillance
 Unlike informational privacy, technology in general cannot help with
personal privacy
 Need psychologists and social scientists to work with individuals to
ensure comfort
 Legal procedures also should be in place in case privacy is violated
so that individuals are comfortable with the technology
 “Please excuse us for intruding on your privacy”
Privacy Sympathetic Qualities of Biometric
Systems
 Most biometric systems (except forensic systems) do not store raw
data such as fingerprints or images
 Biometric data is stored in templates; templates consist of numbers;
raw biometric data cannot be reconstructed from templates
 The idea of a universal biometric identifier does not work, as different
applications require different biometric technologies
 Different enrollments such as different samples also enhance
privacy
 Non-interoperable biometric technologies also help with privacy;
however, it is difficult for different systems to interact without standards
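
A minimal sketch of template-based matching (the feature extraction
and threshold are illustrative placeholders): only a small numeric
vector is stored, and verification compares template distances, so the
raw sample need never be retained.

    import math

    # Minimal sketch of template-based biometric matching. The feature
    # extraction and threshold below are illustrative placeholders.

    def extract_template(raw_sample):
        """Stand-in for real feature extraction: reduce a raw sample to
        a few numbers. The raw sample itself is then discarded."""
        return [sum(raw_sample) / len(raw_sample), max(raw_sample), min(raw_sample)]

    def matches(template_a, template_b, threshold=5.0):
        """Verification: compare stored template to a fresh one by distance."""
        return math.dist(template_a, template_b) < threshold

    enrolled = extract_template([12, 40, 7, 33, 25])   # stored at enrollment
    probe = extract_template([13, 39, 8, 31, 26])      # fresh capture
    print(matches(enrolled, probe))  # True: similar features, same person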
Application Specific Privacy Risks
 Each deployment should address privacy concerns; the risk also
depends on the technology used and how it is used, what steps are
taken, and what the consequences of privacy violations are
 BioPrivacy framework was developed in 2001 to help deployers
come up with risk ratings for their deployments
 Risk ratings depend on several factors such as verification vs.
identification
BioPrivacy Framework
 Overt vs. Covert
- Deployments where users are aware that biometric data is being
collected carry less risk
 Opt-in vs. Mandatory
- Mandatory enrollment such as a public sector program has
higher risk
 Verification vs. Identification
- Searching a database to match a biometric (i.e., identification)
has higher risk, as an individual's biometric data may be collected
 Fixed duration vs. Indefinite duration
- Indefinite duration has a negative impact (higher risk)
 Public sector vs. Private Sector
- Public sector deployments are more risky
BioPrivacy Framework (Concluded)
 User Role
- Citizen, Employee, Traveler, Student, Customer, Individual
- E.g., a citizen may face more penalties for noncompliance
 User ownership vs. Institutional ownership
- User maintaining ownership of his/her biometric data is less
risky
 Personal storage vs. Storage in a template database
- Is the data stored in a central database or on a user's PC?
- A central database is more risky
 Behavioral vs. Physiological storage
- Physiological biometrics may be compromised more easily
 Template storage vs. Identifiable Storage
- Template storage is less risky
Risk Ratings
 For each biometric technology, rate risk with respect to the
BioPrivacy framework
 Example: Overt/Covert risk is
- Moderate for finger scan
- High for face scan
- Low for iris scan
- Low for retina scan
- High for voice scan
- Low for signature scan
- Moderate for keystroke scan
- Low for hand scan
 Based on the individual risk ratings, compute an overall risk rating:
for example, High for face scan, Moderate for iris scan, and Low for
hand scan
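
A minimal sketch of aggregating per-factor ratings into an overall
rating (the numeric mapping and averaging rule are illustrative
assumptions, not the BioPrivacy framework's actual formula):

    # Minimal sketch of aggregating BioPrivacy-style risk ratings.
    # The numeric mapping and simple averaging are illustrative; the
    # framework itself does not prescribe this exact formula.

    LEVELS = {"Low": 1, "Moderate": 2, "High": 3}
    NAMES = {1: "Low", 2: "Moderate", 3: "High"}

    def overall_rating(factor_ratings):
        """Average the per-factor ratings and round to the nearest level."""
        score = sum(LEVELS[r] for r in factor_ratings.values()) / len(factor_ratings)
        return NAMES[round(score)]

    face_scan = {  # hypothetical per-factor ratings for a deployment
        "overt_vs_covert": "High",
        "optin_vs_mandatory": "High",
        "verification_vs_identification": "High",
        "storage": "Moderate",
    }
    print(overall_rating(face_scan))  # -> High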
Biometrics for Private Data Sharing?
 [Federation architecture diagram] Component Data/Policy for
Agency A, Agency B, and Agency C each export their data/policy
into the shared Data/Policy for the Federation
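
A minimal sketch of this federated export (agencies, fields, and
policies are hypothetical): each component exports to the federation
only the data its local policy permits.

    # Minimal sketch of policy-governed export in a data-sharing
    # federation. Agencies, fields, and policies are hypothetical.

    agencies = {
        "Agency A": {"data": {"name": "John", "template": [0.1, 0.9]},
                     "exportable": {"template"}},
        "Agency B": {"data": {"name": "John", "travel": "England"},
                     "exportable": {"travel"}},
    }

    def export_to_federation(agency):
        """Each component exports only what its local policy permits."""
        data, policy = agency["data"], agency["exportable"]
        return {k: v for k, v in data.items() if k in policy}

    federation = {name: export_to_federation(a) for name, a in agencies.items()}
    print(federation)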