LinkedOn:
Concerns About Data Linkability
Mireille Hildebrandt
http://works.bepress.com/mireille_hildebrandt/
March 31, 2016
Agenda:
1. Transnational data flows
2. The computational turn: from data to ‘knowledge’
3. Traditional approaches: control, anonymity, data minimisation
4. Alternative or complementary approach: Ambient Law
Transnational data flows
• Personal data (PII), relational data, other data
• SWIFT, PNRs, SNS, search engine logs (Google), cloud computing, eCommerce, EINs, travel/transport, eHealth, mHealth
• De- and recontextualization
Computational turn: from data to knowledge
Primacy of Connectivity
• Turning data into information, generating knowledge
• Connectivity makes devices and infrastructures smart
• Linkability is where the money is (targeted advertising and servicing, price discrimination)
• Connectivity: context-awareness and pattern-recognition
Primacy of Data Science
• ‘combination of the software programming, statistics and storytelling/art that explains the nuggets of gold hidden under mountains of data’ (Economist, 27 February 2010)
Computational Turn:
• In the sciences, e-commerce, public security, critical infrastructure, employment, healthcare
• The computational volcano ash cloud: real or constructed?
• ‘Thomas theorem’: if machines define situations as real, are they real in their consequences?
Knowledge discovery in databases (KDD):
• interception
• storage
• aggregation
• analysis – data mining
• interpretation
• application
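The KDD steps listed above can be sketched as a toy pipeline: aggregate captured event logs, mine a frequent co-occurrence pattern, and then ‘apply’ the discovered knowledge. The event log, the 50% support threshold, and the targeting step are all invented for illustration.

```python
from collections import Counter

raw_events = [  # interception/storage: captured click logs (invented)
    {"user": "u1", "item": "diapers"}, {"user": "u1", "item": "beer"},
    {"user": "u2", "item": "diapers"}, {"user": "u2", "item": "beer"},
    {"user": "u3", "item": "diapers"}, {"user": "u3", "item": "wine"},
]

# aggregation: group items per user into "baskets"
baskets = {}
for e in raw_events:
    baskets.setdefault(e["user"], set()).add(e["item"])

# analysis (data mining): count item co-occurrence across baskets
pairs = Counter()
for items in baskets.values():
    for pair in {(a, b) for a in items for b in items if a < b}:
        pairs[pair] += 1

# interpretation: pairs seen in a majority of baskets become a "pattern"
patterns = [p for p, n in pairs.items() if n / len(baskets) > 0.5]

# application: any user whose basket matches a mined pattern gets targeted
print(patterns)  # [('beer', 'diapers')]
```

Note that the mined ‘knowledge’ is a statistical pattern over a group, which is then applied back to individuals, which is exactly the linkability move discussed in the following slides.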
Traditional approaches: control, anonymity, data minimisation
Anonymity & KDD:
• KDD often involves the mining of anonymized aggregated data; the Data Protection Directive is NOT applicable
• Anonymity offers limited protection against the power of KDD
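The limits of anonymity can be illustrated with a toy quasi-identifier check (all records are invented): even with names stripped, the combination of postcode, birth date, and sex tends to single people out.

```python
from collections import Counter

# "Anonymized" records: names removed, quasi-identifiers kept (invented data).
records = [
    {"zip": "1011", "dob": "1980-05-02", "sex": "F", "diagnosis": "flu"},
    {"zip": "1011", "dob": "1971-11-23", "sex": "M", "diagnosis": "asthma"},
    {"zip": "2044", "dob": "1980-05-02", "sex": "F", "diagnosis": "diabetes"},
]

# Count how many records share each quasi-identifier combination.
quasi = Counter((r["zip"], r["dob"], r["sex"]) for r in records)

# A record is a re-identification risk if its combination is unique (k = 1).
unique = [q for q, k in quasi.items() if k == 1]
print(len(unique), "of", len(records), "records are unique")  # 3 of 3
```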
Linkability (correlatability):
• Linking data pertaining to one individual across different contexts [e.g. de-anonymizing data; violating contextual integrity]
• Linking personal or other data to computational knowledge [e.g. targeted servicing; proactive environments]
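De-anonymization across contexts can be sketched as a Sweeney-style linkage attack: joining an ‘anonymized’ data set with a public register on shared quasi-identifiers. All records and names below are invented.

```python
# "Anonymized" medical data: no names, but quasi-identifiers remain (invented).
anonymized = [
    {"zip": "1011", "dob": "1980-05-02", "sex": "F", "diagnosis": "flu"},
    {"zip": "2044", "dob": "1971-11-23", "sex": "M", "diagnosis": "asthma"},
]

# A public register from another context, e.g. a voter roll (invented).
public_register = [
    {"name": "A. Jansen", "zip": "1011", "dob": "1980-05-02", "sex": "F"},
    {"name": "B. de Vries", "zip": "9999", "dob": "1960-01-01", "sex": "M"},
]

def key(r):
    return (r["zip"], r["dob"], r["sex"])

by_key = {key(r): r["name"] for r in public_register}

# Linking the two contexts re-identifies the medical record.
reidentified = {by_key[key(r)]: r["diagnosis"]
                for r in anonymized if key(r) in by_key}
print(reidentified)  # {'A. Jansen': 'flu'}
```

The join itself is trivial; what makes it an attack is the recontextualization: data that was harmless in each context separately becomes identifying once linked.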
Specific Threats of Linkability:
• Undesirable identification
• Being anticipated as a specific type of person: prediction of lifestyle, earning capacity, mobility, interests, desires, recidivism; prone to use violence or to have a depression, to engage in terrorist activity, to spend in specific ways
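Being ‘anticipated as a specific type of person’ can be sketched as nearest-profile matching, a simple stand-in for real profiling techniques. The profiles, features, and labels below are invented.

```python
import math

# Mined group profiles (invented): behavioural centroids with a label.
profiles = {
    "high-spender": {"visits": 20, "night_activity": 2},
    "churn-risk":   {"visits": 2,  "night_activity": 1},
}

def assign(user):
    """Anticipate a user by assigning the nearest group profile."""
    def dist(p):
        return math.dist([user["visits"], user["night_activity"]],
                         [p["visits"], p["night_activity"]])
    return min(profiles, key=lambda name: dist(profiles[name]))

# The user never stated an income or intention; the label is inferred
# purely from linkable behavioural traces.
print(assign({"visits": 18, "night_activity": 3}))  # high-spender
```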
‘When your carpet calls your doctor’
The Economist – April 10, 2010
Trust after the computational turn:
• No inkling of who knows what about a person: hidden complexity
• Need for a novel type of literacy in the age of computation
• ‘Reading’ computational patterns
Specific Threats of Linkability:
• Privacy (‘freedom from unreasonable constraints on the building of one’s identity’, Agre & Rotenberg)
• Freedom from unjustified discrimination (incorrect, unfair)
• Due Process (no way to anticipate or contest how one is anticipated)
Alternative and complementary approach: Ambient Law
Ambient Law
• Articulation of legal protection into the smart infrastructure
• The democratic legislator should create the incentive structure by co-designing the architecture
• Contestability of automated decisions must be built in
Ambient Law:
• ‘Peircing the Computational Veil’
• Transparency Enhancing Tools (TETs)
• ‘Auditability’ = contestability
• Feedback: intuitive interfaces
• Smart data minimisation
Data Protection Directive 95/46/EC
Art. 12 [and 15]: Right of access
• Member States shall guarantee every data subject the right to obtain from the controller:
• knowledge of the logic involved in any automatic processing of data concerning him, at least in the case of the automated decisions referred to in Art. 15(1)
Data Protection Directive 95/46/EC
Recital 41:
(…) every data subject must also have the right to know the logic involved in the automatic processing of data concerning him, at least in the case of the automated decisions referred to in Article 15(1); whereas this right must not adversely affect trade secrets or intellectual property and in particular the copyright protecting the software; whereas these considerations must not, however, result in the data subject being refused all information
Technical articulation:
• Type I = based on auditing; access to, or information about, the system from the data controller/processor
• Type II = based on counter-profiling:
  • Inference machines
  • Simulation
  • Privacy mirrors
  • Playing around to guess how one is targeted
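Type II counter-profiling (‘playing around to guess how one is targeted’) can be sketched as probing an opaque decision function with single-feature variations. The ‘black box’ below is an invented stand-in, not a real system.

```python
# Toy counter-profiling: vary one input at a time and observe which
# feature flips the automated decision. The decision logic is invented
# and, from the probing side, treated as unknown.

def black_box(applicant):
    """Opaque automated decision (invented stand-in)."""
    return "reject" if applicant["postcode"] in {"1011", "2044"} else "accept"

baseline = {"postcode": "1011", "income": 30000, "age": 40}
variants = {
    "postcode": {**baseline, "postcode": "5555"},
    "income":   {**baseline, "income": 90000},
    "age":      {**baseline, "age": 25},
}

# Features whose variation changes the outcome are the ones the system
# actually uses, even if the controller never discloses its logic.
decisive = [f for f, v in variants.items()
            if black_box(v) != black_box(baseline)]
print(decisive)  # ['postcode']
```

This is ‘guessing the logic’ from outside, which complements the Type I route of obtaining the logic from the controller under Art. 12.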
Facilitate productive trust
• Create intuitive interfaces for transparency enhancing tools (TETs) that provide the substance of the transparency right of Art. 12
• This will enable smart instead of blind data minimisation
• As well as smart data sharing: with whom, when, where, in which context
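Smart data sharing (with whom, when, where, in which context) can be sketched as a per-item policy check; the policy format, data items, and recipients below are invented for illustration.

```python
# Toy "smart data sharing": release an item only when both the recipient
# and the context match the data subject's policy (all names invented).
policy = {
    "heart_rate": {"recipients": {"doctor"}, "contexts": {"treatment"}},
    "location":   {"recipients": {"family"}, "contexts": {"emergency"}},
}

def may_share(item, recipient, context):
    rule = policy.get(item)
    return (rule is not None
            and recipient in rule["recipients"]
            and context in rule["contexts"])

print(may_share("heart_rate", "doctor", "treatment"))      # True
print(may_share("heart_rate", "advertiser", "marketing"))  # False
```

Unlike blind minimisation (share nothing, or strip everything), the rule keeps the data flowing where it serves the data subject while blocking recontextualization.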
Thank you for your attention
Any questions?