Transcript ppt

Cryptography
Lecture 2
Stefan Dziembowski
www.dziembowski.net
[email protected]
Plan
1. Information-theoretic cryptography
2. Introduction to cryptography based on
the computational assumptions
3. Provable security
4. Pseudorandom generators
The scenario from the previous
lecture
Alice
Bob
Eve
Shannon’s theorem: perfect secrecy is
possible only if the key is as long as the plaintext.
In real life this is completely impractical.
What to do?
Idea: limit the power of the adversary.
How?
Classical (computationally-secure) cryptography:
bound the adversary’s computational power.
Alternative options exist
(but they are not very practical)
Quantum cryptography
Stephen Wiesner (1970s), Charles H. Bennett and Gilles Brassard (1984)
quantum link
Alice
Bob
Quantum indeterminacy: quantum states cannot be
measured without disturbing the original state.
Eve
Hence Eve cannot read the bits without being noticed.
Quantum cryptography
Advantage:
security is based on the laws of quantum physics
Disadvantage:
needs dedicated equipment.
Practicality?
Currently: successful transmissions over distances of around 150 km.
Commercial products are available.
Warning:
Quantum cryptography should not be confused with quantum computing.
A satellite scenario
A third party (a satellite) is
broadcasting random bits.
000110100111010010011010111001110111
111010011101010101010010010100111100
001001111111100010101001000101010010
001010010100101011010101001010010101
Alice
Bob
Eve
Does it help?
No...
(Shannon’s theorem of course also
holds in this case.)
Ueli Maurer (1993): noisy channel.
[Diagram: the broadcast bit string as received by Alice, Bob, and Eve; some bits get flipped (because of the noise).]
Assumption: the data that the adversary receives is noisy.
(The data that Alice and Bob receive may be even more noisy.)
Bounded-Storage Model
Another idea: bound the size of the adversary’s memory
000110100111010010011010111001110111
111010011101010101010010010100111100
001001111111100010101001000101010010
001010010100101011010101001010010101
too large to fit in Eve’s memory
Real (computationally-secure)
cryptography starts here:
Eve is computationally-bounded
But what does it mean?
Ideas:
1. She can use at most 1000
Intel Core 2 Extreme X6800 Dual Core processors
for at most 100 years...
2. She can buy equipment worth 1 million euro and use it for 30 years...
it’s hard to reason
formally about it
A better idea
”The adversary has access to a Turing Machine that can make at most 10^30
steps.”
More generally, we could have definitions of a type:
“a system X is (t,ε)-secure if every Turing Machine
that operates in time t
can break it with probability at most ε.”
This would be quite precise, but...
We would need to specify exactly what we mean by a “Turing Machine”:
• how many tapes does it have?
• how does it access these tapes? (maybe “random access memory”
is a more realistic model...)
• ...
Moreover, this approach often leads to ugly formulas...
“(t,ε)-security”
What to do?
Idea:
• t steps of a Turing Machine = “efficient computation”
• ε – a value “very close to zero”.
How to formalize it?
Use the asymptotics!
Efficiently computable?
“efficiently computable”
=
“polynomial-time computable
on a Turing Machine”
that is: running in time
O(n^c) (for some constant c)
Here we assume that the Turing Machines are the right
model for the real-life computation.
Not true if a quantum computer is built...
Very small?
“very small”
=
“negligible”
=
approaches 0 faster than the inverse of any polynomial
Formally: a function ε is negligible if for every constant c there exists N such that ε(n) < 1/n^c for all n > N.
Negligible or not?
[Quiz: six example functions; four of them are negligible (yes), two are not (no).]
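To build intuition (this numeric check is my own illustration, not from the slides): the canonical negligible function 2^-n eventually drops below 1/n^c for every fixed c, while an inverse polynomial such as 1/n^3 does not.

```python
# Intuition check: 2^-n is negligible, 1/n^3 is not.
def negl(n):
    """Candidate negligible function: 2^-n."""
    return 2.0 ** -n

def inv_poly(n, c):
    """Inverse polynomial: 1 / n^c."""
    return n ** -float(c)

# At n = 100, 2^-n is already far below 1/n^10 (about 7.9e-31 vs 1e-20):
assert negl(100) < inv_poly(100, 10)

# By contrast, 1/n^3 stays above 1/n^10 for all n > 1, illustrating
# why an inverse polynomial fails the "smaller than every 1/n^c" test:
assert inv_poly(100, 3) > inv_poly(100, 10)
```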
Security parameter
Typically, we will say that a scheme X is secure if
P(M breaks the scheme X) is negligible
for every polynomial-time Turing Machine M.
The terms “negligible” and “polynomial” make sense only if X (and the adversary)
take an additional input n called
a security parameter.
In other words: we consider an infinite sequence
X(1),X(2),...
of schemes.
Example
Consider the authentication scheme from last week:
Nice properties of these notions
• A sum of two polynomials is a polynomial:
poly + poly = poly
• A product of two polynomials is a polynomial:
poly * poly = poly
• A sum of two negligible functions is a negligible function:
negl + negl = negl
Moreover:
• A negligible function multiplied by a polynomial is negligible
negl * poly = negl
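A one-line justification of the last property (standard argument, not spelled out on the slide): if ε is negligible and p(n) ≤ a·n^d, then for any c we eventually have ε(n) ≤ n^-(c+d), so

```latex
\varepsilon(n)\cdot p(n)\;\le\; a\,n^{d}\cdot n^{-(c+d)}\;=\;a\,n^{-c},
```

which is below any fixed inverse polynomial for large n; hence ε·p is negligible.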
A new definition of an encryption
scheme
Is this the right approach?
Advantages
1. All types of Turing Machines are “equivalent” up to a
“polynomial reduction”.
Therefore we do not need to specify the details of the model.
2. The formulas get much simpler.
Disadvantage
Asymptotic results don’t tell us anything about the security of
concrete systems.
However
Usually one can prove formally an asymptotic result and then
argue informally that “the constants are reasonable”
(and can be calculated if one really wants).
Provable security
We want to construct schemes that are
provably secure.
But...
• why do we want to do it?
• how to define it?
• and is it possible to achieve it?
Provable security – the motivation
In many areas of computer science formal proofs are not essential.
For example, instead of proving that an algorithm is efficient, we can just
simulate it on a
“typical input”.
In cryptography it’s not true, because
there cannot exist an experimental proof that a scheme is secure.
Why?
Because a notion of a
“typical adversary”
does not make sense.
How did we define perfect secrecy?
Idea 1
The adversary should not be able
to compute k.
Experiment (m – a message)
1.
the key k is chosen randomly
2.
message m is encrypted using k:
c := Enc(k, m)
3.
c is given to the adversary
Idea 2
The adversary should not be able to
compute m.
Idea 3
The adversary should not be able to
compute any information about m.
Idea 4
The adversary should not be able to compute any
additional information about m.
makes more sense
Idea
The adversary should not be able to compute any additional information about m.
Towards the definition of computational secrecy...
Recall perfect secrecy: for every m and c,
P(C = c) = P(C = c | M = m)
Equivalently, for every m0, m1 and c:
P(C = c | M = m0) = P(C = c | M = m1)
P(Enc(K,M) = c | M = m0) = P(Enc(K,M) = c | M = m1)
P(Enc(K,m0) = c | M = m0) = P(Enc(K,m1) = c | M = m1)
P(Enc(K,m0) = c) = P(Enc(K,m1) = c)
(Here the adversary chooses m0, m1 and receives c.)
Indistinguishability
P(Enc(K,m0) = c) = P(Enc(K,m1) = c)
In other words: the distributions of Enc(K,m0) and Enc(K,m1) are identical.
IDEA: change “identical” to
“indistinguishable by a polynomial-time adversary”.
A game
(Gen,Enc,Dec) – an encryption scheme
security parameter: 1^n
The adversary (a polynomial-time Turing machine):
1. chooses m0, m1 such that |m0| = |m1| and sends them to the oracle,
2. receives c,
3. has to guess b.
The oracle:
1. selects k := Gen(1^n),
2. chooses a random bit b ∈ {0,1},
3. computes c := Enc(k, m_b) and sends it back.
Alternative name: semantically-secure
(sometimes we will say: “is computationally-secure”, if the context is clear)
Security definition:
We say that (Gen,Enc,Dec) has indistinguishable encryptions if any
polynomial time adversary guesses b correctly with probability at most 0.5 +
ε(n), where ε is negligible.
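As a sketch of the definition in code (my own illustration; the “scheme” below is deliberately broken so the outcome is easy to predict): an adversary plays the game against an encryption that simply outputs the plaintext, and therefore guesses b with probability 1, far above 0.5 + negligible.

```python
import secrets

def gen(n):
    return secrets.token_bytes(n)          # random n-byte key

def enc(k, m):
    return m                               # deliberately broken: ignores the key

def game(adversary, n, trials=200):
    """Run the indistinguishability game; return the adversary's success rate."""
    wins = 0
    for _ in range(trials):
        k = gen(n)
        m0, m1 = b"\x00" * n, b"\xff" * n  # |m0| = |m1|, chosen by the adversary
        b = secrets.randbelow(2)           # oracle's secret bit
        c = enc(k, m0 if b == 0 else m1)
        if adversary(m0, m1, c) == b:
            wins += 1
    return wins / trials

def adversary(m0, m1, c):
    # Against the broken scheme the ciphertext reveals the plaintext.
    return 0 if c == m0 else 1

print(game(adversary, 16))  # 1.0
```

A secure scheme is exactly one for which no polynomial-time adversary can push this success rate non-negligibly above 0.5.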
Testing the definition
1. Suppose the adversary can compute k from
some Enc(k,m). Can he win the game?
YES!
2. Suppose the adversary can compute some bit
of m from Enc(k,m). Can he win the game?
YES!
Is it possible to prove security?
(Gen,Enc,Dec) -- an encryption scheme.
For simplicity suppose that:
1. for a security parameter n the key is of length n.
2. Enc is deterministic
Consider the following language:
L = { (m, c) : there exists k such that Enc(k, m) = c }
Q: What if L is polynomial-time decidable?
A: Then the scheme is broken (exercise)
On the other hand: L is in NP. (k is the NP-witness)
Is it really
true?
So, if P = NP, then any semantically-secure encryption is broken.
“If P=NP, then the semantically-secure
encryption is broken”
Is it 100% true?
Not really...
This is because even if P = NP, we do not
know what the constants are.
Maybe P = NP in a very “inefficient way”...
In any case, to prove security of a cryptographic scheme we would need
to show
a lower bound on the computational complexity of some problem.
In the “asymptotic setting” that would mean that
at least
we show that P ≠ NP.
Does the implication in the other direction hold?
(that is: does P ≠ NP imply anything for cryptography?)
No! (at least as far as we know)
Intuitively: because NP is a notion from the “worst case complexity”, and
cryptography concerns the “average case complexity”.
Therefore
proving that an encryption scheme is secure is probably much harder
than proving that P ≠ NP.
What can we prove?
We can prove conditional results.
That is, we can show theorems of a type:
Suppose that some “computational assumption A” holds; then scheme X is secure.
Or: suppose that some scheme Y is secure; then scheme X is secure.
Research program in cryptography
Base the security of cryptographic schemes on a small
number of well-specified “computational assumptions”.
Examples of A:
“decisional Diffie-Hellman assumption”
“strong RSA assumption”
In the assumption (“computational assumption A holds”) we have to “believe”;
the rest (“then scheme X is secure”) is provable.
Example
We are now going to show an example of such reasoning.
Template: suppose that some “computational assumption A” holds; then scheme X is secure.
Instance: suppose that G is a “cryptographic pseudorandom generator”; then from G we can construct a secure encryption scheme.
Pseudorandom generators
A pseudorandom generator G stretches a short seed s into a longer string G(s).
Idea: encrypt by using G(k) as a one-time pad, i.e. Enc(k, m) := G(k) xor m.
If we use a “normal” PRG, this idea doesn’t work (exercise).
It works only with cryptographic PRGs.
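A minimal sketch of the scheme Enc(k, m) = G(k) xor m. Here G is improvised from SHA-256 in counter mode purely for illustration; treating this G as a cryptographic PRG is an assumption of the sketch, not a proved fact.

```python
import hashlib

def G(seed: bytes, out_len: int) -> bytes:
    """Toy PRG: SHA-256 in counter mode. ASSUMED pseudorandom for illustration."""
    out = b""
    counter = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:out_len]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def enc(k: bytes, m: bytes) -> bytes:
    return xor(G(k, len(m)), m)   # pad the message with the PRG output

def dec(k: bytes, c: bytes) -> bytes:
    return xor(G(k, len(c)), c)   # same pad, so decryption inverts encryption

k = b"0123456789abcdef"                    # 16-byte key
m = b"a message much longer than the key"
assert dec(k, enc(k, m)) == m
```

Unlike the one-time pad, the key here is shorter than the message; all the security now rests on the PRG assumption (and, as with the one-time pad, a key must never be reused).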
“Looks random”
What does it mean?
Non-cryptographic applications:
should pass some statistical tests.
Cryptography:
should pass all polynomial-time tests.
Cryptographic PRG
A polynomial-time distinguisher D receives either
• a random string R, or
• G(S) (where S is random),
and outputs 1 if he thinks it’s G(S), or 0 if he thinks it’s R.
No polynomial-time D should be able to distinguish the two cases.
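In the standard formal phrasing (my wording, consistent with the slide): G is a cryptographic PRG if for every polynomial-time distinguisher D there is a negligible ε such that

```latex
\bigl|\Pr[D(G(S)) = 1] - \Pr[D(R) = 1]\bigr| \;\le\; \varepsilon(n),
```

where S is a uniformly random seed and R a uniformly random string of length |G(S)|.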
Constructions
There exist constructions of cryptographic
pseudorandom generators that are
conjectured to be secure.
Some of them are extremely efficient and
widely used in practice.
They are called stream ciphers (we
will discuss them later).
Theorem
If G is a cryptographic PRG, then the encryption
scheme constructed before is semantically secure (i.e. it has indistinguishable encryptions).
cryptographic PRGs ⇒ computationally-secure encryption
Proof (sketch)
Suppose that it is not secure.
Therefore there exists an adversary that wins the “guessing game”
with probability 0.5 + δ(n), where δ(n) is not negligible.
On input x, the distinguisher D simulates the guessing game:
1. the adversary chooses m0, m1,
2. D picks a random bit b ∈ {0,1} and computes c := x xor m_b,
3. D gives c to the adversary, who has to guess b.
If the adversary guessed b correctly, then D outputs 1: “x is pseudorandom”.
Otherwise D outputs 0: “x is random”.
If x is a random string R: the adversary guesses b correctly with probability 0.5, so D outputs 1 with probability 0.5 and 0 with probability 0.5.
If x = G(S): the adversary guesses b correctly with probability 0.5 + δ(n), so D outputs 1 with probability 0.5 + δ(n) and 0 with probability 0.5 - δ(n).
Hence D distinguishes G(S) from a random string with non-negligible advantage δ(n): a contradiction.
QED
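The reduction can be made concrete with a toy example of my own: take the obviously bad “PRG” G(s) = s‖s. An adversary exploiting its structure wins the guessing game with certainty, and plugging it into the reduction gives a distinguisher D that always outputs 1 on pseudorandom inputs (and is right only about half the time on truly random ones).

```python
import secrets

def G(s: bytes) -> bytes:
    return s + s                           # deliberately bad "PRG": repeats the seed

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def adversary_choose(n):
    # Messages of length 2n whose half-XORs differ.
    return b"\x00" * (2 * n), b"\x00" * n + b"\x01" * n

def adversary_guess(m0, m1, c, n):
    # If c = G(k) xor m_b, then c[:n] xor c[n:] equals m_b[:n] xor m_b[n:],
    # because the two halves of G(k) are identical.
    halves = xor(c[:n], c[n:])
    if halves == b"\x00" * n:
        return 0
    if halves == b"\x01" * n:
        return 1
    return secrets.randbelow(2)            # truly random c: just guess

def D(x: bytes, n: int) -> int:
    """Distinguisher built from the adversary, exactly as in the proof sketch."""
    m0, m1 = adversary_choose(n)
    b = secrets.randbelow(2)               # play the oracle's role
    c = xor(x, m0 if b == 0 else m1)
    return 1 if adversary_guess(m0, m1, c, n) == b else 0

n = 8
# On pseudorandom input, D outputs 1 every time:
assert all(D(G(secrets.token_bytes(n)), n) == 1 for _ in range(100))
```

On a truly random x the halves of c are independent of b, so D outputs 1 with probability about 0.5; the gap between the two cases is exactly the adversary’s advantage from the probability analysis in the proof.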
Moral
cryptographic PRGs ⇒ semantically-secure encryption
To construct secure encryption it suffices to
construct a secure PRG.
Outlook
Cryptography divides into two branches:
1. “Information-theoretic” (often also called “unconditional”):
• one-time pad,
• quantum cryptography,
• “the satellite scenario”.
2. “Computationally-secure”, based on 2 assumptions:
1. some problems are computationally difficult,
2. our understanding of what “computational difficulty” means is correct.