Spurious Keys and Unicity Distance: Application of Entropy - Slides | CS 349

Material Type: Notes; Class: Tpc: Games; Subject: Computer Science; University: Wellesley College; Term: Unknown 1989;

Uploaded on 08/18/2009


CS349 Cryptography
Department of Computer Science, Wellesley College

Spurious keys and unicity distance: Application of entropy

Unicity distance 10-2
Key equivocation

o We apply the results from last lecture to establish relationships between the entropies of the components of a cryptosystem.
o The conditional entropy H(K|C) is called the key equivocation, and is a measure of how much information about the key is revealed by the ciphertext.

Unicity distance 10-3
Remember me?

Theorem. H(X, Y) = H(Y) + H(X|Y).

Corollary. H(X|Y) ≤ H(X), with equality if and only if X and Y are independent.

Unicity distance 10-4
H(K|C) = H(K) + H(P) - H(C)

Observe that H(K, P, C) = H(C|K, P) + H(K, P). Now, the key and plaintext uniquely determine the ciphertext. Thus H(C|K, P) = 0, and

H(K, P, C) = H(K, P) = H(K) + H(P).*

In a similar fashion, H(K, P, C) = H(K, C). Putting this all together,

H(K|C) = H(K, C) - H(C) = H(K, P, C) - H(C) = H(K) + H(P) - H(C).

*The latter equality holds since K and P are independent.
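The identity is easy to check numerically. The sketch below uses a small made-up cryptosystem (the distributions and encryption table are illustrative assumptions, not from the slides), computing H(K|C) both directly from the conditional distributions and via the identity:

```python
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A toy cryptosystem (illustrative assumption): plaintexts a, b;
# keys k1..k3; each (key, plaintext) pair encrypts to one ciphertext.
pr_p = {'a': 0.25, 'b': 0.75}
pr_k = {'k1': 0.5, 'k2': 0.25, 'k3': 0.25}
enc = {('k1', 'a'): 1, ('k1', 'b'): 2,
       ('k2', 'a'): 2, ('k2', 'b'): 3,
       ('k3', 'a'): 3, ('k3', 'b'): 4}

# Induced distribution on ciphertexts (K and P are independent).
pr_c = {}
for (k, p), c in enc.items():
    pr_c[c] = pr_c.get(c, 0.0) + pr_k[k] * pr_p[p]

# Key equivocation computed directly: H(K|C) = sum over c of Pr[c] H(K | C=c).
h_k_given_c = 0.0
for c0, pc in pr_c.items():
    cond = {k: pr_k[k] * pr_p[p] / pc
            for (k, p), c in enc.items() if c == c0}
    h_k_given_c += pc * entropy(cond)

# The identity from the slide: H(K|C) = H(K) + H(P) - H(C).
via_identity = entropy(pr_k) + entropy(pr_p) - entropy(pr_c)
print(round(h_k_given_c, 4), round(via_identity, 4))  # both ≈ 0.4617
```

The two values agree, as the derivation predicts: observing a ciphertext in this system leaves, on average, just under half a bit of uncertainty about the key.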

Unicity distance 10-7
Bounding spurious keys

o Our goal is to prove a bound on the expected number of spurious keys.
o To do so, we seek a definition of the entropy (per letter) of a natural language L, denoted H_L.
o H_L should be a measure of the average information per letter in a "meaningful" string of plaintext.

Unicity distance 10-8
Entropy of a language L

Suppose L is a natural language. The entropy of L is defined to be the quantity

H_L = lim_{n→∞} H(P^n) / n,

and the redundancy of L is defined to be

R_L = 1 - H_L / log2 |P|.
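The redundancy definition reduces to a one-line computation once an estimate of H_L is available. The sketch below plugs in H_L ≈ 1.25 for English (the empirical estimate used on the next slide; an assumption here) and |P| = 26:

```python
from math import log2

# Redundancy R_L = 1 - H_L / log2|P|, evaluated for English.
# H_L = 1.25 is an empirical estimate (assumed here); |P| = 26 letters.
H_L = 1.25
P_size = 26

R_L = 1 - H_L / log2(P_size)
print(round(R_L, 2))  # → 0.73, i.e. English is roughly 75% redundant
```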

Unicity distance 10-9
Redundancy

o Empirical studies estimate that for the English language, 1.0 ≤ H_L ≤ 1.5; that is, the average information content of English is between one and one and a half bits per letter.
o Using 1.25 as our estimate of H_L gives a redundancy of about 0.75. In other words, English is 75% redundant.

Unicity distance 10-10
n-grams of plain and cipher text

o Given probability distributions on K and P^n, we can define the induced probability distribution on C^n.
o Define P^n and C^n to be the random variables representing n-grams of plaintext and ciphertext respectively.

Unicity distance 10-13
Key equivocation and spurious keys

H(K|C^n) = Σ_{y∈C^n} Pr[y] H(K|y)
         ≤ Σ_{y∈C^n} Pr[y] log2 |K(y)|
         ≤ log2 Σ_{y∈C^n} Pr[y] |K(y)|
         = log2(s_n + 1).

(Here K(y) denotes the set of keys under which y is a possible ciphertext, and s_n, the expected number of spurious keys, is defined by s_n = Σ_{y∈C^n} Pr[y] (|K(y)| - 1), so Σ_{y∈C^n} Pr[y] |K(y)| = s_n + 1.)

Next, we relate H(K|C^n) to the number of spurious keys.*

*Jensen's inequality snuck into the computation. Where?
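One way to see where Jensen's inequality enters: the step from Σ Pr[y] log2|K(y)| to log2 Σ Pr[y]|K(y)| replaces a weighted average of logarithms by the logarithm of the weighted average, which only increases the value because log2 is concave. A quick numeric illustration with made-up values for Pr[y] and |K(y)|:

```python
from math import log2

# Hypothetical ciphertext probabilities Pr[y] and key-set sizes |K(y)|
# (illustrative numbers only, not from the slides).
pr = [0.5, 0.3, 0.2]
k_sizes = [1, 4, 8]

avg_of_logs = sum(p * log2(k) for p, k in zip(pr, k_sizes))
log_of_avg = log2(sum(p * k for p, k in zip(pr, k_sizes)))

# log2 is concave, so E[log2 X] ≤ log2 E[X] (Jensen's inequality).
print(round(avg_of_logs, 3), round(log_of_avg, 3))  # → 1.2 1.722
```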

Unicity distance 10-14
Combining the two inequalities

We have

log2(s_n + 1) ≥ H(K|C^n) ≥ H(K) - n R_L log2 |P|.

In the case where keys are chosen equiprobably,* H(K) = log2 |K| and we have

log2(s_n + 1) ≥ log2 |K| - n R_L log2 |P| = log2(|K| / |P|^(n R_L)),

or

s_n ≥ |K| / |P|^(n R_L) - 1.

*Which maximizes H(K).

Unicity distance 10-15
A bound for spurious keys

Theorem. Suppose (P, C, K, E, D) is a cryptosystem where |C| = |P| and keys are chosen equiprobably. Let R_L denote the redundancy of the underlying language. Then given a string of ciphertext of length n, where n is sufficiently large, the expected number of spurious keys satisfies

s_n ≥ |K| / |P|^(n R_L) - 1.

Unicity distance 10-16
Unicity distance

o The unicity distance of a cryptosystem is defined to be the value of n, denoted n_0, at which the expected number of spurious keys becomes zero; that is, the average amount of ciphertext required for an opponent to be able to uniquely compute the key, given enough computing time.
o Setting s_n to zero in

s_n ≥ |K| / |P|^(n R_L) - 1

and solving for n gives an estimate for the unicity distance:

n_0 ≈ log2 |K| / (R_L log2 |P|).
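As an illustration, the estimate can be evaluated for the substitution cipher, a standard worked example (assumed here, not taken from these slides), with |K| = 26!, |P| = 26, and R_L ≈ 0.75:

```python
from math import log2, factorial

# Unicity distance n_0 ≈ log2|K| / (R_L * log2|P|) for the substitution
# cipher: |K| = 26! keys, |P| = 26 letters, R_L ≈ 0.75 (assumed values).
key_space = factorial(26)
P_size = 26
R_L = 0.75

n_0 = log2(key_space) / (R_L * log2(P_size))
print(round(n_0, 1))  # → roughly 25 ciphertext letters

# Sanity check: the lower bound on s_n drops below 1 near n_0.
def spurious_lower_bound(n):
    return key_space / P_size ** (n * R_L) - 1

print(spurious_lower_bound(20) > 1, spurious_lower_bound(30) < 1)  # → True True
```

So on this estimate, roughly 25 letters of ciphertext suffice, on average, for the key of a substitution cipher to be uniquely determined.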