Register article on using non-randomness of encrypted file content to reduce time needed to decrypt by brute force

Peter Fairbrother zenadsl6186 at zen.co.uk
Fri Aug 16 20:00:26 BST 2013


On 16/08/13 15:20, Ian Mason wrote:
>
> On 15 Aug 2013, at 16:00, Igor Mozolevsky wrote:
>
>> On 15 August 2013 11:00, Brian Morrison <bdm at fenrir.org.uk> wrote:
>>>
>>> Not seen this mentioned anywhere else yet:
>>>
>>> http://www.theregister.co.uk/2013/08/14/research_shakes_crypto_foundations/
>>>
>>>
>>> Any opinions from those with direct knowledge of such techniques?
>>
>>
>> Isn't the conventional wisdom to compress before encrypting to prevent
>> things like that?


No.

Conventional modern wisdom is to trust the cipher, which has been 
designed so that it is indistinguishable under partial or even full 
known plaintext attack (and adaptive chosen plaintext attack, and chosen 
ciphertext attack, and ...).
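A toy illustration of that indistinguishability, using the one-time pad as the idealised extreme (Python stdlib only; the message value is invented):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: the information-theoretic limit of the property
    above - the ciphertext is statistically independent of the
    plaintext, however predictable the plaintext is."""
    pad = secrets.token_bytes(len(plaintext))  # fresh uniform pad
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    return ciphertext, pad

# A maximally predictable message still yields a uniformly random
# ciphertext; two encryptions of it are unrelated to each other.
message = b"AAAAAAAAAAAAAAAA"
ct1, pad1 = otp_encrypt(message)
ct2, pad2 = otp_encrypt(message)
```

Real ciphers only approximate this computationally, but the point stands: non-randomness in the plaintext is not supposed to help the attacker.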

If the attacker can't guess anything about the key, that should be 
enough. So keys should be chosen at random, and securely held. That may 
not be easy...
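For what it's worth, drawing such a key in Python is a one-liner (a sketch only; 32 bytes is assumed as a typical 256-bit key size):

```python
import secrets

# Draw a uniformly random 256-bit key from the OS CSPRNG; the
# secrets module is intended for cryptographic use, unlike random.
key = secrets.token_bytes(32)
```

The "securely held" part is the hard bit, and no one-liner solves that.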


> "Conventional wisdom" - yes, actual wisdom, no. The compression layer in
> SSL has been used to attack it (http://breachattack.com/).
>
> The original article is nothing new - it's just a (partial) known
> plaintext attack.

???

> Any predictability in the underlying plaintext of a
> cyphertext gives you a handle to attack the cypher with.


With a modern cipher the handle should be so slippery it gives you no 
traction at all.
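For contrast, the compression side-channel Ian mentions (breachattack.com) needs no traction on the cipher at all. A minimal stdlib sketch, with an invented SECRET and assuming a length-preserving cipher, so ciphertext length equals compressed length:

```python
import zlib

# Invented value for illustration; no real protocol implied.
SECRET = b"session_token=4f2a9c"

def observed_length(attacker_data: bytes) -> int:
    """What the attacker sees on the wire: attacker-controlled data is
    compressed together with the secret, then encrypted with a
    length-preserving cipher, so record length leaks compressed size."""
    return len(zlib.compress(attacker_data + SECRET))

# A correct guess duplicates the secret, so DEFLATE replaces it with
# one short back-reference; a wrong guess of the same size stays as
# literal bytes and the record comes out longer.
right = observed_length(SECRET)                      # guess == secret
wrong = observed_length(bytes(range(len(SECRET))))   # unrelated bytes
# right < wrong: length alone confirms the guess, cipher untouched.
```

Which is why the answer to "compress before encrypting" is no: compression can create exactly the handle the cipher was built to deny.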

> Most WW2
> cryptanalysis was done around known or likely (partial) plaintexts.

Yes - but that was for WW2 ciphers, not modern ones.

> The
> moral of the story is, never send anything predictable. If your message
> is predictable mix something genuinely random into it.

There is no longer any need to avoid sending predictable messages.

Except if it's predictable, why bother sending it?  :)


-- Peter Fairbrother








More information about the ukcrypto mailing list