Snake oil (cryptography)

From Citizendium
Revision as of 12:27, 8 August 2008

This article is developing and not approved.

In cryptography, the term snake oil is often used to refer to products that do not offer anything like the security their marketing claims.

This is, regrettably, remarkably common, and the reasons are varied. As in any field, marketers exaggerate. Many purchasers do not know enough to evaluate a cryptosystem, and even experts in other technical areas often lack the background to do so.

Cryptography in particular and security in general are tricky because you get no direct feedback. If your word processor fails or your bank's web site goes down, you see the results. If your cryptosystem fails, you may not know. If your bank's cryptosystem fails, they may not know, and may not tell you if they do. In a famous example, the British Ultra project at Bletchley Park read many German codes through most of World War II, and the Germans never realised it.

The classic paper on the design issues is by Anderson [1]. Other references are papers by Schneier [2][3] and Gutmann[4] and a book by Anderson [5].

Then there is the incurable optimism of programmers. As with databases and real-time programming, cryptography looks deceptively simple. Almost any programmer can handle the basics — implement something that copes with the simple cases — fairly easily. However, as in those other areas, almost anyone who tackles the difficult cases without both some study of relevant theory and considerable practical experience is almost certain to get it horribly wrong. This is demonstrated far too often. For example, almost every company that implements its own crypto as part of a product ends up with something that is easily broken. Well-known examples include the addition of encryption to products like Microsoft Office [6] and many others. There are also failures in products where encryption is central to the design, such as the CSS encryption on DVDs [7] and the WEP [8] encryption in wireless networking. The programmers on these product teams are competent, but they routinely get this wrong.
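A concrete illustration of how such mistakes play out: the Microsoft Office flaw cited above involved reusing an RC4 keystream across documents. The sketch below (hypothetical messages, with a random byte string standing in for the RC4 keystream) shows why reuse of any stream-cipher keystream is fatal: XORing the two ciphertexts together cancels the keystream entirely, leaving the XOR of the two plaintexts.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two equal-length plaintexts (hypothetical examples).
p1 = b"Transfer $100 to Alice"
p2 = b"Transfer $900 to Mallo"

# Stand-in for an RC4 keystream; the bug is reusing it, not generating it.
keystream = os.urandom(len(p1))

c1 = xor_bytes(p1, keystream)
c2 = xor_bytes(p2, keystream)  # same keystream reused -- the flaw

# An eavesdropper never sees the keystream, yet it cancels out:
leak = xor_bytes(c1, c2)
assert leak == xor_bytes(p1, p2)

# If either plaintext is known or guessed, the other falls out immediately:
assert xor_bytes(leak, p1) == p2
```

Attacks on real documents exploit exactly this, using known file-format headers as the "guessed" plaintext.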

Warning signs

A few things are warning signs that a product is bogus, or at least should be treated as suspect. We cover only the most conspicuous here; for more complete lists see the references.

One indicator is extravagant claims: "unbreakable", "revolutionary", "military-grade", "hacker-proof", "breakthrough".

Another indicator is a lack of technical details or of references to the research literature. This violates Kerckhoffs' Principle; no algorithm can be trusted until it has been published and analysed. If a vendor does not reveal all the internal details of their system so that it can be analysed, then they do not know what they are doing; assume their product is worthless. Any reason they give for not revealing the internals can be ignored. The only exception would be a large government agency that has its own analysts. Even they might get it wrong; Matt Blaze found a flaw [1] in the NSA's Clipper chip within weeks of its internals becoming public.

A third indicator is references to one-time pads. Real one-time pads are provably unbreakable against certain attacks, but snake oil often claims unbreakability for things that are not actually one-time pads. There is some current research suggesting that certain techniques may offer equivalent security, but if the claim "just like a one-time pad" is made without reference to the specific research, one may be well-advised to look for a snake charmer.
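The property a real one-time pad provides can be demonstrated in a few lines. In this minimal sketch (with hypothetical messages), an attacker holding only the ciphertext cannot rule out any plaintext of the same length, because for every candidate there exists a pad that maps it to that ciphertext:

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

message = b"attack at dawn"
pad = os.urandom(len(message))  # truly random, as long as the message, used once
ciphertext = xor_bytes(message, pad)

# Every same-length candidate plaintext is consistent with the ciphertext
# under *some* pad, so the ciphertext alone conveys nothing about the message:
for candidate in (b"attack at dawn", b"retreat now!!!", b"hold position!"):
    implied_pad = xor_bytes(ciphertext, candidate)
    assert xor_bytes(ciphertext, implied_pad) == candidate
```

Note that the guarantee depends on the pad being truly random, as long as the message, and never reused; products claiming "one-time pad security" without meeting all three conditions are claiming something the proof does not cover.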

External links

  • Matt Curtin's Snake Oil FAQ [2] is the commonest reference.

References

  1. Ross Anderson. Why Cryptosystems Fail.
  2. Bruce Schneier. Why Cryptography Is Harder Than It Looks.
  3. Bruce Schneier (February 1999). Crypto-Gram Newsletter.
  4. Peter Gutmann (2002). Lessons Learned in Implementing and Deploying Crypto Software.
  5. Ross Anderson. Security Engineering. 
  6. Hongjun Wu. The Misuse of RC4 in Microsoft Word and Excel.
  7. David Touretsky. Gallery of CSS Descramblers.
  8. Nikita Borisov, Ian Goldberg, and David Wagner. Security of the WEP algorithm.