Snake oil (cryptography)

In cryptography, the term "snake oil" is often used for products that do not offer anything like the security their marketing claims.

This is, regrettably, remarkably common, and the reasons are varied. As in any field, marketers exaggerate. Many purchasers do not know enough to evaluate a cryptosystem, and even experts in other technical areas often lack the background to judge cryptographic claims.

Cryptography in particular and security in general are tricky because you get no direct feedback. If your word processor fails or your bank's web site goes down, you see the results. If your cryptosystem fails, you may not know. If your bank's cryptosystem fails, they may not know, and may not tell you if they do. In a famous example, the British Ultra project at Bletchley Park read many German codes through most of World War II, and the Germans never realised it. Some of the design issues are discussed in detail in Ross Anderson's "Security Engineering" [1].

Then there is the incurable optimism of programmers. As with databases and real-time programming, cryptography looks deceptively simple. Almost any programmer can handle the basics — implement something that copes with the simple cases — fairly easily. However, as in those other areas, almost anyone who tackles the difficult cases without both some study of the relevant theory and considerable practical experience is almost certain to get it horribly wrong. This is demonstrated far too often. For example, almost every company that implements its own crypto as part of a product ends up with something that is easily broken; Microsoft Word and Adobe PDF encryption are the best-known examples, but there are dozens of others. The programmers on those product teams are competent, but they routinely get this wrong.
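
To make the failure mode concrete, here is a minimal sketch, in Python, of the kind of homebrew scheme that looks plausible to its author but collapses at the first serious attack: XOR with a short repeating key. It illustrates the general pattern only; it is not a description of any particular product's scheme, and the function and variable names are invented for the example.

def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """'Encrypt' by XORing each byte with a short repeating key. Do NOT use this."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def recover_key(ciphertext: bytes, known_prefix: bytes, key_len: int) -> bytes:
    """Known-plaintext attack: if the attacker knows or guesses the first
    key_len bytes of the plaintext (a file header, standard boilerplate, ...),
    XORing them against the ciphertext yields the key directly."""
    return bytes(c ^ p for c, p in zip(ciphertext[:key_len], known_prefix[:key_len]))

if __name__ == "__main__":
    key = b"s3cret!!"
    message = b"%PDF-1.4  confidential quarterly figures follow"
    ciphertext = xor_encrypt(message, key)

    # The attacker only needs to guess the standard file header.
    # (In practice the key length is found by trying small values;
    # it is passed in directly here for brevity.)
    recovered = recover_key(ciphertext, b"%PDF-1.4", len(key))
    print(recovered == key)                               # True
    print(xor_encrypt(ciphertext, recovered) == message)  # True: full plaintext recovered

The point is not that such programmers write obviously silly code; to the naked eye the output looks thoroughly scrambled. Without cryptanalytic training it is simply hard to see that a scheme like this offers essentially no security once an attacker can guess any part of the plaintext.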

Warning signs

A few things are warning signs that a product is bogus, or at least should be treated as suspect. We cover only the most conspicuous here; for more complete lists see the references.

  • Generally extravagant claims — "unbreakable", "revolutionary", "military-grade", "hacker-proof" ...
  • Violations of Kerckhoffs' Principle. If a vendor does not reveal all the internal details of their system so that it can be analysed, then they do not know what they are doing; assume their product is worthless. Any reason they give for not revealing the internals can be ignored.
    • The only exception would be a large government agency with its own analysts. Even they might get it wrong; Matt Blaze found a flaw [1] in the NSA's Clipper chip within weeks of its internals becoming public.
  • References to one-time pads. A genuine one-time pad is provably secure but wildly impractical, and products that advertise one almost always turn out to be using something weaker; see the sketch below.
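
To show why a one-time pad claim deserves suspicion, here is a minimal sketch, in Python, of what a genuine one-time pad involves; the names are invented for the example. The security proof requires a pad that is truly random, exactly as long as the message, never reused, and delivered to the recipient over a channel secure enough that the message itself could have been sent the same way. Those are conditions a mass-market product essentially never meets.

import os

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt under a one-time pad: the pad is fresh and exactly as long as
    the message, and must never be used again.  (A real pad needs true
    randomness; os.urandom is used here only for illustration.)"""
    pad = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, pad))

if __name__ == "__main__":
    msg1 = b"attack at dawn"
    msg2 = b"defend at dusk"

    ct1, pad = otp_encrypt(msg1)
    print(otp_decrypt(ct1, pad) == msg1)  # True

    # The classic failure: reuse the pad, and XORing the two ciphertexts
    # reveals the XOR of the two plaintexts, leaking structure to an eavesdropper.
    ct2 = bytes(p ^ k for p, k in zip(msg2, pad))
    leak = bytes(a ^ b for a, b in zip(ct1, ct2))
    print(leak == bytes(a ^ b for a, b in zip(msg1, msg2)))  # True

The pad-reuse check at the end shows the classic failure: a product that stores a "pad" and uses it more than once, or generates it from a short seed, has quietly become an ordinary (and often weak) stream cipher, whatever its marketing says.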

External links

  • Matt Curtin's Snake Oil FAQ [2] is the most commonly cited reference.
  1. Ross Anderson, Security Engineering, http://www.cl.cam.ac.uk/~rja14/book.html