Politics of cryptography

A number of legal and political issues arise in connection with cryptography. Government regulations controlling its use or export are passionately debated. The legal status of digital signatures can be an issue, and cryptographic techniques may affect the acceptability of computer data as evidence. Access to data can be an issue: can a warrant or a tax auditor force someone to decrypt data, or even to turn over the key? Is encrypting a laptop hard drive when traveling just a reasonable precaution, or is it reason for a border control officer to become suspicious?

Encryption used for access control also raises legal issues. Should it be illegal for a user to defeat the encryption on a DVD? Or for a movie cartel to use encryption to manipulate the market, forcing Australians to pay higher prices because US DVDs will not play on their machines?

Diffie and Landau's "Privacy on the Line: The Politics of Wiretapping and Encryption" covers the history from the First World War to the 1990s, with an emphasis on the US. Steven Levy's "Crypto: How the Code Rebels Beat the Government — Saving Privacy in the Digital Age" focuses on the "crypto wars", the policy debates of the 1990s. The "code rebels" of the title are better known as the cypherpunks.

Active advocacy groups in this area include the Electronic Frontier Foundation (EFF), the Electronic Privacy Information Center (EPIC), and the Global Internet Liberty Campaign (GILC).

Two online surveys cover crypto laws around the world, one for usage and export restrictions and one for digital signatures.

Prohibitions
Because it can assist criminals as easily as honest users, cryptography has long been of interest to intelligence and law enforcement agencies. Because it protects privacy and free speech, it is also of considerable interest to civil rights supporters. Accordingly, there is a history of legal and political controversy around cryptography, especially since inexpensive computers made widespread access to high-quality cryptography possible.

In some countries, even the domestic use of cryptography is, or has been, restricted. Until 1999, France significantly restricted its use domestically, and in China a license is still required. Among the countries with the most restrictive laws are Belarus, China, Kazakhstan, Mongolia, Pakistan, Russia, Singapore, Tunisia, Venezuela, and Vietnam.

In the United States and most other Western countries, cryptography is legal for domestic use, but there has been much conflict over related legal issues. One particularly important issue has been the export of cryptographic software and hardware; see the next section.

Export Controls
Because of the importance of cryptanalysis in World War II, and an expectation that cryptography would continue to be important for national security, many Western governments have at some point strictly regulated its export. After World War II, it was illegal in the US to sell or distribute encryption technology overseas; encryption was in fact classified as a munition, like tanks and nuclear weapons. Until the advent of the personal computer and the Internet this was not especially problematic, since good cryptography was indistinguishable from bad cryptography for nearly all users, and most of the cryptographic techniques generally available were slow and error-prone whether good or bad. However, as the Internet grew and computers became more widely available, high-quality encryption techniques became well known around the globe, and export controls came to be seen as an impediment to commerce and to research.

In 1996, thirty-nine countries signed the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies, an arms-control agreement covering the export of arms and of "dual-use" (civilian and military) technologies such as cryptography. The arrangement stipulated that cryptography with short key lengths (56-bit for symmetric encryption, 512-bit for RSA) would no longer be export-controlled.
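The practical meaning of those key-length thresholds can be checked with quick arithmetic. The sketch below compares the 56-bit symmetric keyspace at the export threshold with a modern 128-bit one, assuming a purely hypothetical attacker who tests one billion keys per second.

```python
# Brute-force work factors for the key lengths named in the arrangement.
des_keys = 2 ** 56       # size of a 56-bit symmetric keyspace
aes_keys = 2 ** 128      # size of a common "strong" 128-bit keyspace

# Hypothetical attacker testing one billion keys per second; the expected
# time to find a key is the time to search half the keyspace.
rate = 1e9
expected_seconds = des_keys / 2 / rate
expected_years = expected_seconds / (365 * 24 * 3600)

print(f"56-bit keyspace:  {des_keys}")    # 72057594037927936
print(f"Expected search:  {expected_years:.1f} years at 1e9 keys/s")
print(f"128-bit keyspace is 2**{128 - 56} times larger")
```

At that rate a 56-bit key falls to brute force in roughly a year, which is why such keys were considered weak enough to exempt from control; a 128-bit keyspace is larger by a factor of 2^72, far beyond any exhaustive search.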

In the 1990s, several challenges were launched against US regulations on the export of cryptography. Philip Zimmermann's Pretty Good Privacy (PGP) encryption program, as well as its source code, was released in the US and found its way onto the Internet in June 1991. After a complaint by RSA Security (then called RSA Data Security, Inc., or RSADSI), Zimmermann was criminally investigated by the Customs Service and the FBI for several years, but no charges were filed.

Also, Daniel Bernstein, then a graduate student at the University of California, Berkeley, brought a lawsuit against the US government challenging the export restrictions as unconstitutional. The argument was that code is speech: programmers use it to communicate with each other as well as with the computer, so it is entitled to the free-speech protection of the First Amendment to the United States Constitution. The case, Bernstein v. United States, was filed in 1995 with the support of the EFF. Bernstein won both in the trial court and on appeal; the courts ruled that source code for cryptographic algorithms and systems is protected as free speech under the Constitution. There is an archive of case documents.

Cryptography exports from the US are now much less strictly regulated than in the past, as a consequence of a major relaxation in 2000. The change was seen by many critics of the controls as a government attempt to get off the hook in the Bernstein case, avoiding a loss in the Supreme Court that could have overturned the regulations completely. There are no longer many restrictions on key sizes in US-exported mass-market or open-source software, though notifying the Bureau of Export Administration is still required; the EFF argues that this requirement is unconstitutional as well.

In practice today, since the relaxation in US export restrictions, and because almost every personal computer connected to the Internet, anywhere in the world, includes a US-sourced web browser such as Mozilla Firefox or Microsoft Internet Explorer, almost every Internet user worldwide has strong cryptography (i.e., long keys) in their browser's Transport Layer Security (TLS) or Secure Sockets Layer (SSL) stack. The Mozilla Thunderbird and Microsoft Outlook e-mail clients can similarly connect to Internet Message Access Protocol (IMAP) or Post Office Protocol (POP) servers via TLS, and can send and receive email encrypted with S/MIME.
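What these applications do before speaking IMAP, POP, or HTTP can be sketched with Python's standard ssl module. This is only an illustration of the protocol stack described above: the hostname is a placeholder, and the network handshake itself is shown only in comments.

```python
import ssl

# Build the kind of client-side TLS context a browser or mail client uses.
# create_default_context() enables certificate and hostname verification
# and restricts negotiation to reasonably strong protocol versions.
context = ssl.create_default_context()

print(context.check_hostname)                     # True
print(context.verify_mode == ssl.CERT_REQUIRED)   # True

# Performing the handshake would look like this (network code omitted;
# "imap.example.com" is a placeholder, not a real server):
#
# import socket
# with socket.create_connection(("imap.example.com", 993)) as sock:
#     with context.wrap_socket(sock, server_hostname="imap.example.com") as tls:
#         print(tls.version())   # e.g. "TLSv1.3"
```

The point is how little the application programmer has to do: the strong cryptography the article describes is negotiated automatically by the library.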

Many Internet users don't realize that their basic application software contains such extensive cryptography systems. These browsers and email programs are so ubiquitous that even governments whose intent is to regulate civilian use of cryptography generally don't find it practical to do much to control distribution or use of this quality of cryptography, so even when such laws are in force, actual enforcement is often lax.

NSA involvement
Another contentious issue connected to cryptography in the United States is the influence of the National Security Agency (NSA) on high-quality cipher development and on policy. The NSA was involved with the design of DES during its development at IBM and its consideration by the National Bureau of Standards as a possible federal standard for cryptography. DES was designed to resist differential cryptanalysis, a powerful and general cryptanalytic technique known to the NSA and IBM that became publicly known only when it was rediscovered in the late 1980s. According to Steven Levy, IBM discovered differential cryptanalysis and kept the technique secret at the NSA's request.

Another instance of NSA involvement was the 1993 Clipper chip affair, an encryption microchip intended to be part of the Capstone cryptography-control initiative. Clipper was widely criticized on two cryptographic grounds. First, the cipher algorithm was classified (the cipher, called Skipjack, was declassified in 1998, after the Clipper initiative lapsed); this secrecy violated Kerckhoffs' Principle and led to concerns that the NSA had deliberately made the cipher weak in order to assist its intelligence efforts. Second, the scheme included a special escrow key held by the government for use by law enforcement, for example in wiretaps. Matt Blaze also showed in 1994 that the escrow protocol was flawed and easily subverted.

Digital rights management
Cryptography is central to Digital Rights Management (DRM), a group of techniques for technologically controlling the use of copyrighted material, widely implemented and deployed at the behest of some copyright holders.

In 1998, US President Bill Clinton signed the Digital Millennium Copyright Act (DMCA), which criminalized the production, dissemination, and use of certain cryptanalytic techniques and technologies, specifically those that could be used to circumvent DRM schemes. This had a potentially serious impact on the cryptography research community, since an argument can be made that virtually any cryptanalytic research violates, or might violate, the DMCA. The FBI has not enforced the DMCA as rigorously as some had feared, but the law nonetheless remains controversial. One well-respected cryptography researcher, Niels Ferguson, has publicly stated that he will not release some of his research into an Intel security design for fear of prosecution under the DMCA, and both Alan Cox (longtime number two in Linux kernel development) and Professor Edward Felten (along with some of his students at Princeton) have encountered problems related to the Act. Dmitry Sklyarov was arrested in the US and jailed for some months for alleged violations of the DMCA that occurred in Russia, where the work for which he was arrested and charged was legal.