
Politics of cryptography

From Citizendium, the Citizens' Compendium



The very nature of cryptography, hiding a message by converting it into an apparently random jumble of symbols, raises a number of legal and political issues. Cryptography is of considerable interest to civil rights supporters, mainly for its ability to protect privacy and free speech. However, because it can also help the malicious conceal their activities and evade legal responsibility, cryptography has long been of interest to intelligence-gathering and law enforcement agencies as well. Accordingly, there is a history of legal and political controversy surrounding cryptography, especially since the advent of inexpensive computers made widespread access to high-quality cryptography possible.

The 1990s saw the "crypto wars", intense debates over cryptography policy issues.[1] The main advocates of tight control were various governments, especially the US government. Their opponents included most of the major high-tech companies and a loose coalition of radicals called cypherpunks. The governments lost those battles; however, they have been attempting to recover lost ground in the new century.

Encryption used for Digital Rights Management also sets off passionate debates and raises legal issues. Should it be illegal for a user to defeat the encryption on a DVD? Or for a movie cartel to manipulate the market using encryption in an attempt to force Australians to pay higher prices because US DVDs will not play on their machines?

The legal status of digital signatures can be an issue, and cryptographic techniques may affect the acceptability of computer data as evidence. Access to data can be an issue: can a warrant or a tax auditor force someone to decrypt data, or even to turn over the key? Is encrypting a laptop hard drive when traveling just a reasonable precaution, or is it reason for a border control officer to become suspicious?

Active advocacy groups in this area include the Electronic Frontier Foundation (EFF), the Electronic Privacy Information Center (EPIC) and the Global Internet Liberty Campaign (GILC).

Two online surveys cover crypto laws around the world, one for usage and export restrictions and one for digital signatures.

Prohibitions

In some countries, even the domestic use of cryptography is, or has been, restricted. Until 1999, France significantly restricted the use of cryptography domestically; in China, a license is still required. Among the countries with the tightest restrictions are Belarus, China, Kazakhstan, Mongolia, Pakistan, Russia, Singapore, Tunisia, Venezuela, and Vietnam[2].

In the United States and most other Western countries, cryptography is legal for domestic use, but there has been much conflict over legal issues related to cryptography. One particularly important issue has been the export of cryptography and cryptographic software and hardware. See the next section.

In the United Kingdom, cryptographic technology has been the subject of one major legal controversy: the Regulation of Investigatory Powers Act 2000 (often referred to as "RIPA" or the "RIP Act"). Part III of the Act allows police and law enforcement agencies to serve a notice requiring a person or business to hand over encryption keys or passwords. Failure to comply is a criminal offence carrying a maximum penalty of two years' imprisonment. Critics point out that such demands are ineffective against cryptography systems with built-in plausible deniability, such as TrueCrypt.

Export Controls

For more information, see: Export controls.

Because of the importance of cryptanalysis in World War II (in particular, see ULTRA) and an expectation that cryptography would continue to be important for national security, many Western governments have, at some point, strictly regulated the export of cryptography. After World War II, it was illegal in the US to sell or distribute encryption technology overseas; in fact, encryption was classified as a munition, like tanks and nuclear weapons[3]. Until the advent of the personal computer and the Internet, this was not especially problematic, as good cryptography was indistinguishable from bad cryptography for nearly all users, and because most of the cryptographic techniques generally available were slow and error-prone whether good or bad. However, as the Internet grew and computers became more widely available, high-quality encryption techniques became well known around the globe. As a result, export controls came to be understood as an impediment to commerce and to research.

In 1996, thirty-nine countries signed the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies, an arms control agreement covering the export of arms and of "dual-use" (civilian and military) technologies such as cryptography. The agreement stipulated that cryptography with short key lengths (56-bit for symmetric encryption, 512-bit for RSA) would no longer be export-controlled[4].
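The practical significance of those key-length thresholds can be illustrated with a little brute-force arithmetic. The sketch below assumes, purely for illustration, an attacker testing one billion keys per second; at that rate a 56-bit key space is searchable in a few years, while a 128-bit key space is utterly out of reach.

```python
# Rough brute-force arithmetic behind export key-length thresholds.
# The search rate is an assumed figure for illustration, not a
# claim about any particular hardware.
SECONDS_PER_YEAR = 365 * 24 * 3600
RATE = 10 ** 9  # assumed: one billion trial decryptions per second

def years_to_search(key_bits: int) -> float:
    """Years needed to try every key of the given length at the assumed rate."""
    return 2 ** key_bits / (RATE * SECONDS_PER_YEAR)

print(f"56-bit key space:  ~{years_to_search(56):.1f} years")
print(f"128-bit key space: ~{years_to_search(128):.2e} years")
```

At the assumed rate, exhausting a 56-bit key space takes on the order of a couple of years, while a 128-bit key space takes on the order of 10^22 years, which is why the controls drew the line where they did.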

In the 1990s, several challenges were launched against US regulations on the export of cryptography. Philip Zimmermann's Pretty Good Privacy (PGP) encryption program, along with its source code, was released in the US and found its way onto the Internet in June 1991. After a complaint by RSA Security (then called RSA Data Security, Inc., or RSADSI), Zimmermann was criminally investigated by the Customs Service and the FBI for several years, and was also the subject of a Senate inquiry, but no charges were ever filed[5]. The fact that Zimmermann could distribute PGP within the United States but was banned from exporting it, at a time when the Internet made exporting software trivially easy, struck many digital rights advocates as bizarre, and made any attempt to enforce the law seem quixotic.

Zimmermann's PGP was not the only software affected by export control regulations. Early Internet browsers such as Netscape Navigator were distributed in two versions: one for the United States with strong encryption enabled, and another for the rest of the world with only weak encryption. Once the Netscape source code was made available under an open source license, programmers outside the U.S. began distributing versions modified to contain a reimplementation of the strong encryption. Some open source operating systems also distributed strong encryption and simply asked users not to install components that could not legally be exported to them. The FreeS/WAN project deliberately did all its work outside the US to avoid American export restrictions.

Bernstein case

Daniel Bernstein, then a graduate student at the University of California at Berkeley, brought a lawsuit against the US government challenging the export restrictions as unconstitutional. The argument was that code is speech: programmers use it to communicate with each other as well as with the computer, so it is entitled to the free speech protection of the First Amendment to the U.S. Constitution. The case, filed in 1995 as Bernstein v. United States, was supported by the EFF. Bernstein won both in the trial court and on appeal[6].

In December 1996, U.S. District Judge Marilyn Patel ruled in favor of Bernstein, in effect striking down the State Department export regulations. The decision was somewhat moot, however, because a few weeks later the government transferred regulation of crypto export controls from the Department of State to the Department of Commerce. EFF renewed the suit against Commerce, arguing that the jurisdictional transfer did not change the underlying issues. In August 1997, Judge Patel once again ruled in favor of Bernstein: "The court declares that the Export Administration Regulations . . . insofar as they apply to or require licensing for encryption and decryption software and related devices and technology, are in violation of the First Amendment on the grounds of prior restraint and are, therefore, unconstitutional as discussed above, and shall not be applied to plaintiff's publishing of such items, including scientific papers, algorithms or computer programs." Although this decision technically struck down the export control laws, the government filed an emergency request, and Patel agreed to stay her order until the Appeals Court could rule on her decision. The case was heard by the 9th Circuit Court of Appeals in San Francisco in December 1997. In May 1999, the Court ruled 2-1 to uphold Judge Patel's decision.[7]

The courts ruled that source code for cryptographic algorithms and systems was protected as free speech by the Constitution. There is an archive of case documents.

Relaxation of regulations

Cryptography exports from the US are now much less strictly regulated than in the past, as a consequence of a major relaxation in 2000[2]. Many crypto advocates saw this regulatory change as a government attempt to get off the hook in the Bernstein case, avoiding a Supreme Court loss that could have overturned the regulations completely. There are no longer many restrictions on key sizes in US-exported mass-market software or open source software. However, notifying the Bureau of Export Administration is still required; the EFF argues that this requirement is unconstitutional as well.

In practice today, since the relaxation of US export restrictions, almost every Internet-connected personal computer anywhere in the world includes a US-sourced web browser such as Mozilla Firefox or Microsoft Internet Explorer, so almost every Internet user worldwide has strong cryptography in the browser's Transport Layer Security (TLS) or Secure Sockets Layer (SSL) stack. The Mozilla Thunderbird and Microsoft Outlook e-mail client programs can similarly connect to Internet Message Access Protocol (IMAP) or Post Office Protocol (POP) servers via TLS, and can send and receive email encrypted with S/MIME.

Many Internet users don't realize that their basic application software contains such extensive cryptography systems. These browsers and email programs are so ubiquitous that even governments whose intent is to regulate civilian use of cryptography generally don't find it practical to do much to control distribution or use of this quality of cryptography, so even when such laws are in force, actual enforcement is often lax.
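The strength of the cryptography now shipped by default in ordinary software can be inspected with any modern TLS library. As a sketch, Python's standard ssl module (used here simply as one example of a widely distributed TLS stack) enables only strong cipher suites in its default configuration:

```python
import ssl

# A default TLS context, comparable to the TLS/SSL stack inside a
# browser or e-mail client, enables only strong cipher suites
# out of the box -- there are no export-grade options left.
ctx = ssl.create_default_context()
suites = ctx.get_ciphers()  # list of dicts describing the enabled suites

weakest = min(c["alg_bits"] for c in suites)
print(f"{len(suites)} cipher suites enabled; "
      f"weakest symmetric key: {weakest} bits")
```

On a typical installation the weakest enabled suite uses a 128-bit symmetric key, well beyond the 56-bit ceiling of the old export rules; an ordinary user never has to choose or even see these settings.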

NSA involvement

Another contentious issue connected to cryptography in the United States is the influence of the National Security Agency (NSA) on high-quality cipher development and policy. NSA, like similar agencies in other countries, has two conflicting goals. It is responsible for protecting its own nation's communications, which requires strong ciphers. On the other hand, it is also responsible for listening in on rival governments, and sometimes on spies, criminals or terrorists; this gives it a motive for promoting weak ciphers. To a paranoid, anything coming from such an agency must be treated as suspect.

NSA was involved with the design of DES during its development at IBM and its consideration by the National Bureau of Standards as a possible Federal Standard for cryptography[8]. DES was designed to be secure against differential cryptanalysis[9], a powerful and general cryptanalytic technique known to NSA and IBM, that became publicly known only when it was rediscovered in the late 1980s[10]. According to Steven Levy, IBM discovered differential cryptanalysis[11] and kept the technique secret at NSA's request.

Escrowed encryption

Another instance of NSA's involvement was the 1993 Clipper chip affair, an escrowed encryption microchip intended to be part of the Capstone cryptography-control initiative. This implemented cryptography designed to be secure against almost everyone, but breakable at need by government agents. Keys were to be kept in escrow, but obtainable when required for law enforcement or national security wiretaps[1].

Clipper was widely criticized on cryptographic grounds. The cipher algorithm was classified, which violates Kerckhoffs' Principle: no cipher should be trusted until it has been published and subjected to independent analysis. Some were willing to overlook this; after all, it seems fairly safe to assume that NSA is competent and that its internal cipher-cracking experts would have analyzed the cipher. However, others expressed concerns that NSA might have deliberately made the cipher weak in order to assist its intelligence efforts.

The cipher, called Skipjack, was declassified in 1998 after the Clipper initiative lapsed.

Matt Blaze showed in 1994 [12] that the protocol was flawed and easily subverted.

CALEA

The Communications Assistance for Law Enforcement Act of 1994, also referred to as CALEA or the Digital Telephony Act, requires that all telephone switching equipment sold in the US provide wiretap capability. Unsurprisingly, this has been a controversial measure.

More recently, there has been debate over whether and how CALEA applies to the Internet, in particular to Voice over IP services. After extensive discussion, the IETF issued RFC 2804, IETF Policy on Wiretapping which says they have no intention of building monitoring requirements into Internet standards. However, it also points out that unencrypted Internet communications are easily monitored without the need for any new facilities.

Echelon

Echelon [13] is a joint monitor-the-world project involving the signals intelligence agencies of five English-speaking democracies — the US National Security Agency (NSA), British Government Communications Headquarters (GCHQ), Canadian Communications Security Establishment, Australia's Defence Signals Directorate (DSD) and New Zealand's Government Communications Security Bureau (GCSB). Each country maintains some monitoring stations on its territory or in overseas bases and has access to data gathered by all the others. Echelon monitors more-or-less everything that goes via communications satellites, which is much of the world's Internet and overseas telephone traffic, and various other things as well. Powerful computers and sophisticated pattern-matching are used to look for useful intelligence amid the massive data.

The US has been accused of using Echelon data to help US companies win foreign contracts, for example to help Boeing beat out Airbus. A former CIA director did not admit Echelon was involved, but did admit to US government economic espionage. He claimed it was necessary because various non-US companies were using bribery and kickbacks to win contracts.

Digital rights management

For more information, see: Digital Rights Management.


Cryptography is central to Digital Rights Management (DRM), a group of techniques for technologically controlling the use of copyrighted material, which are being widely implemented and deployed at the behest of copyright holders.

Digital Millennium Copyright Act

In 1998, US President Bill Clinton signed the Digital Millennium Copyright Act (DMCA) into law, criminalizing the production, dissemination, and use of certain cryptanalytic techniques and technology; specifically, those that could be used to circumvent Digital Rights Management schemes[14]. This had a potentially serious impact on the cryptography research community, since an argument can be made that virtually any cryptanalytic research violates, or might violate, the DMCA. The FBI has not enforced the DMCA as rigorously as some had feared, but the law nonetheless remains controversial. One well-respected cryptography researcher, Niels Ferguson, has publicly stated that he will not release some of his research into an Intel security design for fear of prosecution under the DMCA, and both Alan Cox (longtime number two in Linux kernel development) and Professor Edward Felten (along with some of his students at Princeton) have encountered problems related to the Act. Dmitry Sklyarov was arrested in the US, and jailed for some months, for alleged violations of the DMCA that occurred in Russia, where the work for which he was arrested and charged was legal.

The EFF has issued a series of reports on the effects of the DMCA. The current one is "Unintended Consequences: Ten Years under the DMCA"[15].

References

  1. Steven Levy (2001). Crypto: How the Code Rebels Beat the Government — Saving Privacy in the Digital Age. Penguin, p. 56. ISBN 0-14-024432-8.
  2. RSA Laboratories' Frequently Asked Questions About Today's Cryptography
  3. Cryptography & Speech from Cyberlaw
  4. The Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies
  5. "Case Closed on Zimmermann PGP Investigation", press note from the IEEE
  6. Bernstein v USDOJ, 9th Circuit court of appeals decision.
  7. Encryption and National Security, MIT Open Courseware
  8. "The Data Encryption Standard (DES)" from Bruce Schneier's CryptoGram newsletter, June 15, 2000
  9. Coppersmith, D. (May 1994). "The Data Encryption Standard (DES) and its strength against attacks" (PDF). IBM Journal of Research and Development 38 (3): 243.
  10. E. Biham and A. Shamir, "Differential cryptanalysis of DES-like cryptosystems", Journal of Cryptology, vol. 4 num. 1, pp. 3-72, Springer-Verlag, 1991.
  11. Levy, p. 56.
  12. Matt Blaze (1994). Protocol Failure in the Escrowed Encryption Standard.
  13. Nicky Hager (1996). Secret Power - New Zealand's Role in the International Spy Network. Craig Potton Publishing. 
  14. Digital Millennium Copyright Act
  15. EFF (2008). Unintended Consequences: Ten Years under the DMCA. Electronic Frontier Foundation.