Block cipher

In cryptography, block ciphers are one of the two main types of symmetric cipher; they operate on fixed-size blocks of plaintext, giving a block of ciphertext for each. The other main type is the stream cipher, which generates a continuous stream of keying material to be mixed with messages. The basic function of block ciphers is to keep messages or stored data secret; the intent is that an unauthorised person be completely unable to read the enciphered material. Block ciphers therefore use a key and are designed to be hard to read without that key. Of course an attacker's intent is exactly the opposite; he wants to read the material without authorisation, and often without the key. See cryptanalysis for his methods.

Among the best-known and most widely used block ciphers are two US government standards. The Data Encryption Standard (DES) from the 1970s is now considered obsolete; the Advanced Encryption Standard (AES) replaced it in 2002. To choose the new standard, the National Institute of Standards and Technology ran an AES competition. Fifteen ciphers were entered, five finalists selected, and eventually AES chosen. The text below gives an overview; for details of the process and the criteria, and descriptions of all fifteen candidates, see the AES competition article.

These standards greatly influenced the design of other block ciphers, and the latter part of this article is divided into sections based on that. DES and alternatives describes 20th century block ciphers, all with the 64-bit block size of DES. The AES generation describes the next generation, the first 21st century ciphers, all with the 128-bit block size of AES. Large-block ciphers covers a few special cases that do not fit in the other sections.

Context

Block ciphers are essential components in many security systems. However, just having a good block cipher does not give you security, much as just having good tires does not give you transportation.
It may not even help; good tires are of little use if you need a boat. Even in systems where block ciphers are needed, they are never the whole story. This section gives an overview of the rest of the story; it aims to provide a context for the rest of the article by mentioning some issues that, while outside the study of the ciphers themselves, are crucially important in understanding and using these ciphers.

Any cipher is worthless without a good key. Keys must be kept secure; they should be large enough and sufficiently random that searching for the key (a brute force attack) is effectively impossible; and in any application which encrypts large volumes of data, the key must be changed from time to time. See the cryptography article for discussion.

It is hard to design any system that must withstand adversaries; see cryptography is difficult. In particular, block ciphers must withstand cryptanalysis; it is impossible to design a good block cipher, or to evaluate the security of one, without a thorough understanding of the available attack methods. Also, Kerckhoffs' Principle applies to block ciphers; no cipher can be considered secure unless it can resist an attacker who knows all its details except the key in use. Analysis of security claims cannot even begin until all internal details of a cipher are published, so anyone making security claims without publishing those details will be either ignored or mocked by most experts.

A block cipher defines how a single block is encrypted; a mode of operation defines how multiple block encryptions are combined to achieve some larger goal. Using a mode that is inappropriate for the application at hand may lead to insecurity, even if the cipher itself is secure. A block cipher can be used to build another cryptographic function such as a random number generator, a stream cipher, or a cryptographic hash.
These are primarily a matter of choosing the correct mode, but there are more general design issues as well; see the linked articles for details.

Block ciphers are often used as components in hybrid cryptosystems; these combine public key (asymmetric) cryptography with secret key (symmetric) techniques such as block ciphers or stream ciphers. Typically, the symmetric cipher is the workhorse that encrypts large amounts of data; the public key mechanism manages keys for the symmetric cipher and provides authentication. Generally other components such as cryptographic hashes and a cryptographically strong random number generator are required as well. Such a system can only be as strong as its weakest link, and it may not even be that strong. Using secure components including good block ciphers is certainly necessary, but just having good components does not guarantee that the system will be secure. See hybrid cryptosystem for how the components fit together, and information security for broader issues.

That said, we turn to the block ciphers themselves.

Size parameters

One could say there are only three things to worry about in designing a block cipher: block size, key size, and strength against attacks cleverer than brute force.
Getting adequate block size and key size is the easy part; just choose large enough numbers. This section describes how those choices are made. Making ciphers that resist attacks that are cleverer than brute force (see cryptanalysis) is far more difficult. The following section, Principles and techniques, covers ideas and methods for that.

Later on, we describe two generations of actual ciphers. The 20th century ciphers use 64-bit blocks and key sizes from 56 bits up. The 21st century ciphers use 128-bit blocks and 128-bit or larger keys.

If two or more ciphers use the same block and key sizes, they are effectively interchangeable. One can replace another in almost any application without requiring any other change to the application. This might be done to comply with a particular government's standards, to replace a cipher against which some new attack had been discovered, to provide efficiency in a particular environment, or simply to suit a preference. Nearly all cryptographic libraries give a developer a choice of components, and some protocols such as IPsec allow a network administrator to select ciphers. This may be a good idea if all the available ciphers are strong, but if some are weak it just gives the developer or administrator, neither of whom is likely to be an expert on ciphers, an opportunity to get it wrong. There is an argument that supporting multiple ciphers is an unnecessary complication. On the other hand, being able to change ciphers easily if one is broken provides a valuable safety mechanism. Striking some sort of balance with a few strong ciphers is probably the best policy.

Block size

The block size of a cipher is chosen partly for implementation convenience; using a multiple of 32 bits makes software implementations simpler. However, it must also be large enough to guard against code book attacks. DES and the generation of ciphers that followed it all used a 64-bit block size.
To weaken such a cipher significantly the attacker must build up a code book with 2^{32} blocks, 32 gigabytes of data, all encrypted with the same key. As long as the cipher user changes keys reasonably often, a code book attack is not a threat. Procedures and protocols for block cipher usage therefore always include a rekeying policy. However, with Moore's Law making larger code books more practical, NIST chose to play it safe in their AES specifications; they used a 128-bit block size. This was a somewhat controversial innovation at the time (1998), since it meant changes to a number of applications and it was not absolutely clear that the larger size was necessary. However, it has since become common practice; later ciphers such as Camellia, SEED and ARIA also use 128 bits. There are also a few ciphers which either support variable block size or have a large fixed block size. See the section on large-block ciphers for details.

Key size

In theory, any cipher except a one-time pad can be broken by a brute force attack; the enemy just has to try keys until he finds the right one. However, the attack is practical only if the cipher's key size is inadequate. If the key uses n bits, there are 2^{n} possible keys and on average the attacker must test half of them, so the average cost of the attack is 2^{n-1} encryptions. Current block ciphers all use at least 128-bit keys, which makes brute force attacks utterly impractical. Suppose an attacker has a billion processors in a monster parallel machine (several orders of magnitude more than any current machine) and each processor can test a billion keys a second (also a generous estimate; if the clock is k GHz, the processor must do an encryption in k cycles to achieve this). This amazingly powerful attacker can test about 2^{60} keys a second, so he needs 2^{67} seconds against a 128-bit key. There are about 2^{25} seconds in a year, so that is about 2^{42} years.
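As a quick sanity check, the arithmetic can be computed directly, using the same hypothetical attacker assumed above (a billion processors, each testing a billion keys a second):

```python
# Back-of-the-envelope check of the brute force estimate above.
# Hypothetical attacker: 10**9 processors, each testing 10**9 keys/second.
keys_per_second = 10**9 * 10**9            # 1e18, roughly 2**60
average_tries = 2**127                     # half of the 2**128 keyspace
seconds = average_tries // keys_per_second
years = seconds // (60 * 60 * 24 * 365)    # about 2**25 seconds per year

print(f"{years:.2e}")                      # about 5.4e+12 years
```

The exact figure differs slightly from the rounded powers of two used in the text, but the conclusion is the same.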
This is over 4,000,000,000,000 (four trillion) years, so the cipher is clearly secure against brute force. Many ciphers support larger keys as well; the reasons are discussed in the brute force attack article.

Principles and techniques

This section introduces the main principles of block cipher design, defines standard terms, and describes common techniques. All of the principles and many of the terms and techniques discussed here for block ciphers also apply to other cryptographic primitives such as stream ciphers and cryptographic hash algorithms.

Iterated block ciphers

Nearly all block ciphers are iterated block ciphers; they have multiple rounds, each applying the same transformation to the output of the previous round. At setup time, a number of round keys or subkeys are computed from the primary key; the method used is called the cipher's key schedule. In the actual encryption or decryption, each round uses its own round key. This allows the designer to define some relatively simple transformation, called a round function, and apply it repeatedly to create a cipher with enough overall complexity to thwart attacks. Three common ways to design iterated block ciphers — SP networks, Feistel structures and the Lai-Massey construction — and two important ways to look at the complexity requirements — avalanche and nonlinearity — are covered in following sections.

Any iterated cipher can be made more secure by increasing the number of rounds or made faster by reducing the number. In choosing the number of rounds, the cipher designer tries to strike a balance that achieves both security and efficiency. Often a safety margin is applied; if the cipher appears to be secure after a certain number of rounds, the designer specifies a somewhat larger number for actual use.

There is a tradeoff that can be made in the design. With a simple fast round function, many rounds may be required to achieve adequate security; for example, GOST and TEA both use 32 rounds.
A more complex round function might allow fewer rounds; for example, IDEA uses only 8 rounds. Since the ciphers with fast round functions generally need more rounds and the ones with few rounds generally need slower round functions, neither strategy is clearly better. Secure and reasonably efficient ciphers can be designed either way, and compromises are common.

In cryptanalysis it is common to attack reduced-round versions of a cipher. For example, in attacking a 16-round cipher, the analyst might start by trying to break a two-round or four-round version. Such attacks are much easier. Success against the reduced-round version may lead to insights that are useful in work against the full cipher, or even to an attack that can be extended to break the full cipher.

Whitening and tweaking

Nearly all block ciphers use the same basic design, an iterated block cipher with multiple rounds. However, some have additional things outside that basic structure.

Whitening involves mixing additional material derived from the key into the plaintext before the first round, or into the ciphertext after the last round, or both. The technique was introduced by Ron Rivest in DESX and has since been used in other ciphers such as RC6, Blowfish and Twofish. If the whitening material uses additional key bits, as in DESX, then this greatly increases resistance to brute force attacks because of the larger key. If the whitening material is derived from the primary key during key scheduling, then resistance to brute force is not increased since the primary key remains the same size. However, using whitening is generally much cheaper than adding a round, and it does increase resistance to other attacks; see the papers cited for DESX.

A recent development is the tweakable block cipher ^{[1]}. Where a normal block cipher has only two inputs, plaintext and key, a tweakable block cipher has a third input called the tweak. The tweak, along with the key, controls the operation of the cipher.
Whitening can be seen as one form of tweaking, but many others are possible. If changing tweaks is sufficiently lightweight, compared to the key scheduling operation which is often fairly expensive, then some new modes of operation become possible. Unlike the key, the tweak need not always be secret, though it should be somewhat random and in some applications it should change from block to block. Tweakable ciphers and the associated modes are an active area of current research. The Hasty Pudding Cipher was one of the first tweakable ciphers, predating the Tweakable Block Ciphers paper and referring to what would now be called the tweak as "spice".

Avalanche

The designer wants changes to propagate quickly through the cipher. This was named the avalanche effect in a paper ^{[2]} by Horst Feistel. The idea is that changes should build up like an avalanche, so that a tiny initial change (consider a snowball tossed onto a mountain) quickly creates large effects. The term and its exact application were new, but the basic concept was not; avalanche is a variant of Claude Shannon's diffusion, and that in turn is a formalisation of ideas that were already in use.

If a single bit of input or of the round key is changed at round r, that should affect all bits of the ciphertext by round r+k for some reasonably small k. Ideally, k would be 1, but this is not generally achieved in practice. Certainly k must be much less than the total number of rounds; if k is large, then the cipher will need more rounds to be secure.

The strict avalanche criterion ^{[3]} is a strong version of the requirement for good avalanche properties. Complementing any single bit of the input or the key should give exactly a 50% chance of a change in any given bit of output.

Cipher structures

In Claude Shannon's ^{[4]} terms, a cipher needs both confusion and diffusion, and a general design principle is that of the product cipher, which combines several operations to achieve both goals.
This goes back to the combination of substitution and transposition in various classical ciphers from before the advent of computers. All modern block ciphers are product ciphers.

Two structures are very commonly used in building block ciphers — SP networks and the Feistel structure. The Lai-Massey construction is a third alternative, less common than the other two. In Shannon's terms, all of these are product ciphers. Any of these structures is a known quantity for a cipher designer, part of the toolkit. He or she gets big chunks of a design — an overall cipher structure with a well-defined hole for the round function to fit into — from the structure. This leaves him or her free to concentrate on the hard part, designing the actual round function. None of these structures gives ideal avalanche in a single round but, with any reasonable round function, all give excellent avalanche after a few rounds. Not all block ciphers use one of these structures, but most do. This section describes these common structures.

SP networks

A substitution-permutation network, SP network or SPN is Shannon's own design for a product cipher. It uses two layers in each round: a substitution layer provides confusion, then a permutation layer provides diffusion. The S-layer typically uses lookup tables called substitution boxes or S-boxes, though other mechanisms are also possible. The input is XORed with a round key, split into parts and each part used as an index into an S-box. The S-box output then replaces that part, so the combined S-box outputs become the S-layer output. S-boxes are discussed in more detail in their own section below. The P-layer permutes the resulting bits, providing diffusion or, in Feistel's terms, helping to ensure avalanche.

A single round of an SP network does not provide ideal avalanche; output bits are affected only by inputs to their S-box, not by all input bits.
However, the P-layer ensures that the output of one S-box in one round will affect several S-boxes in the next round so, after a few rounds, overall avalanche properties can be very good.

Feistel structure

Another way to build an iterated block cipher is to use the Feistel structure. This technique was devised by Horst Feistel of IBM and used in DES. Such ciphers are known as Feistel ciphers or Feistel networks. In Shannon's terms, they are another class of product cipher. Feistel ciphers are sometimes referred to as Luby-Rackoff ciphers after the authors of a theoretical paper ^{[5]} analyzing some of their properties. Later work ^{[6]} based on that shows that a Feistel cipher with seven rounds can be secure.

In a Feistel cipher, each round uses an operation called the F-function whose input is half a block and a round key; the output is a half-block of scrambled data which is XORed into the other half-block of text. The rounds alternate direction — in one, data from the left half-block is input and the right half-block is changed, and in the next round that is reversed.

Showing the half-blocks as L and R, bitwise XOR as ⊕ (each bit of the output word is the XOR of the corresponding bits of the two input words) and the round key for round n as k_{n}, even-numbered rounds are then:

    R_{n} = R_{n-1} ⊕ F(L_{n-1}, k_{n})
    L_{n} = L_{n-1}

and odd-numbered rounds are:

    L_{n} = L_{n-1} ⊕ F(R_{n-1}, k_{n})
    R_{n} = R_{n-1}

Since XOR is its own inverse (a⊕b⊕b = a for any a, b) and the half-block that is used as input to the F-function is unchanged in each round, reversing a Feistel round is straightforward. Just calculate the F-function again with the same inputs and XOR the result into the ciphertext to cancel out the previous XOR. For example, the decryption step matching the first example above is:

    R_{n-1} = R_{n} ⊕ F(L_{n-1}, k_{n}) = R_{n} ⊕ F(L_{n}, k_{n})

In some ciphers, including those based on SP networks, all operations must be reversible so that decryption can work. The main advantage of a Feistel cipher over an SP network is that the F-function itself need not be reversible, only repeatable.
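As a sketch (not a real cipher), the alternating rounds can be written out with a deliberately non-invertible F-function, a truncated hash chosen here purely for illustration, to show that decryption only re-computes F, never inverts it:

```python
import hashlib

def F(half: bytes, round_key: bytes) -> bytes:
    # Deliberately non-invertible F-function (truncated hash), chosen
    # only to illustrate that decryption never needs to invert F.
    return hashlib.sha256(half + round_key).digest()[:4]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(left: bytes, right: bytes, round_keys):
    # Rounds alternate direction: even rounds change the right half,
    # odd rounds change the left half; the F input is never modified.
    for i, k in enumerate(round_keys):
        if i % 2 == 0:
            right = xor(right, F(left, k))
        else:
            left = xor(left, F(right, k))
    return left, right

def decrypt(left: bytes, right: bytes, round_keys):
    # Undo the rounds in reverse order: recompute the same F output
    # and XOR it in again, cancelling the XOR done during encryption.
    for i in reversed(range(len(round_keys))):
        if i % 2 == 0:
            right = xor(right, F(left, round_keys[i]))
        else:
            left = xor(left, F(right, round_keys[i]))
    return left, right

keys = [bytes([i]) * 4 for i in range(8)]
l, r = encrypt(b"ABCD", b"EFGH", keys)
assert decrypt(l, r, keys) == (b"ABCD", b"EFGH")
```

Because each round XORs an F output into one half while leaving the F input untouched, running the rounds backwards with the same round keys cancels every XOR, no matter what F computes.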
This gives the designer extra flexibility; almost any operation he can think up can be used in the F-function. On the other hand, in the Feistel construction, only half the output changes in each round while an SP network changes all of it in a single round. A single round in a Feistel cipher has less than ideal avalanche properties; only half the output is changed. However, the other half is changed in the next round so, with a good F-function, a Feistel cipher can have excellent overall avalanche properties within a few rounds. It is possible to design a Feistel cipher so that the F-function itself has ideal avalanche properties — every output bit depends nonlinearly on every input bit and every key bit — details are in a later section.

There is a variant called an unbalanced Feistel cipher in which the block is split into two unequal-sized pieces rather than two equal halves. Skipjack was a well-known example. There are also variations which treat the text as four blocks rather than just two; MARS and CAST-256 are examples.

The hard part of Feistel cipher design is of course the F-function. Design goals include efficiency, easy implementation, and good avalanche properties. Also, it is critically important that the F-function be highly nonlinear. All other operations in a Feistel cipher are linear, and a cipher without enough nonlinearity is weak; see below.

Lai-Massey scheme

This structure was introduced in a thesis by Xuejia Lai, supervised by James Massey, in a cipher which later became the International Data Encryption Algorithm, IDEA.^{[7]} It has since been used in other ciphers such as FOX, later renamed IDEA NXT. Perhaps the best-known analysis is by Serge Vaudenay, one of the designers of FOX.
^{[8]} One paper ^{[9]} proposes a general class of "quasi-Feistel networks", with the Lai-Massey scheme as one instance, and shows that several of the well-known results on Feistel networks (such as the Luby-Rackoff and Patarin papers referenced above) can be generalised to the whole class. Another ^{[10]} gives some specific results for the Lai-Massey scheme.

Nonlinearity

To be secure, every cipher must contain nonlinear operations. If all operations in a cipher were linear then the cipher could be reduced to a system of linear equations and be broken by an algebraic attack. The attacker can choose which algebraic system to use; for example, against one cipher he might treat the text as a vector of bits and use Boolean algebra, while for another he might choose to treat it as a vector of bytes and use arithmetic modulo 2^{8}. The attacker can also try linear cryptanalysis. If he can find a good enough linear approximation for the round function and has enough known plaintext/ciphertext pairs, then this will break the cipher. Defining "enough" in the two places where it occurs in the previous sentence is tricky; see linear cryptanalysis.

What makes these attacks impractical is a combination of the sheer size of the system of equations used (large block size, whitening, and more rounds all increase this) and nonlinearity in the relations involved. In any algebra, solving a system of linear equations is more or less straightforward provided there are more equations than variables. However, solving nonlinear systems of equations is far harder, so the cipher designer strives to introduce nonlinearity to the system, preferably to have at least some components that are not even close to linear. Combined with good avalanche properties and enough rounds, this makes both direct algebraic analysis and linear cryptanalysis prohibitively difficult.

There are several ways to add nonlinearity; some ciphers rely on only one while others use several.
One method is mixing operations from different algebras. If the cipher relies only on Boolean operations, the cryptanalyst can try to attack using Boolean algebra; if it uses only arithmetic operations, he can try normal algebra. If it uses both, he has a problem. Of course arithmetic operations can be expressed in Boolean algebra or vice versa, but the expressions are inconveniently (for the cryptanalyst!) complex and nonlinear whichever way he tries it. For example, in the Blowfish F-function, it is necessary to combine four 32-bit words into one. This is not done with just addition, x = a+b+c+d, or just Boolean operations, x = a⊕b⊕c⊕d, but instead with a mixture, x = ((a+b)⊕c)+d. On most computers this costs no more, but it makes the analyst's job harder.

Other operations can also be used, albeit at higher costs. IDEA uses multiplication modulo 2^{16}+1 and AES does matrix multiplications with polynomials in a Galois field. Rotations, also called circular shifts, on words or registers are nonlinear in normal algebra, though they are easily described in Boolean algebra. GOST uses rotations by a constant amount, CAST-128 and CAST-256 use a key-dependent rotation in the F-function, and RC5, RC6 and MARS all use data-dependent rotations. A general operation for introducing nonlinearity is the substitution box or S-box; see the following section.

Nonlinearity is also an important consideration in the design of stream ciphers and cryptographic hash algorithms. For hashes, much of the mathematics and many of the techniques used are similar to those for block ciphers. For stream ciphers, rather different mathematics and methods apply (see the Berlekamp-Massey algorithm for example), but the basic principle is the same.

S-boxes

S-boxes or substitution boxes are lookup tables. The basic operation involved is a = sbox[b] which, at least for reasonable sizes of a and b, is easily done on any computer.
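As a small illustration, here is a hypothetical 4 by 4 S-box (the table values are invented for this example) used the way an SP network's S-layer would use it, together with the matching inverse table for decryption:

```python
# A hypothetical 4 by 4 S-box: 2**4 = 16 entries, each a 4-bit value.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
# Because the table is a permutation (same input and output size),
# it has an inverse, which decryption uses.
INV_SBOX = [SBOX.index(v) for v in range(16)]

def s_layer(block: int, table) -> int:
    # Split a 16-bit block into four 4-bit parts, look each part up,
    # and reassemble the results: the S-layer of a toy SP network.
    out = 0
    for shift in (12, 8, 4, 0):
        out |= table[(block >> shift) & 0xF] << shift
    return out

y = s_layer(0x1234, SBOX)
assert s_layer(y, INV_SBOX) == 0x1234   # substitution undone on decryption
```

The same lookup idea scales to the larger m by n S-boxes discussed next; only the index width and entry width change.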
S-boxes are described as m by n, with m representing the number of input bits and n the number of output bits. For example, DES uses 6 by 4 S-boxes. The storage requirement for an m by n S-box is 2^{m}n bits, so large values of m (many input bits) are problematic. Values up to eight are common and MARS has a 9 by 32 S-box; going much beyond that would be expensive. Large values of n (many output bits) are not a problem; 32 is common and at least one system, the Tiger hash algorithm ^{[11]}, uses 64.

S-boxes are often used in the S-layer of an SP network. In this application, the S-box must have an inverse to be used in decryption. It must therefore have the same number of bits for input and output; only n by n S-boxes can be used. For example, AES is an SP network with a single 8 by 8 S-box and Serpent is one with eight 4 by 4 S-boxes. Another common application is in the F-function of a Feistel cipher. Since the F-function need not be reversible, there is no need to construct an inverse S-box for decryption and S-boxes of any size may be used. With either an SP network or a Feistel construction, nonlinear S-boxes and enough rounds give a highly nonlinear cipher.

Large S-boxes

The first generation of Feistel ciphers used relatively small S-boxes, 6 by 4 for DES and 4 by 4 for GOST. In these ciphers the F-function is essentially one round of an SP network. The eight S-boxes give 32 bits of S-box output. Those bits, reordered by a simple transformation, become the 32-bit output of the F-function. Avalanche properties are less than ideal since each output bit depends only on the inputs to one S-box. The output transformation (a bit permutation in DES, a rotation in GOST) compensates for this, ensuring that the output from one S-box in one round affects several S-boxes in the next round so that good avalanche is achieved after a few rounds.

Later Feistel ciphers use larger S-boxes; CAST-128, CAST-256 and Blowfish all use four 8 by 32 S-boxes. They do not use S-box bits directly as F-function output.
Instead, they take a 32-bit word from each S-box, then combine them to form a 32-bit output. This gives an F-function with ideal avalanche properties; every bit of input and every bit of round key affects every bit of output. No output transformation is required in such an F-function, and Blowfish has none. However, one may be used anyway; the CAST ciphers add a key-dependent rotation. With the Feistel structure and such an F-function, complete avalanche — all 64 output bits depend on all 64 input bits — is achieved in three rounds, and the cipher meets the strict avalanche criterion ^{[3]}; complementing any input bit has a 50% chance of changing any given output bit.

These ciphers are primarily designed for software implementation, rather than the 1970s hardware DES was designed for, so looking up a full computer word at a time makes sense. An 8 by 32 S-box takes one kilobyte of storage; several can be used on a modern machine without difficulty. They need only four S-box lookups, rather than the eight in DES or GOST, so the F-function and therefore the whole cipher can be reasonably efficient.

S-box design

There is an extensive literature on the design of good S-boxes, much of it emphasizing achieving high nonlinearity, though other criteria are also used. See external links.

The CAST S-boxes use bent functions (the most highly nonlinear Boolean functions) as their columns. That is, the mapping from all the input bits to any single output bit is a bent function. A paper on generating the S-boxes is Mister & Adams, "Practical S-box Design" ^{[12]}. Bent functions are combined to get additional desirable traits — a balanced S-box (equal probability of 0 and 1 output), minimum correlation among output bits, and high overall S-box nonlinearity.

Blowfish uses a different approach, generating random S-boxes as part of the key scheduling operation at cipher setup time.
Such S-boxes are not as nonlinear as the carefully constructed CAST ones, but they are nonlinear enough and, unlike the CAST S-boxes, they are unknown to an attacker.

In perfectly nonlinear S-boxes ^{[13]}, not only are all columns bent functions (the most nonlinear possible Boolean functions), but all linear combinations of columns are bent functions as well. This is possible only if m ≥ 2n, that is, if there are at least twice as many input bits as output bits. Such S-boxes are therefore not much used.

S-boxes in analysis

S-boxes are sometimes used as an analytic tool even for operations that are not actually implemented as S-boxes. Any operation whose output is fully determined by its inputs can be described by an S-box; concatenate all inputs into an index, look that index up, get the output. For example, the IDEA cipher uses a multiplication operation with two 16-bit inputs and one 16-bit output; it can be modeled as a 32 by 16 S-box. In an academic paper, one might use such a model in order to apply standard tools for measuring S-box nonlinearity. A well-funded cryptanalyst might actually build the S-box (8 gigabytes of memory) either to use in his analysis or to speed up an attack.

Resisting linear & differential attacks

Two very powerful cryptanalytic methods of attacking block ciphers are linear cryptanalysis and differential cryptanalysis. The former works by finding linear approximations for the nonlinear components of a cipher, then combining them using the piling-up lemma to attack the whole cipher. The latter looks at how small changes in the input affect the output, and how such changes propagate through multiple rounds. These are the only known attacks that break DES with less effort than brute force, and they are completely general attacks that apply to any block cipher. Both these attacks, however, require large numbers of known or chosen plaintexts, so a simple defense against them is to rekey often enough that the enemy cannot collect sufficient texts.
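The scale of the text requirement can be sketched numerically. The per-round bias of 1/8 below is a hypothetical figure chosen for illustration; the piling-up lemma says that chaining r independent round approximations, each with bias ε, gives a combined bias of 2^{r-1}ε^{r}, and a linear attack needs on the order of 1/bias^{2} known plaintexts:

```python
def texts_needed(rounds: int, round_bias: float = 1 / 8) -> float:
    # Piling-up lemma: combined bias of `rounds` chained approximations,
    # each with the (hypothetical) per-round bias `round_bias`.
    bias = 2 ** (rounds - 1) * round_bias ** rounds
    # A linear attack needs on the order of 1/bias**2 known plaintexts.
    return bias ** -2

for r in (4, 8, 16):
    print(r, f"{texts_needed(r):.1e}")
# 4  -> 2.6e+05 texts
# 8  -> 1.7e+10 texts
# 16 -> 7.4e+19 texts
```

With these illustrative numbers, each added round multiplies the requirement by 16, so enough rounds push it beyond any amount of text an attacker could ever collect under a single key.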
Techniques introduced for CAST go further, building a cipher that is provably immune to linear or differential analysis with any number of texts. The method, taking linear cryptanalysis as our example and abbreviating it LC, is as follows:

- bound the bias of the best possible linear approximation of the round function
- use the piling-up lemma to bound the bias of any linear approximation of the full cipher over a given number of rounds
- from that bound, calculate the number of known plaintexts an LC attack would require
- choose the number of rounds so that the requirement exceeds the number of texts that can ever exist, for example the 2^{64} possible blocks of a 64-bit cipher
A similar approach applied to differentials gives a number of rounds that makes differential cryptanalysis impractical or impossible. Choose the actual number of rounds so that, at a minimum, both attacks are impractical. Ideally, make both impossible, then add a safety factor.

There are other methods of constructing ciphers provably immune to these attacks. Serge Vaudenay's work on decorrelation theory gives one ^{[14]} and Knudsen-Nyberg ciphers are another. ^{[15]} This type of analysis is now a standard part of the cryptographer's toolkit. Many of the AES candidates, for example, included proofs along these lines in their design documentation, and AES itself uses such a calculation to determine the number of rounds required for various key sizes.

DES and alternatives

The Data Encryption Standard, DES, is among the best known and most thoroughly analysed block ciphers. It was invented by IBM and was made a US government standard, for non-classified government data and for regulated industries such as banking, in the late 70s. From then until about the turn of the century, it was very widely used. It is now considered obsolete because its 56-bit key is too short to resist brute force attacks if the opponents have recent technology.

The DES standard marked the beginning of an era in cryptography. Of course, much work continued to be done in secret by military and intelligence organisations of various nations, but from the time of DES cryptography also developed as an open academic discipline complete with journals, conferences, courses and textbooks. In particular, there was a lot of work related to block ciphers. For an entire generation, every student of cryptanalysis tried to find a way to break DES and every student of cryptography tried to devise a cipher that was demonstrably better than DES. Very few succeeded. Every new cryptanalytic technique invented since DES became a standard has been tested against DES.
None of them has broken it completely, but two, differential cryptanalysis and linear cryptanalysis, give attacks theoretically significantly better than brute force. This does not appear to have much practical importance, since both require enormous numbers of known or chosen plaintexts all encrypted with the same key, so reasonably frequent key changes provide an effective defense. All the older publicly known cryptanalytic techniques have also been tried, or at least considered, for use against DES; none of them work. DES served as a sort of baseline for cipher design through the 80s and 90s; the design goal for almost any 20th century block cipher was to replace DES in some of its many applications with something faster, more secure, or both. All these ciphers used 64-bit blocks, like DES, but most used 128-bit or longer keys for better resistance to brute force attacks. Ciphers of this generation include Triple DES, CAST-128, Blowfish, IDEA, RC5 and SAFER.
Many of the techniques used in these ciphers came from DES, and many of the design principles came from analysis of DES. However, there were also new design ideas. The CAST ciphers were the first to use large S-boxes, which allow the F-function of a Feistel cipher to have ideal avalanche properties, and to use bent functions in the S-box columns. Blowfish introduced key-dependent S-boxes. Several introduced new ways to achieve nonlinearity: data-dependent rotations in RC5, key-dependent rotations in CAST-128, a clever variant on multiplication in IDEA, and the pseudo-Hadamard transform in SAFER. The era effectively ended when the US government began working on a new cipher standard to replace the Data Encryption Standard, the Advanced Encryption Standard or AES. A whole new generation of ciphers arose, the first 21st century block ciphers. Of course these designs still drew on the experience gained in the post-DES generation, but overall these ciphers are quite different. In particular, they all use 128-bit blocks and most support key sizes up to 256 bits.

The AES generation

By the 1990s, the Data Encryption Standard was clearly obsolete; its small key size made it more and more vulnerable to brute force attacks as computers became faster. The US National Institute of Standards and Technology (NIST) therefore began work on an Advanced Encryption Standard, AES, a block cipher to replace DES in government applications and in regulated industries. To do this, they ran a very open international AES competition, starting in 1998. Their requirements specified a block cipher with a 128-bit block size and support for 128-bit, 192-bit or 256-bit key sizes. Evaluation criteria included security, performance on a range of platforms from 8-bit CPUs (e.g. in smart cards) up, and ease of implementation in both software and hardware. Fifteen submissions meeting basic criteria were received. All were iterated block ciphers; in Shannon's terms, all were product ciphers.
Most used an SP network or Feistel structure, or variations of those. Several had proofs of resistance to various attacks. The AES competition article covers all candidates, and many have their own articles as well. Here we give only a summary. After much analysis and testing, and two conferences, the field was narrowed to five finalists: MARS, RC6, Rijndael, Serpent and Twofish.
After another year of analysis and testing, a winner was chosen: in October 2000, NIST announced that Rijndael would become the Advanced Encryption Standard, and the resulting standard took effect in 2002. See external links for the official standard. An entire generation of block ciphers used the 64-bit block size of DES, but since AES many new designs use a 128-bit block size. As discussed under size parameters, if two or more ciphers have the same block and key sizes, then they are effectively interchangeable; replacing one cipher with another requires no other changes in an application. When asked to implement AES, the implementer might include the other finalists, Twofish, Serpent, RC6 and MARS, as well. This provides useful insurance against the (presumably unlikely) risk of someone finding a good attack on AES. Little extra effort is required, since open source implementations of all these ciphers are readily available; see external links. All except RC6 have completely open licenses. There are also many other ciphers that might be used. Ten AES candidates did not make it into the finals: CAST-256, Crypton, DEAL, DFC, E2, FROG, HPC, LOKI97, Magenta and SAFER+.
Some of these should not be considered: Magenta and FROG have been broken, DEAL is slow, and E2 has been replaced by its descendant Camellia. There are also some newer 128-bit ciphers that are widely used in certain countries, such as Camellia in Japan, SEED and ARIA in South Korea, and SM4 in China.
Large-block ciphers

For most applications a 64-bit or 128-bit block size is a fine choice; nearly all common block ciphers use one or the other. Such ciphers can be used to encrypt objects larger than their block size; just choose an appropriate mode of operation. For nearly all ciphers, the block size is a power of two. Joan Daemen's PhD thesis, though, had two exceptions: 3-Way uses 96-bit blocks (three 32-bit words) and BaseKing 192-bit blocks (three 64-bit words). Neither cipher was widely used, but they did influence later designs. Daemen was one of the designers of Square and of Rijndael, which became the Advanced Encryption Standard. A few ciphers supporting larger block sizes do exist; this section discusses them. A block cipher with larger blocks may be more efficient; it takes fewer block operations to encrypt a given amount of data. It may also be more secure in some ways; diffusion takes place across a larger block, so data is more thoroughly mixed, and large blocks make a codebook attack more difficult. On the other hand, great care must be taken to ensure adequate diffusion within a block, so a large-block cipher may need more rounds. Larger blocks also require more padding, and there is not a great deal of literature on designing and attacking such ciphers, so it is hard to know whether any particular one is secure. Large-block ciphers are inconvenient for some applications and simply do not fit in some protocols. Some block ciphers, such as Block TEA and Hasty Pudding, support variable block sizes. They may therefore be both efficient and convenient in applications that need to encrypt many items of a fixed size, for example disk blocks or database records. However, just using the cipher in ECB mode to encrypt each block under the same key is unwise, especially when encrypting many objects. With ECB mode, identical plaintext blocks encrypt to identical ciphertext, which gives the enemy some information.
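The ECB weakness just described can be demonstrated with a toy block cipher. The cipher below is a deliberately insecure stand-in built from SHA-256 purely for illustration; the key, the block values, and the IV derivation are all invented for this sketch:

```python
import hashlib

def toy_encrypt(key: bytes, block: bytes) -> bytes:
    # Toy 16-byte block cipher (NOT secure, for demonstration only):
    # a 4-round Feistel network with SHA-256 as the round function.
    left, right = block[:8], block[8:]
    for rnd in range(4):
        f = hashlib.sha256(key + bytes([rnd]) + right).digest()[:8]
        left, right = right, bytes(a ^ b for a, b in zip(left, f))
    return left + right

key = b"example key"
disk_blocks = [b"A" * 16, b"B" * 16, b"A" * 16]  # blocks 0 and 2 are identical

# ECB: each block enciphered independently under the same key.
ecb = [toy_encrypt(key, blk) for blk in disk_blocks]
print(ecb[0] == ecb[2])  # True: identical plaintext blocks leak

# One fix in the spirit described below: whiten each block with an
# IV derived from its block number before encrypting.
def iv_for(n: int) -> bytes:
    return hashlib.sha256(b"iv" + n.to_bytes(8, "big")).digest()[:16]

fixed = [toy_encrypt(key, bytes(a ^ b for a, b in zip(blk, iv_for(i))))
         for i, blk in enumerate(disk_blocks)]
print(fixed[0] == fixed[2])  # False: equal blocks no longer match
```

Any deterministic cipher shows the same ECB behaviour; the leak is a property of the mode, not of this particular toy cipher.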
One solution is to use a tweakable cipher such as Hasty Pudding, with the block number or other object identifier as the tweak. Another is to use CBC mode with an initialisation vector derived from an object identifier. Cryptographic hash algorithms can be built using a block cipher as a component. There are general-purpose methods for this that can use existing block ciphers; Applied Cryptography ^{[16]} gives a long list and describes weaknesses in many of them. However, some hashes include a special-purpose block cipher as part of the hash design. One example is Whirlpool, a 512-bit hash using a block cipher similar in design to AES but with 512-bit blocks and a 512-bit key. Another is the Advanced Hash Standard candidate Skein, which uses a tweakable block cipher called Threefish. Threefish has 256-bit, 512-bit and 1024-bit versions; in each version, block size and key size are both that number of bits. It is possible to go the other way and use any cryptographic hash to build a block cipher; again, Applied Cryptography ^{[16]} has a list of techniques and describes weaknesses. The simplest method is to make a Feistel cipher with double the hash's block size; the F-function is then just to hash text and round key together. This technique is rarely used, partly because a hash makes a rather expensive round function and partly because the block cipher's block size would have to be inconveniently large; for example, using a 160-bit hash such as SHA-1 would give a 320-bit block cipher. The hash-to-cipher technique was, however, important in one legal proceeding, the Bernstein case. At the time, US law strictly controlled export of cryptography because of its possible military uses, but hash functions were allowed because they are designed to provide authentication rather than secrecy. Bernstein's code built a block cipher from a hash, effectively circumventing those regulations.
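A minimal sketch of the Feistel-from-hash construction described above, using SHA-1 to obtain a 320-bit block cipher. The round count, key and plaintext here are arbitrary choices for illustration, not part of any standard construction:

```python
import hashlib

def feistel_encrypt(key: bytes, block: bytes, rounds: int = 8) -> bytes:
    # 320-bit (40-byte) block cipher built from the 160-bit hash SHA-1:
    # a Feistel network whose F-function simply hashes the right half
    # together with the key and the round number.
    assert len(block) == 40
    left, right = block[:20], block[20:]
    for rnd in range(rounds):
        f = hashlib.sha1(key + bytes([rnd]) + right).digest()
        left, right = right, bytes(a ^ b for a, b in zip(left, f))
    return left + right

def feistel_decrypt(key: bytes, block: bytes, rounds: int = 8) -> bytes:
    # Run the rounds backwards; the Feistel structure makes decryption
    # possible even though SHA-1 itself is one-way.
    left, right = block[:20], block[20:]
    for rnd in reversed(range(rounds)):
        f = hashlib.sha1(key + bytes([rnd]) + left).digest()
        left, right = bytes(a ^ b for a, b in zip(right, f)), left
    return left + right

key = b"demo key"
plaintext = b"forty bytes of plaintext, padded out...."
ciphertext = feistel_encrypt(key, plaintext)
print(feistel_decrypt(key, ciphertext) == plaintext)  # True
```

Note how the one-way hash never needs to be inverted: each decryption round recomputes the same F-output the encryption round used and XORs it away.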
He then sued the government over his right to publish his work, claiming the export regulations were an unconstitutional restriction on freedom of speech. The courts agreed, effectively striking down the export controls. It is also possible to use a public key operation as a block cipher. For example, one might use the RSA algorithm with 1024-bit keys as a block cipher with 1024-bit blocks. Since the operation is itself cryptographically secure, only one round is needed. However, this is rarely done; public key techniques are expensive, so this would give a very slow block cipher. A much more common practice is to use public key methods, block ciphers, and cryptographic hashes together in a hybrid cryptosystem.

References
