Random number generator

While defining true randomness can drift from computer science and mathematics into philosophy, Bruce Schneier has a reasonable criterion for a source of random numbers: "It cannot be reliably reproduced." This article focuses first on the idea of randomness, rather than on how the random or pseudo-random sequence is produced. For properly selected applications, it may be possible to be adequately random with a technique that does not depend on a true random physical process. In other cases, it may be practical to use a combination of physical and computational methods.

In any of these cases, wise implementers do not assume randomness or even adequate pseudo-randomness. Even when the generator is an apparently random physical process, the motto "trust, but verify" still holds: some physical phenomena may indeed be random in the short or long term, but the nature of the physical resource (e.g., the declining radioactivity of a source) may affect its properties. It may be possible to compensate for a weak area, but only if it is known.

A truly random one-time pad may be generated by measuring a random physical phenomenon (e.g., thermal noise or radioactive disintegration), capturing the measurements with a computer, and placing them in high-density storage. An expert in the physical phenomenon being measured should be consulted to determine whether postprocessing is needed.

There is a wide range of applications for random or pseudo-random numbers, requiring various degrees of randomness. A computer game can give an apparently different scenario whenever played, using some simple, fairly random physical inputs, such as selected bits from the computer's time-of-day clock and, perhaps, the times between the last several keystrokes or mouse movements. In other cases, such as extremely critical cryptography, only the best uncorrelated physical random sequences will do. See related articles for examples of applications.

Manual generation
In the past, one-time pads were generated by typists who were told to type randomly but who, in practice, produced patterns. Limited to manual methods, a not-unreasonable approach is to put numbered balls in a bowl and have a blindfolded person draw one; the number is recorded, the ball is returned to the bowl, and the bowl is shaken thoroughly before the next draw.

This is a point mostly of historical interest, but typist patterns did cause weakness in some one-time pads. It is hard to imagine why manual generation would be useful today.

Random sequences from physical phenomena
Real-world physical phenomena often exhibit randomness, although a source that is random in the short term may exhibit unexpected patterns or self-similarity over longer periods. The number of cosmic rays striking an object in a given second is as random a process as is known. The number of disintegration events in a radioactive material is often considered random, but if the amount of radioisotope is finite, then as more and more half-lives pass there will be fewer disintegrations. There is no simple answer to the question of whether such a downward trend makes the sequence nonrandom.

There are, however, methods that can be used to postprocess such a source to remove bias and correlation.

One widespread technique is due to John von Neumann. Take the input bits in pairs, discard 11 or 00 pairs, output one if the input is 10 and zero if the input is 01. As long as the input bits are independent, this completely eliminates bias. For example, suppose the input is 90% ones. The chance of a 10 sequence is .9*.1 while that of a 01 sequence is .1*.9; the two are exactly equal so the output is unbiased. If the inputs are unbiased, then the four input combinations 00 01 10 and 11 are equally likely and on average you get two bits out for eight in. Any bias reduces this ratio; with large bias the technique can be quite inefficient. In our example with 90% bias, 81% of pairs are 11 and 1% are 00; these are thrown away. On average, 200 bits of input (100 pairs) give only 18 output bits. More problematic in some applications, there is some chance of getting a long run of 11 inputs and generating no output at all for an appreciable time. Also, if the input bits are correlated rather than independent, the technique works less well.
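The pairing rule described above can be sketched in a few lines of Python; the function name is an illustrative choice, not a standard API:

```python
def von_neumann_extract(bits):
    """Debias independent input bits using von Neumann's technique:
    read bits in pairs, discard 00 and 11 pairs, and output
    1 for a 10 pair and 0 for a 01 pair."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:                       # keep only 10 and 01 pairs
            out.append(1 if a == 1 else 0)
    return out
```

As the article notes, a heavily biased input wastes most of its pairs: a stream of 90% ones discards about 82% of pairs, and correlated inputs weaken the guarantee entirely.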

Another common technique is to apply a cryptographic hash to the input. Suppose we estimate that there is about 1 bit of randomness per input byte and our hash is SHA-1 with 160-bit output. Hash 160 input bytes and we should have about 160 bits of randomness; the whole hash output should be unbiased and adequately random. In practice, one would hash more than 160 bytes to give a safety margin, and might use only part of the hash output as random output to avoid giving an enemy enough information to determine the internal state of the hash.

In theory, any high-grade compression algorithm could be substituted for the hash provided any non-random headers are thrown away; well-compressed data is very close to random. However, this technique is rare in practice. To be confident that it was secure, it would need thorough cryptographic analysis of the compression methods; the general practice is therefore to just use a cryptographic hash that has already undergone analysis.

Often a buffer is used, so the sequence is something like:
 * 1) mix some input into the buffer
 * 2) hash the buffer to get 160 bits of hash output
 * 3) use 64 bits as random output
 * 4) mix the other 96 hash output bits back into the buffer

There are endless variations on this. Many operating systems provide a random device as a source of random numbers for applications. In these, the process is asynchronous, with input data mixed in as it arrives and output produced as required; care must be taken to ensure that there is enough randomness to support the output. All such devices use a hash to mix the data, and some use a cipher for output processing as well &mdash; the Yarrow generator uses a block cipher in counter mode while the OpenBSD device uses a stream cipher.
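The four-step buffered cycle can be sketched as follows. The buffer size, the SHA-1 hash, and the 64/96-bit split come from the description above; the class name and the choice of XOR for mixing are illustrative assumptions:

```python
import hashlib

class Pool:
    """Toy version of the buffered cycle: mix input into a buffer,
    hash the buffer, emit 64 bits, fold the other 96 bits back in."""

    def __init__(self):
        self.buf = bytearray(64)

    def mix(self, data: bytes):
        # step 1: mix input into the buffer (XOR is one simple choice)
        for i, b in enumerate(data):
            self.buf[i % len(self.buf)] ^= b

    def output(self) -> bytes:
        # step 2: hash the buffer to get 160 bits
        h = hashlib.sha1(self.buf).digest()
        # step 4: mix the other 96 hash bits (12 bytes) back in
        self.mix(h[8:])
        # step 3: use 64 bits (8 bytes) as random output
        return h[:8]
```

A real operating-system device runs this asynchronously and must track how much input randomness has accumulated before allowing output.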

Various sources may be used to provide (partly) random inputs to the process. A Geiger counter or a radio tuned so it only gets static are possible sources. A digital camera can be used, either pointed at some random physical process or pointed at a plain background to use its circuit noise. Silicon Graphics built one called lavarand using a video camera pointed at a group of lava lamps; the descendant lavarnd uses a digital camera with the lens cap on. A microphone can be used in much the same two ways. The Turbid generator is a sophisticated design using a sound card, without a microphone, for input. There is some research work on laser-based sources with very high data rates.

Hardware generators can be designed with internal sources of random bits. Thermal noise in diodes is often used. An alternative is to run two oscillators with different frequencies and without good frequency stability on either. The slower one acts as a clock, determining when to take the faster one's state as an output bit. The frequency ratio is typically around 100:1, so quite minor variations in the slow one have large effects on choice of data from the fast one.

A real generator will often combine several of these techniques. For example, the RNGs built into some Intel chipsets use two oscillators, and modulate the frequency of the slower one with thermal noise. The hardware includes a de-skewing operation based on Von Neumann's technique and the software driver uses SHA-1 hashing.

Pseudorandom number generators
In practical computing, it is convenient to use a pseudorandom number generator (PRNG), which produces a sufficiently varying sequence that, for example, a computer game driven with that generator appears to produce different characteristics each time it is played. Nevertheless, most PRNGs supplied as operating system utilities, such as those in UNIX, will eventually repeat their sequence over a sufficiently long period of time. In addition, the programmer may have the choice of giving the PRNG an explicit initialization value, which may be called a seed or a nonce.

Most general-purpose PRNGs will produce the same sequence as long as they are given the same seed, which can even be useful for some software development purposes (see pitfalls in computer simulation), or for such things as ensuring that a series of psychological research volunteers all see the same set of events. A given PRNG, however, may be told to use some reasonably random physical event in the computer as the seed, such that the seed is unpredictable.

Donald Knuth, an authority on PRNGs, quoted John von Neumann's tongue-in-cheek but realistic view of the limits of PRNGs: "Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin."

It is possible to write very good and very bad pseudorandom number generators. Known pitfalls need to be avoided, both in initialization and in computing the next number.

Known bad ideas
Any PRNG must be initialised with a seed. If you have any concern about someone attacking your generator, such seeds must be truly random. Using something like your computer's clock or your process ID, or even a combination of such things, makes it dangerously easy for an enemy to guess the value and break your generator. In a well-known example, Ian Goldberg and David Wagner broke an early version of Netscape's SSL server in exactly this way. Using another PRNG to provide a seed only moves the problem without solving it, since that PRNG must also be initialised.

Several common types of PRNG &mdash; such as linear congruential generators or linear feedback shift registers &mdash; cannot be used directly in adversarial applications. With well-chosen parameters, they have reasonable statistical properties and are useful for such things as controlling Monte Carlo simulations. However, in the presence of an adversary &mdash; someone who wants to break your cryptosystem or cheat at your game &mdash; they are inadequate, easily broken. They might be used as components in some more complex generator, but they cannot safely be used alone in such applications.

"Cryptanalytic Attacks on Pseudorandom Number Generators" by Kelsey, Schneier, Wagner, and Hall enumerates attacks and assesses vulnerabilities in widely used generators.

Linear congruential
Knuth illustrates the most commonly used PRNG, which he attributes to D.H. Lehmer in 1949. This is an easy generator to implement in software and is very widely used. For example, the rand function in most C libraries works this way.

A linear congruential generator creates successive outputs x_n for n = 1, 2, 3, ... using the formula x_i = (a * x_(i-1) + b) mod m. The seed is the initial value x_0.

Choice of the constants a, b and the modulus m is critical; bad choices can give very weak generators.

The period of such a generator is at most m (at most m-1 for a multiplicative generator with b = 0); a, b and m are normally chosen so that this maximum is achieved. Schneier provides a table of recommended choices for a, b and m.
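A minimal implementation of the formula above; the constants used as defaults here are one well-known published choice (from Numerical Recipes), not a recommendation from this article:

```python
def lcg(seed, a=1664525, b=1013904223, m=2**32):
    """Linear congruential generator: x_i = (a * x_(i-1) + b) mod m.
    Bad choices of a, b and m give very weak generators; these
    defaults are a widely published example set."""
    x = seed
    while True:
        x = (a * x + b) % m
        yield x
```

Given the same seed, the generator reproduces the same sequence, which is exactly the property exploited for repeatable simulations and exactly why, unmodified, it is unsafe against an adversary.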

Shift registers
Many random number generators based on shift registers with feedback are possible. The commonest type uses linear feedback shift registers. These are very easy to implement in hardware and reasonable in software. They are commonly used in stream ciphers; we describe some in the stream cipher article.

OFB and CTR Sequences
A block cipher can provide a good pseudo-random sequence; the modes of operation for this are Output Feedback (OFB) or Counter (CTR) mode.

Encryption devices can be used as practical and strong PRNGs, provided some type of masking is applied to the output so an adversary cannot learn the full internal state of the generator.
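A counter-mode keystream can be sketched as follows. A real design would use a block cipher such as AES; to keep this example free of external libraries, SHA-256 over (key || counter) stands in as the keyed primitive, which preserves the counter-mode structure but is not the standard construction:

```python
import hashlib

def ctr_keystream(key: bytes, nblocks: int) -> bytes:
    """Counter-mode sketch: encrypt successive counter values under a
    fixed key and concatenate the results. SHA-256(key || counter)
    stands in here for a block cipher encryption of the counter."""
    out = b""
    for counter in range(nblocks):
        block = hashlib.sha256(key + counter.to_bytes(16, "big")).digest()
        out += block
    return out
```

The sequence is fully determined by the key, so anyone holding the key can regenerate it, while anyone without the key sees output that is as hard to predict as the underlying primitive is to break.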

Blum Blum Shub
This is the strongest known generator, with the advantage of relative simplicity but the disadvantage of being computationally intensive. It may be practical when random numbers are not needed frequently.

The possibility also exists that one or more processors could be dedicated to BBS computation.
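The core of Blum Blum Shub is the recurrence x_(i+1) = x_i^2 mod n, where n = p*q for primes p and q each congruent to 3 mod 4, with one low-order bit of state taken per step. The sketch below uses deliberately tiny primes so it can run as an example; a real generator needs large secret primes and a seed coprime to n:

```python
def bbs_bits(seed, nbits, p=11, q=23):
    """Blum Blum Shub sketch: square the state modulo n = p*q and
    output the least significant bit of each successive state.
    p = 11 and q = 23 (both 3 mod 4) are toy values for illustration
    only; secure use requires large secret primes."""
    n = p * q
    x = seed % n
    bits = []
    for _ in range(nbits):
        x = (x * x) % n
        bits.append(x & 1)
    return bits
```

Each step costs a modular squaring of a large number, which is why the article describes the generator as computationally intensive and suggests dedicating processors to it.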

The possibility of pseudo-random methods that are adequately random
Some research indicates that in carefully selected situations, a pseudo-random number generator, which meets a number of cryptographic criteria, may be adequately random for security. The proofs are complex and not universally accepted, but may be true for a useful number of situations.

Random numbers that follow particular statistical distributions
In many applications, the sequence should be as random as possible, but it may also be under additional constraints to meet specific statistical requirements.

See generating values for normal random variables for an example of generating random numbers that fit a desired distribution.

Testing for Randomness
Knuth presents an extensive range of tests, of which a few are mentioned here.

It is entirely reasonable to have individual numbers recur in a random sequence &mdash; a random sequence may even have a "run" of a number such as 100 354 972 972 972 155 579. Indeed, when typists have been asked to type randomly, they may intuitively avoid runs and produce a less random sequence. However, one statistical test for randomness is to look at the frequency of runs.

Frequency test
The simplest test is to chop the output up into conveniently-sized chunks and look at frequency of possible chunk values over time. For example, if bytes are the unit, all 256 possible values should occur and their frequencies should be approximately equal.
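This frequency test can be made quantitative with a chi-square statistic over byte values; the function name is an illustrative choice:

```python
from collections import Counter

def chi_square_bytes(data: bytes) -> float:
    """Frequency test over bytes: compare observed counts of each of
    the 256 possible values against the uniform expectation
    len(data)/256 and return the chi-square statistic. For random
    data it should be near the degrees of freedom, 255."""
    expected = len(data) / 256
    counts = Counter(data)
    return sum((counts.get(v, 0) - expected) ** 2 / expected
               for v in range(256))
```

A statistic far above 255 indicates biased output; a statistic of exactly zero over a long run is also suspicious, since genuinely random data fluctuates around the expectation rather than matching it perfectly.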

Computer simulation
If it is desired to do repeated runs of a simulation, either for software testing or to evaluate variable components under repeatable conditions, a PRNG that reproduces the same sequence from the same seed is useful.

Cryptography
Cryptographic one-time pads are preferably generated from physical random phenomena. For high-volume ciphers, however, reproducibility is needed, so while the initialization variables tend to be truly random, a key generator for a stream or block cipher uses a demonstrably strong pseudo-random number generator.