Hardware random number generator



In computing, a hardware random number generator is an apparatus that generates random numbers from a physical process. Such devices are often based on microscopic phenomena such as thermal noise or the photoelectric effect or other quantum phenomena. These processes are, in theory, completely unpredictable, and the theory's assertions of unpredictability are subject to experimental test. A quantum-based hardware random number generator typically contains an amplifier to bring the output of the physical process into the macroscopic realm, and a transducer to convert the output into a digital signal.

Random number generators can also be built from macroscopic phenomena, such as playing cards, dice, and the roulette wheel. The presence of unpredictability in these phenomena can be justified by the theory of unstable dynamical systems and chaos theory. These theories suggest that even though macroscopic phenomena are deterministic in theory under Newtonian mechanics, real-world systems evolve in ways that cannot be predicted in practice because one would need to know the micro-details of initial conditions and subsequent manipulation or change.

Although dice have mostly been used in gambling, and in recent times as 'randomizing' elements in games (e.g., role-playing games), the Victorian scientist Francis Galton described a way to use dice to explicitly generate random numbers for scientific purposes in 1890. Though some gamblers believe they can control their throws of dice well enough to win at dice games (a long-debated claim), no one has produced a way to exploit the claimed effect in either generating or attacking physical randomness sources.

Hardware random number generators are often relatively slow, and they may produce a biased sequence (i.e., some values are more common than others). Whether a hardware random number generator is suitable for a particular application depends on the details of both the application and the generator.

Pseudo-random generators

Most computer "random number generators" are not hardware devices, but are software routines implementing generator algorithms. They are often supplied as library routines in programming language implementations, or as part of the operating system. These are more properly called pseudo-random number generators, since, being finite state machines, they cannot produce truly random outputs. Within that limitation, some produce sequences which pass one or several statistical pattern tests for randomness, and may be useful in many circumstances. Several common ones are very poor, not only failing to pass many of the relevant statistical tests, but also being readily, even trivially, predictable.

Algorithmic information theory defines a sequence of bits as non-random if it can be produced by some computer program that is shorter than that sequence (Chaitin-Kolmogorov randomness). Pseudo-random number generators fail that test dramatically. They can usually be programmed in a few thousand bits, but can produce far larger sequences, with periods so long no currently plausible computer or combination of them could exhaust a single period before the heat death of the Universe.

There are several informal definitions of randomness, usually based on either a lack of discernible patterns in a sequence, or the unpredictability of the sequence or various aspects of it by, generally, the most puissant possible adversary. Output from well-designed pseudo-random number generators should pass assorted statistical tests probing for non-randomness (see NIST Special Publication 800-22, Knuth, The Art of Computer Programming, vol. 2, and RFC 4086 for details of many such tests).

The sequences such generators produce always have patterns, in one sense, since the algorithm that generates them has a finite state, and must, eventually, repeat one of those states. Given the original state of the generator, and the implementation of the algorithm, a pseudo-random number generator of this sort is totally predictable. Given even partial knowledge of that state, they are often insecure for many purposes. On the other hand, it has become relatively easy to produce pseudo-random number generators that are guaranteed not to repeat on any currently plausible computer within a time-frame that is millions of times longer than the age of the universe. It is an open question whether it is always possible, in some practical way, to distinguish the output of such a pseudo-random number generator from that of a perfectly random source, without knowledge of the generator's internal state. Much of modern cryptography rests on the assumption that ciphers can be constructed whose output is indistinguishable from random noise without knowledge of a secret key used in the algorithm.
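The determinism described above can be demonstrated directly: given the same internal state (here, a seed), a pseudo-random number generator reproduces its output exactly. A minimal sketch in Python, using the standard library's generator purely as an illustration:

```python
import random

# Two generators started from the same internal state produce identical
# "random" sequences, so the output is fully predictable to anyone who
# knows (or can guess) that state.
g1 = random.Random(1234)
g2 = random.Random(1234)

assert [g1.random() for _ in range(5)] == [g2.random() for _ in range(5)]
```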

Uses

Unpredictable random numbers were first investigated in the context of gambling, and many randomizing devices such as dice, shuffling playing cards, and roulette wheels, were first developed for use in gambling. Fairly produced random numbers are vital to electronic gambling and ways of creating them are sometimes regulated by governmental gaming commissions.

Random numbers are also used for non-gambling purposes, both where their use is mathematically important, such as sampling for opinion polls, and in situations where fairness is approximated by randomization, such as selecting jurors and military draft lotteries. Their use is also evident in ancient writings: for example, in the Book of Numbers (33:54) Moses commands the Israelites to apportion the land by lot. The drawing of lots, often using pottery shards, is well attested in the Classical world of Greece and Rome, and some of these very shards have been archeologically recovered[citation needed].

Early attempts

One early way of producing random numbers was by a variation of the same machines used to play keno or select lottery numbers. These mixed numbered ping-pong balls with blown air, perhaps combined with mechanical agitation, and used some method to withdraw balls from the mixing chamber. This method gives reasonable results in some senses, but the random numbers generated this way are expensive: the method is inherently slow and is unusable in most automated situations (i.e., with computers).

In 1927, Cambridge University Press published a book of random sampling numbers arranged by the statistician Leonard Henry Caleb Tippett, which contained 41,600 digits taken from English parishes listed in census records. Other random number tables were published in that era, including one by R. A. Fisher and another, with over 100,000 random digits, published by the U.S. Interstate Commerce Commission in 1949.


What is likely to be the last of these projects is A Million Random Digits with 100,000 Normal Deviates, published by the RAND Corporation in 1955. They created an electronic simulation of a roulette wheel attached to a key punch, the results of which were then carefully filtered and tested before being used to generate the table. The RAND table was a significant breakthrough in delivering random numbers because such a large and carefully prepared table had never before been available, and it was useful for simulations and modeling. But, having been published, it is not usable as a source of cryptographic keys, for preparing them, or as a seed for a cryptographic pseudo-random number generator. However, since it was published long before modern cryptography, using its values for the random (if not secret) constants that initialize an algorithm demonstrates that those constants were not selected for (in B. Schneier's words) a "nefarious purpose." Khufu and Khafre do this, for example (ref Applied Cryptography - B. Schneier). See: Nothing up my sleeve numbers.

Physical phenomena with random properties

There are two fundamental physical sources of practical randomness: quantum mechanics at the atomic or sub-atomic level, and thermal noise (some of which is quantum mechanical in origin). Quantum mechanics predicts that certain physical phenomena, such as the nuclear decay of atoms, are fundamentally random and cannot, in principle, be predicted. (For a discussion of empirical verification of quantum unpredictability, see Bell test experiments.) And, because we live at a finite, non-zero temperature, every system has some random variation in its state; for instance, molecules of air are constantly bouncing off each other in a random way. (See statistical mechanics.) This randomness is a quantum phenomenon as well. (See phonon.)

Because the outcome of quantum-mechanical events cannot in principle be predicted, they are the 'gold standard' for random number generation. Some quantum phenomena used for random number generation include:

  • Shot noise, a quantum mechanical noise source in electronic circuits. A simple example is a lamp shining on a photodiode. Due to the uncertainty principle, arriving photons create noise in the circuit. Collecting the noise for use poses some problems, but this is an especially simple random noise source.

Thermal phenomena are easier to detect. They are (somewhat) vulnerable to attack by lowering the temperature of the system, though most systems will stop operating at temperatures low enough (e.g., ~150 K) to reduce the noise even by a factor of two. Some of the thermal phenomena used include:

  • Atmospheric noise, detected by a radio receiver attached to a PC (though much of it, such as lightning noise, is not properly thermal noise, but most likely a chaotic phenomenon).

Another variable physical phenomenon that is easy to measure is clock drift.

In the absence of quantum effects or thermal noise, other phenomena that tend to be random, although in ways not easily characterized by laws of physics, can be used. When several such sources are combined carefully (as in, for example, the Yarrow algorithm or Fortuna CSPRNGs), enough entropy can be collected for the creation of cryptographic keys and nonces, though generally at restricted rates. The advantage is that this approach needs, in principle, no special hardware. The disadvantage is that a sufficiently knowledgeable attacker can surreptitiously modify the software or its inputs, thus reducing the randomness of the output, perhaps substantially. The primary source of randomness typically used in such approaches is the precise timing of the interrupts caused by mechanical input/output devices, such as keyboards and disk drives, together with various system information counters.

This last approach must be implemented carefully and may be subject to attack if it is not. For instance, the generator built into the Linux kernel, which combines several such sources, may be vulnerable to an attack [1]. The random number generator used for cryptographic purposes in an early version of the Netscape browser was certainly vulnerable (and was promptly changed).

One approach in using physical randomness is to convert a noise source into a random bit sequence in a separate device that is then connected to the computer through an I/O port. The acquired noise signal is amplified, filtered, and then run through a high-speed voltage comparator to produce a logic signal that alternates states at random intervals. At least in part, the randomness produced depends on the specific details of the 'separate device'. Care must also always be taken when amplifying low-level noise to keep out spurious signals, such as power line hum and unwanted broadcast transmissions, and to avoid adding bias during acquisition and amplification. In some simple designs, the fluctuating logic value is converted to an RS-232 type signal and presented to a computer's serial port. Software then sees this series of logic values as bursts of "line noise" characters on an I/O port. More sophisticated systems may format the bit values before passing them into a computer.

Another approach is to feed an analog noise signal to an analog to digital converter, such as the audio input port built into most personal computers. The digitized signal may then be processed further in software to remove bias. However, digitization is itself often a source of bias, sometimes subtle, so this approach requires considerable caution and care.

Some have suggested using digital cameras, such as webcams, to photograph chaotic macroscopic phenomena. A group at Silicon Graphics imaged Lava lamps to generate random numbers (U.S. Patent 5,732,138). One problem was determining whether the chaotic shapes generated were actually random; the team decided that they are in properly operating Lava lamps. Other chaotic scenes could be employed, such as the motion of streamers in a fan air stream or, probably, bubbles in a fish tank (fish optional). The digitized image will generally contain additional noise, perhaps not very random, resulting from the video to digital conversion process. A higher quality device might use two sources and eliminate signals that are common to both; depending on the sources and their physical locations, this reduces or eliminates interference from outside electric and magnetic fields. This is often recommended for gambling devices, to reduce cheating by requiring attackers to exploit bias in several "random bit" streams.

The Commodore 64 provided a hardware random number generator in its sound chip, the MOS Technology SID 6581. Random bytes can be fetched by reading the appropriate memory-mapped register of the 6581.

Clock drift


There are several ways to measure and use clock drift as a source of randomness.

The Intel 82802 Firmware Hub chip included a hardware RNG using two free-running oscillators, one fast and one slow. A thermal noise source (non-common-mode noise from two diodes) is used to modulate the frequency of the slow oscillator, which then triggers a measurement of the fast oscillator. That output is then debiased using a von Neumann type decorrelation step (see below). The output rate of this device is somewhat less than 100,000 bit/s. This chip was an optional component of the 840 chipset family that supported an earlier Intel bus. It is not included in modern PCs.

All VIA C3 microprocessors have included a hardware RNG on the processor chip since 2003. Instead of using thermal noise, raw bits are generated by using four free-running oscillators which are designed to run at different rates. The output of two are XORed to control the bias on a third oscillator, whose output clocks the output of the fourth oscillator to produce the raw bit. Minor variations in temperature, silicon characteristics, and local electrical conditions cause continuing oscillator speed variations and thus produce the entropy of the raw bits. To further ensure randomness, there are actually two such RNGs on each chip, each positioned in different environments and rotated on the silicon. The final output is a mix of these two generators. The raw output rate is tens to hundreds of megabits per second, and the whitened rate is a few megabits per second. User software can access the generated random bit stream using new non-privileged machine language instructions.

A software implementation of a related idea on ordinary hardware is included in CryptoLib, a cryptographic routine library (JB Lacy, DP Mitchell, WM Schell, CryptoLib: Cryptography in software, Proc 4th USENIX Security Symp, pg 1-17, 1993). The algorithm is called truerand. Most modern computers have two crystal oscillators, one for the real-time clock and one for the primary CPU clock; truerand exploits this fact. It uses an operating system service that sets an alarm, running off the real-time clock. One subroutine sets that alarm to go off in one clock tick (usually 1/60th of a second). Another then enters a while loop waiting for the alarm to trigger. Since the alarm will not always trigger in exactly one tick, the least significant bits of a count of loop iterations, between setting the alarm and its trigger, will vary randomly, possibly enough for some uses. Truerand doesn't require additional hardware, but in a multi-tasking system great care must be taken to avoid non-randomizing interference from other processes (e.g., in the suspension of the counting loop process as the operating system scheduler starts and stops assorted processes).
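The truerand idea can be sketched in a few lines. The following Python fragment is only an illustrative approximation, not the CryptoLib code: it busy-counts until a coarse clock advances by one nominal tick and keeps the least significant bit of the count. A real implementation would use an OS alarm driven by the real-time clock, so that two genuinely independent oscillators are involved.

```python
import time

def truerand_bit(tick=1/60):
    """Count busy-loop iterations until the coarse clock advances by one
    nominal tick, then keep the least significant bit of the count."""
    deadline = time.time() + tick   # stand-in for setting an OS alarm
    count = 0
    while time.time() < deadline:   # busy-wait in place of waiting on the alarm
        count += 1
    return count & 1

def truerand_byte():
    """Assemble eight raw bits into one byte (slow by design)."""
    byte = 0
    for _ in range(8):
        byte = (byte << 1) | truerand_bit()
    return byte
```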

Dealing with bias

The bit-stream from such systems is prone to be biased, with either 1s or 0s predominating. There are two approaches to dealing with bias and other artifacts. The first is to design the RNG to minimize bias inherent in the operation of the generator. One method to correct this feeds back the generated bit stream, filtered by a low-pass filter, to adjust the bias of the generator. By the central limit theorem, the feedback loop will tend to be well-adjusted 'almost all the time'. Ultra-high speed random number generators often use this method. Even then, the numbers generated are usually somewhat biased.

Software whitening

A second approach to coping with bias is to reduce it after generation (in software or hardware). Even if the above hardware bias reduction steps have been taken, the bit-stream should still be assumed to contain bias and correlation. There are several techniques for reducing bias and correlation, often known by the name "whitening" algorithms, by analogy with the related problem of producing white noise from a correlated signal.

John von Neumann invented a simple algorithm to fix simple bias and reduce correlation. It considers bits two at a time, taking one of three actions: when two successive bits are equal, they are discarded; a sequence of 1,0 becomes a 1; and a sequence of 0,1 becomes a zero. This eliminates simple bias and is easy to implement as a computer program or in digital logic. This technique works no matter how the bits have been generated. It cannot assure randomness in its output, however. What it can do (at the cost of discarding a significant number of bits) is transform a biased random bit stream into one whose frequency of 1s is much closer to 50%.
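A minimal sketch of the von Neumann procedure in Python (the bit pairs must be non-overlapping; the function and variable names are illustrative):

```python
def von_neumann_debias(bits):
    """Examine non-overlapping pairs of input bits:
    discard 00 and 11, map the pair 1,0 to 1 and the pair 0,1 to 0."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)   # pair 1,0 -> 1; pair 0,1 -> 0
    return out

# Example: a stream biased toward 1 still yields roughly balanced output.
# von_neumann_debias([1, 1, 1, 0, 0, 1, 1, 1, 1, 0]) == [1, 0, 1]
```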

Another technique for improving a near random bit stream is to exclusive-or the bit stream with the output of a high-quality cryptographically secure pseudorandom number generator such as Blum Blum Shub or a good stream cipher. This can cheaply improve decorrelation and digit bias.

A related method which reduces bias in a near random bit stream is to take two or more uncorrelated near random bit streams, and exclusive or them together. Let the probability of a bit stream producing a 0 be 1/2 + e, where -1/2 ≤ e ≤ 1/2. Then e is the bias of the bitstream. If two uncorrelated bit streams with bias e are exclusive-or-ed together, then the bias of the result will be 2e². This may be repeated with more bit streams. (See also Piling-up lemma).
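This bias arithmetic can be checked empirically. The following Python snippet uses the standard library's PRNG only to simulate two independent biased sources; with e = 0.1, the measured bias of the XOR-ed stream comes out near 2e² = 0.02:

```python
import random

def biased_bit(e):
    """Simulated source: returns 0 with probability 1/2 + e."""
    return 0 if random.random() < 0.5 + e else 1

def xor_stream_bias(e, n=200_000):
    """Measure the bias of the XOR of two independent streams with bias e."""
    zeros = sum((biased_bit(e) ^ biased_bit(e)) == 0 for _ in range(n))
    return zeros / n - 0.5

print(xor_stream_bias(0.1))   # expected to be close to 2 * 0.1**2 = 0.02
```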

Some designs apply cryptographic hash functions such as MD5, SHA-1, or RIPEMD-160 or even a CRC function to all or part of the bit stream, and then use the output as the random bit stream. This is attractive, partly because it is relatively fast compared to some other methods, but depends entirely on qualities in the hash output for which there may be little theoretical basis.
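As an illustration of hash-based whitening (the block size, hash choice, and output handling here are arbitrary choices, not a standard): each block of raw, possibly biased bits is condensed into a fixed-size digest, so the scheme only makes sense if each input block is believed to carry at least as much entropy as the digest retains.

```python
import hashlib

def hash_whiten(raw: bytes, block_size: int = 64) -> bytes:
    """Condense each full block of raw input into a SHA-256 digest and
    concatenate the digests as the whitened output (trailing partial
    blocks are simply dropped)."""
    out = bytearray()
    for i in range(0, len(raw) - block_size + 1, block_size):
        out += hashlib.sha256(raw[i:i + block_size]).digest()
    return bytes(out)
```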

PRNG with periodically refreshed random key

Other designs use what are believed to be true random bits as the key for a high quality block cipher algorithm, taking the encrypted output as the random bit stream. Care must be taken in these cases to select an appropriate block mode, however. In some implementations, the PRNG is run for a limited number of digits, while the hardware generating device produces a new seed.
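A rough sketch of this construction, using AES in counter (CTR) mode from the third-party Python "cryptography" package; here os.urandom merely stands in for the hardware seed source, and the re-key quota is an arbitrary illustrative value:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

class ReseededCTRGenerator:
    """Keystream generator that is re-keyed from the (stand-in) hardware
    source after producing a fixed number of output bytes."""

    def __init__(self, reseed_after: int = 1 << 16):
        self.reseed_after = reseed_after
        self._reseed()

    def _reseed(self):
        # In a real design the key and nonce would come from the hardware
        # generator; os.urandom is only a placeholder here.
        key, nonce = os.urandom(32), os.urandom(16)
        self._enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        self._produced = 0

    def read(self, n: int) -> bytes:
        if self._produced + n > self.reseed_after:
            self._reseed()
        self._produced += n
        return self._enc.update(b"\x00" * n)   # encrypting zeros yields the keystream
```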

Using observed events

Software engineers without true random number generators often try to develop them by measuring physical events available to the software. An example is measuring the time between user keystrokes, and then taking the least significant bit (or two or three) of the count as a random digit. A similar approach measures task-scheduling, network hits, disk-head seek times and other internal events. One Microsoft design includes a very long list of such internal values (see the CSPRNG article).
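A toy Python illustration of the keystroke-timing idea (interactive, and only an illustration: a production design would credit such input very conservatively and feed it into an entropy pool rather than using the bits directly):

```python
import time

def keystroke_timing_bits(presses: int = 16, low_bits: int = 2):
    """Collect the low-order bits of the interval, in nanoseconds,
    between successive times the user presses Enter."""
    bits = []
    last = time.perf_counter_ns()
    for _ in range(presses):
        input("press Enter: ")
        now = time.perf_counter_ns()
        delta = now - last
        last = now
        for i in range(low_bits):
            bits.append((delta >> i) & 1)
    return bits
```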

The method is risky when it uses computer-controlled events because a clever, malicious attacker might be able to predict a cryptographic key by controlling the external events. Several gambling frauds have been uncovered which rely on manipulating normally hidden events internal to the operation of computers or networks.[citation needed] It is also risky because the supposed user-generated event (e.g., keystrokes) can be spoofed by a sufficiently ingenious attacker, allowing control of the "random values" used by the cryptography.

However, with sufficient care, a system can be designed that produces cryptographically secure random numbers from the sources of randomness available in a modern computer. The basic design is to maintain an "entropy pool" of random bits that are assumed to be unknown to an attacker. New randomness is added whenever available (for example, when the user hits a key) and an estimate of the number of bits in the pool that cannot be known to an attacker is kept. Some of the strategies in use include:

  • When random bits are requested, return that many bits derived from the entropy pool (by a cryptographic hash function, say) and decrement the estimate of the number of random bits remaining in the pool. If not enough unknown bits are available, wait until enough are available. This is the top-level design of the "/dev/random" device in Linux, written by Theodore Ts'o and used in many other Unix-like operating systems. It provides high-quality random numbers so long as the estimates of the input randomness are sufficiently cautious. The Linux "/dev/urandom" device is a simple modification that disregards the estimate of remaining entropy and never blocks, so its output may contain less true entropy. (A minimal read example for both devices follows this list.)
  • Maintain a stream cipher with a key and IV obtained from an entropy pool. When enough bits of entropy have been collected, replace both key and IV with new random values and decrease the estimated entropy remaining in the pool. This is the approach taken by the yarrow library. It provides resistance against some attacks and conserves hard-to-obtain entropy.
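For reference, reading from these interfaces is straightforward; the minimal Python sketch below shows both the blocking device and the non-blocking kernel CSPRNG interface (behavior differs between kernels, and on recent Linux /dev/random rarely blocks):

```python
import os

# Blocking interface: may wait until the kernel's entropy estimate is high
# enough (on systems where /dev/random still blocks).
with open("/dev/random", "rb") as dev:
    key_material = dev.read(32)

# Non-blocking interface: os.urandom() draws from the kernel CSPRNG
# (the /dev/urandom behavior) and never waits on the entropy estimate.
nonce = os.urandom(16)
```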

Problems

It is very easy to misconstruct hardware or software devices which attempt to generate random numbers. Also, most 'break' silently, often producing decreasingly random numbers as they degrade. A physical example might be the decreasing radioactivity of a radioactive source, such as that used in some smoke detectors. Failure modes in such devices are plentiful and are complicated, slow, and hard to detect.

Because many entropy sources are often quite fragile, and fail silently, statistical tests on their output should be performed continuously. Many, but not all, such devices include some such tests in the software that reads the device.

Just as with other components of a cryptosystem, a software random number generator should be designed to resist certain attacks. Defending against these attacks is difficult. See: random number generator attack.

Estimating entropy

There are mathematical techniques for estimating the entropy of a sequence of symbols. None is reliable enough for its estimates to be fully trusted; there are always assumptions which may be very difficult to confirm. These techniques are useful for determining whether there is enough entropy in a seed pool, for example, but they cannot, in general, distinguish between a true random source and a pseudo-random generator.
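As one simple example of such a technique, the empirical (first-order) Shannon entropy of a sample can be computed as below; note that it measures only the symbol distribution, so the output of a good pseudo-random generator scores near the maximum of 8 bits per byte despite containing almost no true entropy:

```python
import math
from collections import Counter

def shannon_entropy_per_byte(data: bytes) -> float:
    """First-order empirical Shannon entropy of a byte sample, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: os.urandom output scores near 8.0, but so does any decent PRNG.
```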

Performance checks

Hardware random number generators should be constantly monitored for proper operation. RFC 4086 and FIPS Pub 140-2 include tests which can be used for this. Also see the documentation for the New Zealand cryptographic software library cryptlib.

Since many practical designs rely on a hardware source as an input, it will be useful to at least check that the source is still operating. Statistical tests can often detect failure of a noise source, such as a radio station transmitting on a channel thought to be empty. Noise generator output should be sampled for testing before being passed through a "whitener," because some whitener designs can pass statistical tests even with no random input. While detecting a large deviation from perfection would be a sign that a true random noise source has become degraded, small deviations are normal and can be an indication of proper operation. Correlation of bias in the inputs to a generator design with other parameters (e.g., internal temperature, bus voltage) might provide an additional check. Unfortunately, with currently available (and foreseen) tests, passing such tests is not enough to be sure the output sequences are random. A carefully chosen design, verification that the manufactured device implements that design, and continuous physical security to ensure against tampering may all be needed in addition to testing for high value uses.
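As a concrete example of such a continuous check, the monobit test from the original version of FIPS 140-2 (one of the statistical power-up tests later dropped in a change notice) simply counts the ones in a 20,000-bit sample; the bounds below are those commonly cited for that version:

```python
def monobit_test(bits) -> bool:
    """FIPS 140-2 (original) monobit test: in a 20,000-bit sample the
    number of ones must fall strictly between 9,725 and 10,275."""
    if len(bits) != 20_000:
        raise ValueError("the test is defined for exactly 20,000 bits")
    ones = sum(bits)
    return 9_725 < ones < 10_275
```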


