I first started thinking about the topic of randomness (entropy) versus meaning while at DEF CON 18 in Las Vegas. I was thinking about the people all around me betting actual money on what are really nothing more than fancy random number generators: dice, shuffled cards, or balls bouncing on spinning wheels of red and black tiles.


Look at that guy throwing the dice; he’s celebrating how non-random he is. And see the way she’s looking at him? She appreciates how he can roll numbers in a non-random way. Yet the billion-dollar structure they are celebrating in (the casino) exists because even if the dice are random, the house has the odds. It is part of human nature to mistakenly perceive meaning where there isn’t any. Las Vegas is built on people taking that tendency to the extreme: unconsciously perceiving patterns in random data and then consciously betting money on them. Craziness.

Computers have the opposite problem from humans. They need entropy, but they can’t find it. Every time a computer generates a key, be it a static SSH key or an ephemeral SSL key, it needs dozens or hundreds of bytes of entropy. But where can a computer find this random data? It doesn’t even have dice, shuffled cards or bouncing balls.

Consider a virtual machine booting up for the first time; it needs to generate an SSH host key so that other devices can connect to it. The ssh-keygen utility queries the operating system’s random number generator (RNG) for entropy, but there isn’t any. The virtual machine has just come up and hasn’t seen any external data from which it could distill entropy. So it makes up the random data by seeding a deterministic algorithm (such as RC4 or MD5) and drawing off the output.

The problem is, making up random data is a bad, bad idea. If the initial seed values can be guessed (often they are easy things such as a process ID or timestamp), then the rest of the stream can be determined.
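To see why, here’s a toy illustration (this is not any real key generator’s algorithm, just a sketch of the failure mode): seed a deterministic PRNG, here awk’s built-in one, with a “secret” value such as a timestamp, and anyone who guesses that value reproduces the stream exactly.

```shell
# Pretend the "secret" seed is a boot timestamp -- easy to guess.
seed=1336500000

# Draw four "random" 16-bit values from a PRNG seeded with that value.
keystream() {
    awk -v s="$1" 'BEGIN { srand(s); for (i = 0; i < 4; i++) printf "%d ", int(rand() * 65536) }'
}

keystream "$seed"    # the generator's output
echo
keystream "$seed"    # an attacker's guess of the seed: identical output
echo
```

The same seed yields the same stream every time, so an attacker who can guess the seed recovers every key derived from it.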

“Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.” — John Von Neumann (1951)

This fabrication of entropy goes on all the time. In 2012, Apple was worried about the amount of entropy available in its iPhones at boot, so it switched to a different deterministic RNG algorithm during boot-up. This “early RNG” had problems of its own, though, resulting in even worse entropy collection. Even the iPhone, the most popular computer of all time, has not escaped the entropy problem.

Server-side software may be even more susceptible to this problem as more servers become fully virtualized and therefore even further removed from their own hardware. Nadia Heninger and her colleagues wrote a disturbing paper, “Mining Your Ps and Qs: Detection of Widespread Weak Keys in Network Devices,” in 2012. They found that between 1 and 2 percent of all SSL keys on the Internet are factorable (and therefore recoverable), largely due to bad entropy. In 2008, a flaw in the RNG of the Debian distribution left its keys with only 15 bits of entropy, and the weakness went unnoticed for years. The problem is going to keep happening. According to security luminary Dan Kaminsky, there has been no real progress in RNG technology in 25 years.

Is there anything you can do about this, besides frantically patching your software and regenerating your keys the next time an open-source RNG turns out to be broken?

Actually, there is something you can do about it.

Your BIG-IP has a real random number generator in it. All BIG-IP hardware platforms contain specialized cryptographic offload chipsets for handling large volumes of SSL processing. That processing consumes thousands of ephemeral keys, so the chipsets include hardware-based random number generators. Most BIG-IP platforms use the Cavium Octeon chipsets, whose RNGs can produce 380 Mb/s of random data; the Intel chipsets can produce 2 Gb/s. The Intel solution is especially interesting because Intel had to develop an entirely new all-digital hardware RNG to achieve it.

So how can you get at the RNG? BIG-IP doesn’t offer an explicit channel to it, but by leveraging the power of iRules, we can build a simple virtual server that serves 1K of random data per request.

Here’s a link to a simple iRule that mines the hardware RNG for a kilobyte of data.


Download the iRule and attach it to a virtual server that listens only on the internal VLAN. Then nearly any internal device can reach the virtual server and periodically grab some hardware RNG data.

Once you’ve attached the iRule, you can start grabbing the data and feeding it into the places that consume it. On many Linux distributions, these files accept random data:

  • $HOME/.rnd
  • /var/run/egd_pool
  • /dev/random
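One way to automate the feeding is a cron job. The following is only a sketch: the address 10.1.1.100 is a placeholder for wherever you attached the iRule, and the ten-minute schedule is arbitrary.

```shell
# /etc/crontab fragment (virtual-server address is a placeholder).
# Every 10 minutes, pull 1K from the BIG-IP's hardware RNG and stir
# it into the kernel's entropy pool.
*/10 * * * * root curl -s http://10.1.1.100/ > /dev/random
```

You could add similar lines for the other files listed above; the kernel simply mixes whatever you write into its pool, so a failed fetch does no harm.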

Some distributions will let you write directly to /dev/random, and they will also report how much entropy the kernel thinks it has. Stir in data and check like so, substituting the address of your virtual server for the placeholder (note that some kernels mix in written data without increasing the entropy estimate):

% cat /proc/sys/kernel/random/entropy_avail
% curl -s http://<virtual-server>/ > /dev/random
% cat /proc/sys/kernel/random/entropy_avail

Even if there is a failure in this system, it won’t be any worse than what you likely have now. The kernel only mixes new random data into its pool; it never replaces what it already has.


If you sprinkle this throughout the data center, you should have a stronger cryptographic posture. And the next time there’s a widespread failure of pseudo-RNG algorithms, you will be one of the few who took preemptive steps to combat the lack of entropy. Then you can go back to trying to perceive the meaning in your life.

For a broader analysis of the entropy problem, see my article in SecurityWeek.