Internet Security ‘Undermined By Random Number Weaknesses’

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications


The random numbers used by web cryptographic systems aren’t random enough, say security researchers

The random-number generation systems that underlie the Internet’s security are far weaker than they should be and don’t receive the attention they deserve, according to a presentation at the Black Hat security conference in Las Vegas.

While efforts to bolster security have focused on encryption systems, security professionals aren’t as aware as they should be of the random numbers that “underwrite the security of every modern communications system, including the Internet”, according to researchers Bruce Potter and Sasha Wood.

Entropy drain

They took a closer look at the systems commonly used to provide random numbers for Internet security systems, and found “truly surprising” weaknesses.

The problems centre on the way these systems – called pseudo random number generators (PRNGs) – provide themselves with “entropy”, a term that in this context designates data that, while not truly random, is disordered enough to serve as such in processes such as the generation of secure keys.

Typically, PRNGs gather this jumble of data from actions such as mouse movements and keyboard strokes, feeding it into a “pool” that is drawn on in the creation of random numbers.
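The pooling mechanism described above can be illustrated with a minimal sketch. This is a teaching example only, not any real kernel's design: unpredictable event data (such as the timing of an input event) is hashed into a fixed-size pool, and random output is derived from the pool state.

```python
import hashlib
import time

class ToyEntropyPool:
    """Illustrative entropy pool: mixes event data in, derives output by hashing.
    A teaching sketch only -- not a secure PRNG."""

    def __init__(self):
        self.pool = b"\x00" * 32  # 256-bit pool state

    def mix_in(self, event_data: bytes) -> None:
        # Fold new event data (e.g. a keystroke timestamp) into the pool
        self.pool = hashlib.sha256(self.pool + event_data).digest()

    def random_bytes(self, n: int) -> bytes:
        # Derive n output bytes from the current pool state
        out = b""
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(self.pool + counter.to_bytes(4, "big")).digest()
            counter += 1
        # Update the pool so past output cannot be replayed from the same state
        self.pool = hashlib.sha256(self.pool + b"step").digest()
        return out[:n]

pool = ToyEntropyPool()
pool.mix_in(str(time.perf_counter_ns()).encode())  # timing of an "event"
key = pool.random_bytes(16)
```

The quality of `key` here depends entirely on how unpredictable the mixed-in events were, which is exactly the property the researchers found wanting.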

However, Potter and Wood found that web servers typically hold only a small reserve of entropy, around 128 bits – partly because the kernel draws heavily on the pool, and partly because entropy flows into the pool at a low rate.

The problem is compounded by the way encryption systems handle random number generation, the researchers said. For instance, they found that OpenSSL, an implementation of SSL/TLS commonly used on web servers, pulls data from the pool only once per runtime, and does not verify how disordered that data is, simply assuming it is acceptable.

Crypto risk

“This means that depending on the state of the kernel entropy pool when OpenSSL is started, little to no new entropy will be contained in the random numbers provided,” they wrote in their research paper. “For long running processes, such as servers that link to the OpenSSL libraries, it means that all PRNG operations performed for the duration of the server only have as much entropy as was available at the invocation of the OpenSSL library.”
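Why seeding once with too little entropy is dangerous can be demonstrated with a non-cryptographic PRNG. Python's `random` module stands in here purely for illustration (OpenSSL's internals differ): if the seed space is small, an attacker who observes some output can brute-force the seed and then predict every value the generator will ever produce.

```python
import random

# A server seeds its PRNG once at startup with only 16 bits of "entropy"
secret_seed = 40123
rng = random.Random(secret_seed)

observed = [rng.getrandbits(32) for _ in range(2)]  # attacker sees these outputs
next_token = rng.getrandbits(32)                    # attacker wants to predict this

# The attacker brute-forces the tiny 2**16 seed space
recovered = None
for guess in range(2 ** 16):
    candidate = random.Random(guess)
    if [candidate.getrandbits(32) for _ in range(2)] == observed:
        recovered = candidate  # state now synchronised with the server's PRNG
        break

predicted = recovered.getrandbits(32)
assert predicted == next_token  # every future output is now predictable
```

The point of the sketch is the researchers' own: however strong the algorithm consuming the numbers, the effective security is bounded by the entropy available when the generator was seeded.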

Industry observers noted that other implementations of SSL, such as Google’s OpenSSL-based BoringSSL, do regularly gather more entropy.

The issue is worth closer scrutiny because of its fundamental importance for encryption, said Potter and Wood.

“Nearly every crypto system relies heavily on access to high quality random numbers,” they wrote. “If the random numbers used in these crypto systems aren’t truly random (or at least random enough to withstand cryptanalytic scrutiny), then the security the algorithm provides can be completely compromised.”

At the conference they released an open source program called libentropy that provides an interface for managing sources of entropy and reporting the status of entropy creation and utilisation.
