Kim Dotcom’s Mega Fileshare Service Riddled With Security Holes

Is Mega handing encryption keys to users to protect itself from future legal problems?

Security experts have found a host of security vulnerabilities in Kim Dotcom’s new online storage venture Mega, and many suspect his claims of tough data protection are merely a smokescreen to deflect attention from law enforcement agencies.

Mega, a follow-up to Dotcom’s Megaupload service shut down by law enforcement, launched on Sunday. Its founder boasted it was “the privacy company”, offering 50GB of free online storage to every user and blanket encryption across the site.

Yet many potential security vulnerabilities have been highlighted by the community, including flawed encryption key handling, a cross-site scripting hole and problematic claims surrounding deduplication.

Mega uploading insecure stuff?

The encryption “is less than ideal”, according to Alan Woodward, from the Department of Computing at the University of Surrey. That is largely because it is all done through JavaScript in the browser, which means anyone who can break the SSL connection to Mega could get hold of the keys.

The SSL certificates used on some Mega domains carry only 1024-bit RSA keys, which can be broken with far greater ease than the 2048-bit keys viewed as best practice among experts. At least one of Mega’s servers uses 1024-bit encryption to “reduce CPU load”, although others use 2048-bit.

Furthermore, a hacker who gained control of a Mega server could either simply turn off the encryption or obtain the private keys needed to decrypt users’ files. And even though Mega says it does not hold the keys, Mega admins could get hold of them, as it would take only a minor code change on Mega’s servers to access them.

Some are concerned US law enforcement agencies, who are already trying to extradite Kim Dotcom over Megaupload’s alleged support for copyright infringement, will simply order Mega to hand over the keys.

“I can imagine the FBI are typing up their warrant as I type requiring the keys to be collected and the content of the servers to be analysed,” Woodward told TechWeekEurope.

Deduplication is another problematic issue. As was seen with Nokia recently, deduplicating encrypted data normally requires that information be decrypted, repackaged and then encrypted again. Mega’s case is different: it claims it can identify duplicated files, hinting the content may not be entirely secret.

According to Mega’s co-founder Bram van der Kolk, speaking to Forbes, the check looks at the entire encrypted file. So if a user uploads the same file encrypted with the same key twice, or uploads a copy of a file made in a file manager, one of those duplicates will be deleted.

Another issue is that Mega uses JavaScript’s pseudorandom-number generator to produce the RSA keys that let users share information with each other through public and private key infrastructure – a method known to be predictable and lacking in entropy.
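The danger of a non-cryptographic PRNG is that its output is fully determined by its seed. The sketch below uses a simple linear congruential generator as a stand-in for a Math.random-style PRNG: an attacker who recovers or guesses the seed regenerates the victim’s “random” key byte for byte. Everything here is illustrative, not Mega’s code.

```javascript
// A linear congruential generator (Numerical Recipes constants),
// standing in for any seeded, non-cryptographic PRNG.
function lcg(seed) {
  let state = seed >>> 0;
  return function nextByte() {
    state = (Math.imul(state, 1664525) + 1013904223) >>> 0;
    return state >>> 24; // top 8 bits of the state
  };
}

// "Key" material drawn from the weak generator.
function weakKey(seed, length) {
  const next = lcg(seed);
  return Array.from({ length }, next);
}

// Victim and attacker, starting from the same seed,
// derive byte-for-byte identical keys.
const victim = weakKey(123456, 16);
const attacker = weakKey(123456, 16);
console.log(victim.join() === attacker.join()); // true
```

A cryptographically secure source (in browsers, `crypto.getRandomValues`; in Node, `crypto.randomBytes`) avoids this by not being reproducible from a small guessable seed.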

Despite the numerous security issues, many have questioned whether Mega is simply boasting about its security to protect itself from law enforcement action. If it does not hold the keys, and does not access user data, it can claim to be oblivious to the legality of content uploaded to Mega.

“I think Mega is using encryption not for the security of their users but their own personal legal protection,” Woodward added.

“I cannot imagine anyone who understands encryption would trust their precious data to Mega’s scheme as it currently stands. It would appear that Mega is after people who are looking for somewhere to store their data with a provider who wishes to adopt a position of ‘see no evil’.”

The password problem remains both a security and usability issue, but Mega has promised to let users reset passwords soon. Currently, if they forget their password, they can wave goodbye to their files, regardless of the level of encryption.

UPDATE: Mega co-founder Bram van der Kolk promised TechWeekEurope a response on the issues, and has delivered one in a company blog post.

He confirmed the user’s password is used to create the AES-128 master encryption key, which is held on Mega’s servers and in turn unlocks the private keys and the user’s content. This means Mega could indeed be ordered to recover keys and then open up user content.

It also means the lack of a password reset is a big problem: forget your password and you have lost the key, so even a new password would not recover your files. But van der Kolk has promised some changes.

“A password change feature will re-encrypt the master key with your new password and update it on our servers,” he explained. “A password reset mechanism will allow you to log back into your account, with all files being unreadable.

“Now, if you have any pre-exported file keys, you can import them to regain access to those files. On top of that, you could ask your share peers to send you the share-specific keys, but that’s it – the remainder of your data appears as binary garbage until you remember your password.”

To address the JavaScript pseudorandom-number key generation, used to produce a 2048-bit RSA key pair at sign-up for file sharing, Mega will add a feature that lets users manually add as much entropy as they see fit before key generation proceeds. That is on top of the entropy already gathered from mouse movements and keyboard input.

Answering the SSL questions, the Mega co-founder added that a JavaScript verification system had been created to check code loaded over 1024-bit encrypted connections against the 2048-bit-protected connection, to ensure the content has not been tampered with.

“All active content loaded from these ‘insecure’, static servers is integrity-checked by JavaScript code loaded from the ‘secure’ static server, rendering manipulation of the static content or man-in-the-middle attacks ineffective. The only reason why HTTPS is supported/used at all is that most browsers don’t like making HTTP connections from HTTPS pages. And, using more than 1024-bit would just waste a lot of extra CPU time on those static servers.

“A piece of JavaScript coming from a trusted, 2048-bit HTTPS server is verifying additional pieces of JavaScript coming from untrusted, HTTP/1024-bit HTTPS servers. This basically enables us to host the extremely integrity-sensitive static content on a large number of geographically diverse servers without worrying about security.”
