Could the UK be about to break end-to-end encryption?



Once again there are indications the UK government intends to use the law to lean on encryption. A report in The Sun this week quoted a Conservative minister saying that should the government be re-elected, which polls suggest it will be, it will move quickly to compel social media firms to hand over decrypted data.

The paper quoted an unnamed government minister saying: “The social media companies have been laughing in our faces for too long”, and suggested that all tech companies with more than 10,000 users will face having to significantly adapt their technology to comply with the decryption law.

The relevant Statutory Instrument, which would enable UK government agencies to obtain warranted access to decrypted data from communications service providers and is currently sitting in draft form, will be voted through parliament within weeks of a new government taking office after the June 8 general election, according to the report.

As is typically the case when strong encryption comes back under political pressure in the modern Internet age, this leaked hint of an impending ‘crackdown’ on tech firms came hard on the heels of another terrorist attack in the UK — after a suicide bomber blew himself up at a concert in Manchester on Monday evening. The underlying argument is that intelligence agencies need the power to be able to break encryption to combat terrorism.

Strong encryption, cryptic answers

The problem — as always in this recurring data access vs strong encryption story — is that companies that use end-to-end encryption to safeguard user data are not able to hand over information in a readable form, as they do not hold the encryption keys needed to decrypt it.
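
To make that technical constraint concrete, here is a minimal sketch of the end-to-end model using the open source PyNaCl library (the names and message are invented for illustration; real messengers such as WhatsApp layer the far more involved Signal Protocol, with ratcheting and forward secrecy, on top of primitives like these):

    # Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
    # Illustrative only: real messengers use the full Signal Protocol.
    from nacl.public import PrivateKey, Box

    # Keys are generated on each user's device; the provider never
    # sees the private halves.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts directly to Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"See you at 8")

    # The provider's server only ever relays this ciphertext. Holding
    # neither private key, it cannot produce a readable form on demand;
    # only an endpoint can.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"See you at 8"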

So the question remains: how can the government compel companies to hand over information they don’t have access to?

Will it do so by outlawing the use of end-to-end encryption? Or by forcing companies to build in backdoors — thereby breaking strong encryption in secret? The latter would arguably be worse, since the government would be opening app users up to potential security vulnerabilities without letting them know their security is being compromised.

The UK government has been rubbing up against this issue for years. At the back end of last year it passed the Investigatory Powers Act, which threw up questions about the looming legal implications for encrypted communications in the UK — owing to a provision that states communications service providers may be required to “remove electronic protection of data”.

It’s those powers that ministers are apparently intending to draw on to break social media firms’ use of strong encryption.

During the scrutiny process for the IP bill last year, ministers led a merry dance around the implications of the “electronic protection” removal clause for e2e encryption. The best interpretation was that the government was trying to frame a law that encouraged tech platforms to eschew the use of strong encryption so as not to risk falling foul of an unclear law.

“He seems to be implying that providers can only provide encryption which can be broken and therefore can’t be end-to-end encryption,” was Lord Strasburger’s assessment of the government response to questions on the topic last July.

No clarity has emerged since then. The situation remains one of ongoing fuzziness about the legality of e2e encryption in the UK. To break or not to break: that is the question.

Arguably, as Strasburger suggested, this is strategic: intentional obfuscation on the part of the UK government, spreading FUD to try to discourage use of a technology its intelligence agencies view as a barrier to their work.

But the problem for the government is that use of e2e encryption has been growing in recent years as awareness of both privacy risks and cyber security threats has stepped up, thanks to data breach scandal after data breach scandal, as well as revelations of the extent of government agencies’ surveillance programs following the 2013 Snowden disclosures.

Not holding encryption keys allows tech firms to step outside the controversy related to digital snooping and to bolster the security cred of their services. Yet, as a result, popular services that have championed strong encryption are increasingly finding themselves in the crosshairs of government agencies. Be it the Facebook Messenger app, or Facebook’s WhatsApp messaging platform, or Apple’s iOS and iMessage.

After another terror attack in London in March, UK Home Secretary Amber Rudd was quick to point the finger of blame at social media firms — saying they should not provide “a secret place for terrorists to communicate with each other”, and asserting: “We need to make sure that our intelligence services have the ability to get into situations like encrypted WhatsApp.”

Of course she did not explain how intelligence agencies intended to “get into” encrypted WhatsApp. And, publicly at least, that earlier political pressure on encryption morphed into calls for social media firms to be more proactive about removing terrorist content from their public channels; any discussions held vis-a-vis encryption stayed behind closed doors.

But again, if the latest reporting is to be believed, Rudd is intent on breaking strong encryption after all.

Exceptional access, unacceptable risk 

It’s worth revisiting Keys Under Doormats, the paper written by a group of storied security researchers back in 2015, re-examining the notion of so-called “exceptional access” for security agencies to encryption systems — at a time when debate had also been re-ignited by politicians calling for ‘no safe spaces for terrorists’.

The report examined whether it is “technically and operationally feasible to meet law enforcement’s call for exceptional access without causing large-scale security vulnerabilities” — posing the question of whether it’s possible to build in such exceptional access without creating unacceptable risk.

Their conclusion was unequivocal: exceptional access without unacceptable risk is not possible, they wrote. Nor is it even likely to be feasible, given how the services in question criss-cross international borders.

Here’s one key paragraph from the paper:

Designing exceptional access into today’s information services and applications will give rise to a range of critical security risks. First, major efforts that the industry is making to improve security will be undermined and reversed. Providing access over any period of time to thousands of law enforcement agencies will necessarily increase the risk that intruders will hijack the exceptional access mechanisms. If law enforcement needs to look backwards at encrypted data for one year, then one year’s worth of data will be put at risk. If law enforcement wants to assure itself real time access to communications streams, then intruders will have an easier time getting access in real time, too. This is a trade-off space in which law enforcement cannot be guaranteed access without creating serious risk that criminal intruders will gain the same access. Second, the challenge of guaranteeing access to multiple law enforcement agencies in multiple countries is enormously complex. It is likely to be prohibitively expensive and also an intractable foreign affairs problem.

They further concluded:

From a public policy perspective, there is an argument for giving law enforcement the best possible tools to investigate crime, subject to due process and the rule of law. But a careful scientific analysis of the likely impact of such demands must distinguish what might be desirable from what is technically possible. In this regard, a proposal to regulate encryption and guarantee law enforcement access centrally feels rather like a proposal to require that all airplanes can be controlled from the ground. While this might be desirable in the case of a hijacking or a suicidal pilot, a clear-eyed assessment of how one could design such a capability reveals enormous technical and operational complexity, international scope, large costs, and massive risks — so much so that such proposals, though occasionally made, are not really taken seriously.

One thing the paper did not consider is that much politicking can be primarily intended as a theatre of influence for winning votes from spectators.

And the timing of the latest leaked call for ‘decryption on-demand’ coincides with an imminent UK general election, while also serving to shift potential blame for security failures associated with a terrorist attack that took place during the election campaign away from government agencies — and onto a softer target: overseas tech firms.

As we’ve seen amply in recent times, populist arguments can play very well with an electorate. And characterizing social media companies as the mocking, many-headed pantomime villain of the story transforms complex considerations into a basic emotional attack that might well be aimed at feeding votes back to a governing party intent on re-election.

“…to disclose, where practicable… in an intelligible form”

Returning to UK law, the (still draft) ‘Investigatory Powers (Technical Capability) Regulations 2017’ is the legal route, under the IP Act, for placing obligations on comms service providers to maintain the technical capabilities needed to afford government agencies the warranted, on-demand access they keep demanding.

Yet exactly what those technical capabilities are remains unclear. (And “vague” technical requirements for exceptional access are also raised as a problem in Keys Under Doormats.)

Among the list of obligations Technical Capability Notices can place on comms service providers is the following non-specific clause:

To provide and maintain the capability to disclose, where practicable, the content of communications or secondary data in an intelligible form and to remove electronic protection applied by or on behalf of the telecommunications operator to the communications or data, or to permit the person to whom the warrant is addressed to remove such electronic protection.

The document also sets out that decrypted data must be handed over within a day of a CSP being served a warrant by a government agency, and that CSPs must maintain the capability to simultaneously intercept the communications and metadata of up to 1 in 10,000 of their customers.

The technical details of how any encryption perforations could be achieved are evidently intended to remain under wraps. Which means wider data security risks cannot be publicly assessed.

“I suspect that all the vagueness about concrete technical measures is deliberate, because it allows the government to deal with the technical details within a particular technical capability notice, which would be under a gag order, and thus avoid any public scrutiny by the infosec community,” argues Martin Kleppmann, a security researcher at the University of Cambridge who submitted evidence to the parliamentary committees scrutinizing the IP bill last year, and who has blogged about how the law risks increasing cyber crime.

“It’s easy to criticize encryption technologies as providing ‘safe spaces’ for terrorists while forgetting that the exact same technologies are crucial for defence against criminals and hostile powers (not to mention protecting civil liberties),” he adds.

“Intelligence agencies don’t seem to actually want bulk access to encrypted data, but merely want the capability to intercept specific targets. However… if a system allows encryption to be selectively circumvented at the command of an intelligence agency, it’s not really end-to-end encryption in a meaningful sense!”

One possibility for enabling ‘exceptional access’ that has sometimes been suggested is a NOBUS: aka a ‘nobody but us’ backdoor — i.e. a backdoor designed so that it is mathematically/computationally infeasible for anyone but the agency holding the requisite secret to exploit. However Kleppmann points out that even if the math itself is solid, it merely takes one person with knowledge of the NOBUS secret to leak it — and then, as he puts it, “all mathematical impossibility goes out of the window”.
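
To see why the secret is the weak point, consider the simplest form exceptional access could take: key escrow, where every message key is additionally wrapped for an agency-held key. The following is a hypothetical sketch only (again using PyNaCl; the escrow arrangement and names are invented for illustration, and no such scheme is attributed to any vendor):

    # Hypothetical key-escrow sketch: 'NOBUS'-style exceptional access
    # reduces to the secrecy of a single escrow key.
    from nacl.public import PrivateKey, SealedBox
    from nacl.secret import SecretBox
    from nacl.utils import random

    escrow_key = PrivateKey.generate()      # the agency's secret
    recipient_key = PrivateKey.generate()   # an ordinary user

    # Each message is encrypted with a fresh symmetric key...
    message_key = random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(message_key).encrypt(b"meet at noon")

    # ...which is wrapped once for the recipient and once for escrow.
    for_recipient = SealedBox(recipient_key.public_key).encrypt(message_key)
    for_escrow = SealedBox(escrow_key.public_key).encrypt(message_key)

    # Whoever obtains the escrow private key (one leak, one insider)
    # can recover every message key, and with it all past traffic.
    recovered = SealedBox(escrow_key).decrypt(for_escrow)
    assert SecretBox(recovered).decrypt(ciphertext) == b"meet at noon"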

“The only way of making a system secure against adversaries who want to harm us is by designing it such that there are no known flaws or backdoors whatsoever, and by fixing it if any flaws are subsequently discovered,” he argues.

Meanwhile, on the vulnerability front, Kleppmann notes that even with services that have open source components — such as WhatsApp, which uses the respected (and independently security reviewed) Signal Protocol for its encryption system — users are still required to trust that the company’s servers are doing what they claim to be doing when they hand over keys. Which could offer a potential route for a government-mandated backdoor to be slipped in.

“With WhatsApp/Signal/iMessage there is the remaining problem that you have to trust their server to give you the correct key for the person you want to communicate with,” he says. “Thus, even if the encryption is perfect, if a government agency can force the server to add the government’s special decryption key to your list of device keys, they can still subvert the security of the system. People are working on improving the transparency of key servers to reduce this problem, but we still have a long way to go.”

“I do believe open source is very helpful here,” he adds. “It’s not a silver bullet, but it makes it more difficult to sneak in a backdoor unnoticed.”
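
One such transparency aid already exists at the user level: out-of-band key verification, surfaced as WhatsApp’s “security codes” and Signal’s “safety numbers”, where both parties derive a fingerprint from the public keys in play and compare it in person, so a key substituted by the server shows up as a mismatch. A simplified sketch of the idea (the derivation below is invented for illustration and is not either app’s actual algorithm):

    # Simplified out-of-band verification sketch -- not Signal's actual
    # safety-number algorithm, just the underlying idea. If the server
    # swaps in a government key, the values the users compare won't match.
    import hashlib
    from nacl.public import PrivateKey

    def fingerprint(pub_a: bytes, pub_b: bytes) -> str:
        # Sort so both parties derive the same value regardless of
        # who computes it.
        digest = hashlib.sha256(b"".join(sorted([pub_a, pub_b]))).hexdigest()
        # Display as short groups, like the digit blocks apps show.
        return " ".join(digest[i:i + 4] for i in range(0, 24, 4))

    alice, bob = PrivateKey.generate(), PrivateKey.generate()

    # What both apps should derive from the genuine keys:
    honest = fingerprint(bytes(alice.public_key), bytes(bob.public_key))

    # What Alice's app derives if the server substitutes a fake 'Bob':
    fake_bob = PrivateKey.generate()
    tampered = fingerprint(bytes(alice.public_key), bytes(fake_bob.public_key))

    assert honest != tampered   # the mismatch exposes the substitution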

Previously, UK government ministers have claimed both that they do not want to ban end-to-end encryption and that they are not demanding backdoors be built into digital services. Although they have also described the rise of e2e encryption as “alarming”.

When interrogated specifically on the e2e question, the former UK Home Secretary (and now UK Prime Minister) Theresa May said that companies should take “reasonable steps to ensure that they are able to comply with the warrant that has been served on them”.

Yet — and you might be spotting a pattern here — there has been no definition of what those “reasonable steps” might be.

Therefore it remains unclear where the UK’s legal line will be drawn on encryption.

Backdoors and outlaws

If The Sun’s story is correct, and UK government ministers-in-waiting are indeed preparing to demand that the likes of WhatsApp and Apple hand over decrypted messages, then those “reasonable steps” would presumably require an entire reworking of their respective security systems.

And if the companies don’t bow to such demands, what then? Will the UK government move to block access to WhatsApp’s e2e encrypted messaging service? Or ban the iPhone, given that Apple’s iMessage also uses e2e encryption? We just don’t know at this point.

A spokesperson for WhatsApp declined to comment when contacted for a response to this story.

Apple’s press team did not respond to a request for comment either. But the company has a history of strongly defending user privacy — taking to the courts in the US last year to fight the FBI’s demand to weaken iOS security to help facilitate access to a locked iPhone that had been used by a terrorist, for example.

WhatsApp has also had its service blocked multiple times in Brazil after it was taken to court for not handing over decrypted data to law enforcement authorities. Its response? To state in public that it cannot hand over information it does not hold.


