Coronavirus: NHS Contact Tracing App Has Flaws, Aussie Researchers Warn

Image credit: World Health Organisation

Security flaws in the NHS coronavirus contact-tracing app could pose a risk to user privacy, as the NCSC promises to fix the issues

Two researchers in Australia have warned of a number of security flaws they discovered in the NHS Coronavirus contact tracing app.

GCHQ’s National Cyber Security Centre (NCSC) told the BBC it was already aware of most of the issues raised by the researchers, and is in the process of addressing them.

The app is currently being trialled on the Isle of Wight, after it was initially tested at an RAF base in North Yorkshire.

Image credit: Apple

Privacy worries

The UK government recently published the source code of its NHS Covid-19 tracing app, amid concern over the app’s privacy and effectiveness.

When someone tests positive for Covid-19, contact-tracing apps are designed to identify whom the patient has been in contact with, so that those people can self-isolate.

The UK’s NHS app for example uses Bluetooth signals to detect and log other phones with a compatible app in the vicinity. When a person develops a confirmed case, the app alerts those who have come into contact with the individual.
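The proximity-logging approach described above can be sketched in a few lines. This is an illustrative simplification, not the NHS app’s actual code: the class and function names are assumptions, and real apps exchange rotating Bluetooth identifiers rather than plain strings.

```python
# Hypothetical sketch of Bluetooth-style contact logging.
# All names here are illustrative, not taken from the NHS app.
from datetime import datetime


class ContactLog:
    """Records anonymous IDs broadcast by nearby devices."""

    def __init__(self):
        self.contacts = []  # list of (anonymous_id, timestamp) pairs

    def record_sighting(self, anonymous_id, timestamp):
        # Called when another device's broadcast ID is detected in range.
        self.contacts.append((anonymous_id, timestamp))

    def ids_seen_since(self, cutoff):
        # IDs encountered since `cutoff`, e.g. the start of the
        # infectious window of a confirmed case.
        return {cid for cid, ts in self.contacts if ts >= cutoff}


def contacts_to_alert(log, cutoff):
    # In a centralised design this matching happens server-side after
    # the log is uploaded; decentralised designs match on the device.
    return log.ids_seen_since(cutoff)
```

In a centralised design like the NHS app’s, the uploaded log is what lets the server both notify contacts and analyse spread patterns.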

But the NHS’ “centralised” approach has come under fire for exposing users to privacy risks and, as a result, potentially making people less willing to use the software.

Matters were not helped for privacy campaigners when it was confirmed that GCHQ had been granted extra powers to obtain security data from NHS systems, in order to better protect it from outside threats.

The NHS app processes anonymised data on a central server, allowing the NHS to track trends in the way the virus is spreading and to detect hotspots.

That approach contrasts to the “decentralised” approach adopted by many other countries, where all data processing is carried out on the devices themselves.

Apple and Google are developing a decentralised technology that is to be built into iOS and Android devices, and the method has been widely adopted across Europe and elsewhere.

France and Japan are two notable exceptions, with both countries opting to employ centralised servers.

Australian concerns

But now security researchers in Australia have warned of wide-ranging security flaws, and said the problems pose risks to users’ privacy and could be abused to prevent contagion alerts being sent.

In their report, the researchers detailed a number of problems with the app.

“The following security analysis was conducted via a static analysis of the released source code for the UK’s Covid-19 Contact tracing Android app and an evaluation of high-level design documents,” wrote the researchers.

“An earlier version of this report was shared with the NCSC on 12 May 2020,” they said. “We would like to thank NCSC for their rapid response to our report and the constructive dialogue that has taken place since.”

The problems found by the researchers include weaknesses in the registration process that could allow attackers to steal encryption keys. This could prevent users from being notified if a contact tested positive for Covid-19, or could be used to generate false alerts.

Another problem stems from the fact that contact data is stored unencrypted on handsets, where it could potentially be accessed by law enforcement to determine when two or more people met.

Last week EU officials warned that contact-tracing apps should only be used during the pandemic and should be automatically deactivated once the crisis is over.

The UK however is no longer a member of the EU.

Another issue the researchers discovered is that the app generates a new random ID code for users once a day, rather than once every 15 minutes, as is the case in the rival model developed by Google and Apple.
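Why the rotation interval matters can be shown with a toy model. The sketch below is an assumption-laden illustration (the key-derivation scheme and names are invented, not the NHS or Apple/Google design): a device that broadcasts the same ID all day can be linked across every place it is seen that day, while a 15-minute rotation limits such linking.

```python
# Toy illustration of broadcast-ID rotation; the derivation scheme is
# invented for this example and is not any real app's protocol.
import hashlib

def broadcast_id(device_secret: bytes, epoch_seconds: int,
                 rotation_seconds: int) -> str:
    """Derive the ID broadcast during the rotation window containing this time."""
    window = epoch_seconds // rotation_seconds
    return hashlib.sha256(device_secret + window.to_bytes(8, "big")).hexdigest()[:16]

DAY = 24 * 60 * 60
secret = b"example-device-secret"

# Daily rotation: a morning and an evening sighting share one ID,
# so an observer can link them to the same device.
assert broadcast_id(secret, 9 * 3600, DAY) == broadcast_id(secret, 21 * 3600, DAY)

# 15-minute rotation: the same two sightings use different IDs.
assert broadcast_id(secret, 9 * 3600, 900) != broadcast_id(secret, 21 * 3600, 900)
```

The shorter the rotation window, the less an eavesdropper who records broadcasts at several locations can infer about a single user’s movements.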

“The risks overall are varied,” Dr Chris Culnane, the second author of the report, told BBC News.

“In terms of the registration issues, it’s fairly low risk because it would require an attack against a well protected server, which we don’t think is particularly likely.

“But the risk about the unencrypted data is higher, because if someone was to get access to your phone, then they might be able to learn some additional information because of what is stored on that.”

NCSC technical director Ian Levy wrote a blog post in which he thanked Australian cryptographer Vanessa Teague and Dr Culnane for their work, and promised to address the issues they identified.


Author: Tom Jowitt