The NHS has updated the data protection impact assessment (DPIA) for its coronavirus contact-tracing app to include a formerly redacted ‘risks’ section. It scores itself a self-congratulatory low-to-medium risk on all the potential privacy issues raised, indicating that NHSX doesn’t need to seek further ICO consultation on the app.
According to the DPIA, the greatest privacy risks faced by the app – although still assigned a benign shade of amber – are “malicious or hypochondriac” false symptom reporting and children accessing the app and plugging in incorrect symptoms. This would result in people receiving incorrect alerts that they’d been in contact with someone experiencing coronavirus symptoms.
However, as Michael Veale, lecturer in digital rights and regulation at UCL, points out in an analysis, this risk should perhaps be rated more seriously given that it would result in people being inaccurately told they need to remain indoors for two weeks. “Personally, I think it’s hard to imagine risks to rights or freedoms of higher impact than quarantine,” Veale wrote on Twitter.
He also flags another potential issue: “People could use the self-reporting feature on their phones to keep individuals trapped in abusive relationships under lockdown, exacerbating domestic violence and abuse.”
On data retention, the DPIA states that without a retention policy there is a risk data could be held for longer than necessary, in breach of GDPR. Addressing this, the document's risk section says personal data cannot be held longer than necessary, but that an appropriate time limit for Covid-19 data has not yet been set, given uncertainty about the duration of the crisis.
Worryingly, it also says that while it will ensure that “information processed within the NHS COVID-19 App cannot be identified”, there may be requests to process data from the app for research purposes which “may be linked with identifiable data” (emphasis added). It says that such requests will be subject to further oversight and approval.
The DPIA deems malicious attacks on the backend architecture and the re-identification of individuals through linking with other data as low risk due to technical security protections incorporated into the architecture.
Strangely, “lower than expected public trust at launch” is rated as a low yet possible privacy risk, attributed either to “inaccurate reports” about the app or to the high-profile debate over the relative merits of centralised versus decentralised designs. In response, an extensive comms plan is proposed, promoting the app's ethics council, source code, DPIA and privacy notice.
Veale suggests that some risks have been skipped entirely, including the risk of people identifying those who report positive symptoms of coronavirus – something that can be done (albeit fairly arduously) in both centralised and decentralised versions of the app. “It is quite strange that the UK’s ‘solution’ to this risk is to tell the Secretary of State about it,” writes Veale.
He asserts that several of the risks highlighted in both the DPIA and his own analysis should be assigned at least amber or red ratings. “If not… what exactly is a red risk? Is it a fiction? Do red risks exist?” he writes. A ‘red’ categorisation would merely mean that NHSX would have to consult the regulator under Article 36 of the GDPR – not that the system couldn’t be lawful.
The Joint Committee on Human Rights has called for new legislation to be drawn up around the governance of the app, but if this sunny impact assessment is to be taken at face value there will be absolutely no need for that.