The NHS privacy notice for the Covid-19 test and trace programme says that data it gathers – including names, addresses, dates of birth and symptoms of individuals and those they have been in contact with – will be stored for 20 years, available to anyone with a role in the Covid-19 response, and not necessarily subject to requests for deletion.
Legal experts have raised concerns over a loosely defined passage in the notice concerning who may process the data, which will apparently be “seen by those who have a specific and legitimate role in the response”. “That vaguery over ‘the specific and legitimate role’ is a real concern, because the NHS has made clear they’re working with third party companies like Google and Palantir on their data store – so do those companies have a specific and legitimate role?” asks Ravi Naik, human rights lawyer and co-founder of the data rights agency AWO.
Under new legislation introduced at the beginning of the pandemic, NHS bodies are at liberty to share sensitive patient data with any third parties deemed necessary to respond to the crisis, which may apply to the test and trace dataset too.
The notice says that the purpose of the data is to control the spread of the virus, meaning the NHS probably wouldn’t be able to use it as a proxy for something like immunity passports. “But that’s not to say that some third party couldn’t take that data and create the immunity passports,” points out Naik.
Experts baulked at the 20-year retention period, given that GDPR stipulates data should not be retained for longer than necessary. The notice reads that “the information needs to be kept for this long as may be needed to help control the spread of coronavirus, both currently and possibly in the future”. “Show the legal basis for that please,” tweeted privacy expert Pat Walshe.
Privacy concerns have been circulating over the NHSX contact tracing app because of its centralised design, which means data is stored in a database run by the government. There are concerns over whether identifiable data could be repurposed for other uses, such as law enforcement activities, and over whether data could be sold on to third parties, such as US healthcare conglomerates.
These concerns about the app have been met with assurances that the data will be pseudonymised – the Data Protection Impact Assessment for the app even made repeated references to the data being “anonymous” and claimed that the app “preserves anonymity”. A blogpost about the app’s security written by Ian Levy, director of NCSC, an arm of GCHQ, repeatedly uses the word “anonymous” (albeit defined in a “security sense” rather than by the legal definition) to describe data that is collected and processed by the app.
It appears that the app will be integrated into the much wider manual test and trace programme, with the former TalkTalk CEO Dido Harding, who is leading the programme, recently calling the app the “cherry on top” of the wider regime. Public Health England maintains that the two data sets will be kept separate, but if the app is intimately linked to – and a structurally integral piece of – the test and trace programme, it’s unclear how data obtained through the app and information obtained through the test and trace programme won’t functionally be combined.
If it’s all being put to the same purpose, how can the (always dubious) claims of pseudonymity shielding one subset of the data be reconciled with the personally identifiable data retained by the wider test and trace programme? “It’s noticeable for us that the data protection impact assessment, which was published for the app, has not been published for test and trace, and there are similar data protection concerns,” says Naik.
Experts have also quibbled over the test and trace privacy notice’s reference to “personal identifiable information” (PII), a term which does not appear in either GDPR or the UK’s Data Protection Act, but is a term more commonly used under US law. This is likely to fan the flames of worries about the degradation of data laws following Brexit and further privatisation of the NHS.
“My concern is that it’s a misunderstanding of what pseudonymous and anonymous data is,” says Naik, mentioning that “the DPIA […] for the app had an over reliance on ‘anonymity’, when actually that data is probably not anonymous”. “Here […] there’s no definition or use of ‘anonymity’ or ‘pseudonymous data’, it just uses this phrase ‘personal identifiable information’.”
This article has been updated to reflect comment from Public Health England.