In a move designed to create the impression of radical transparency, NHSX finally released the source code for the UK’s Covid-19 tracing app last week. However, among experts, the reception was mixed.
Aral Balkan, a software developer at Small Technology Foundation, criticised the fact that only the source code for the iOS and Android clients had been released, not that of the server. “Without that, it’s a meaningless gesture,” he wrote on Twitter. “Without the server source you have no idea what they’re doing with the data.”
In further analysis, Balkan said the iOS app was “littered with Google tracking” and had been built using Firebase, a Google mobile and web application development platform. He also pointed out that Microsoft analytics tracked what information a user had provided, such as their partial postcode and collected contact events.
Addressing these concerns, a Department of Health and Social Care spokesperson said in a statement: “Our priority is to maximise the public health value of the app to save lives and protect privacy in line with the law – no personal data is being shared with third parties.
“To make sure the app is providing users with the best possible experience, we’ve included industry standard software to help us understand how people interact with the app to allow us to make improvements before it is rolled out nationally.
“The software takes no personal or identifying data from the device and will be removed after the Isle of Wight trial.”
However, Balkan told NS Tech that “regardless of what data (and metadata) is visible to those companies, this is just not a decision that a developer who cares about privacy would make”, describing Google and Microsoft as “surveillance capitalists”.
Analysis by the app data company Reincubate found that the NHSX app seemed to conform to the privacy protocols laid out in a blog post by NCSC technical director Ian Levy. “We can confirm that no key data leaves the user’s device until they report symptoms, and only then do the anonymised keys of devices it has been in close proximity to leave the device. The first half of the user-entered postcode is sent to the API as part of the registration service,” the firm wrote on its website.
It concluded that there was little privacy risk until the user reported symptoms in the app. “However, at such time as multiple users come into contact, report symptoms and arrange tests using the identifier that the app has provided them, it may or may not be possible to centrally tie them together. That’s a back-end question […]” the post continued.
The NHSX tracing app’s Data Protection Impact Assessment (DPIA) was also released last week. Privacy experts considered the DPIA an important element in legitimising the app. At a human rights committee meeting last week, Information Commissioner Elizabeth Denham admitted that the ICO had not yet seen a DPIA.
In an email last week, the ICO pointed out that it was not legally required to review the DPIA (a review is only required by law when an organisation’s DPIA identifies a high risk that it cannot mitigate). The ICO said that, at the request of NHSX, it had agreed to carry out an informal review of the document (not to be confused with approval).
The DPIA says that while the data processed by the app does not reveal users’ identities, the app uses a set of unique identifiers relating to an identifiable person for whom consequences arise from using the app, so it treats the data as “pseudonymised”. Although a person cannot be identified directly from their data, the data therefore qualifies as personal data, and GDPR applies.
In accordance with the law, personal data will not be kept in the central database for longer than is necessary. However, the document states that “The exact retention period for data that may be processed relating to COVID-19 for public health reasons has yet to be set […]” because the duration of the pandemic is uncertain. Instead, it says, “the necessity to retain the data will be routinely reviewed by an independent authority (at least every 6 months)”.
It also points to the use of the “fully anonymised [data] for public health planning, research or statistical purposes, if those purposes can be achieved without the use of personal data”.
Experts were not impressed by the document. Michael Veale, Lecturer in Digital Rights and Regulation at UCL, wrote in an analysis of the DPIA that it “highlights a range of significant issues which leave the app falling short of data protection legislation”. Veale contends in his analysis that the document “consistently misuses” the word “anonymous” or claims that the app preserves anonymity.
“These statements are legally misleading, and contradictory to later admissions in the DPIA,” Veale writes. “The NHSX app does not preserve the anonymity of users, as it primarily processes pseudonymous, not anonymous, personal data. Anonymous information is only that which is not personal data.”
Veale also bristled at the redacted “risk” sections of the document. “The published DPIA is incomplete, as the application and platform risks have been redacted,” he writes. “Whether these have been assessed properly therefore cannot be scrutinised by the public.” He argues that the report should, at the very least, include risks from adversaries other than the data controller.
This piece has been updated to clarify that NHSX is responsible for producing the Data Protection Impact Assessment.