The concerns surrounding the UK’s “wobbly”, technologically quirky and legally dubious coronavirus contact-tracing app are manifold. But one that comes up repeatedly is that of privacy, and the creeping state surveillance the app could unwittingly usher in. It’s an issue that’s been raised by academics and privacy experts, by the Joint Committee on Human Rights (composed of a number of high profile MPs), and by a thousand online articles. But in the case of the UK, this fear may be somewhat misplaced. That’s because the horse has bolted; we already live in a surveillance state.
Anxiety over ‘mission creep’, and the possibility that law enforcement and intelligence agencies could someday get their hands on the tracing app’s data were stoked by the revelation that the National Cyber Security Centre, a division of GCHQ, played a role in developing the app. But when GCHQ said it had “no desire” to access the data collected by the app, it might have been telling the truth. “It is quite likely that in terms of technical capacity GCHQ could already acquire much of the data the government might want to acquire [through the app] via technical capacity, e.g. warrants, to acquire GPS data for movements,” says Lilian Edwards, professor of law, innovation and society at Newcastle University.
Objections to app-based surveillance often seem to overlook the fact that the UK is one of the most surveilled and snooped-on nations on earth. When exposing the state-sponsored spying of the Five Eyes intelligence alliance (comprising the UK, US, Australia, Canada and New Zealand), Edward Snowden believed that once the truth was out, people would revolt. Instead, the invasive comms-combing powers were written into UK law – now known as the Investigatory Powers Act (IPA) or, more commonly, the Snooper’s Charter.
Under this act, UK intelligence agencies can access bulk communications and personal data, carry out targeted and bulk hacking of devices, and even tell UK-based communication services to remove the encryption from data. Did you know that your internet provider is legally obliged to save your entire browsing history for a year and that, in addition to intelligence and law enforcement, myriad public bodies including the Gambling Commission and the Pensions Regulator can request to view it? London routinely ranks as one of the most surveilled cities in the world, bristling with CCTV cameras and, more recently, facial recognition cameras too. “The app is a minor drop in the ocean compared to what already exists,” says Ray Corrigan, a senior lecturer specialising in cybersecurity and digital rights at the Open University.
Despite this, there is good reason – perhaps all the more – to push for a solid legal foundation for the tracing app. NHSX, the organisation behind the app, claims that the contact data collected is pseudonymised and that third parties won’t be granted access to it. However, later iterations of the app are intended to request increasingly personal information, such as location data. And although the data is pseudonymised, the NHS has the means to link it back to an identifiable person. The project’s Data Protection Impact Assessment (DPIA) expressly states that de-anonymised versions of the data might be used for research at a later date.
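To see why pseudonymised data is not the same as anonymous data, consider a minimal Python sketch of keyed pseudonymisation. This is purely illustrative and not the NHS scheme: the key, the identifier format and the lookup table are all hypothetical. The point it demonstrates is that whoever holds the key or the mapping table can always re-identify the individual.

```python
import hmac
import hashlib

# Hypothetical secret held by the data controller (not a real NHS key).
SECRET_KEY = b"held-by-the-data-controller"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym from an identifier via a keyed hash (HMAC-SHA256).

    The same identifier always yields the same pseudonym, which is what makes
    the data linkable across records.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The controller can retain a table mapping pseudonyms back to people, so
# "pseudonymised" data remains re-identifiable to whoever holds the key/table.
link_table: dict[str, str] = {}

def register(identifier: str) -> str:
    token = pseudonymise(identifier)
    link_table[token] = identifier
    return token

token = register("patient-12345")  # hypothetical identifier
assert link_table[token] == "patient-12345"  # re-identification by the key holder
```

Anyone without the key sees only opaque tokens, but the controller can reverse the mapping at will – which is exactly the distinction the DPIA's language about "de-anonymised versions" of the data turns on.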
The threat of data misuse is especially acute for populations that tend to bear the brunt of the most intrusive surveillance and policing powers – “religious communities, black and ethnic minority communities, poor communities,” says Corrigan. The Home Office has been found to exploit NHS data and data from schools in order to persecute immigrants in the UK. There are valid concerns around whether the app could offer another avenue to do so.
“With a lot of surveillance technologies, the purpose is defined in a very broad or very opaque way,” says Pete Fussey, professor of sociology at the University of Essex. He encountered the same issue when carrying out research into facial recognition technologies. “It’s very similar; it’s very broad,” he says. He points out that the legal frameworks used to govern these technologies are out of date. For example, the legal basis for facial recognition is currently common law. “Under that you can just leverage any information you want for any purpose – it’s entirely inadequate as a safeguard,” says Fussey.
This is why a clearly defined legal framework is required. At present, NHSX is hoping to rely solely on data protection regulation to govern the app which Fussey says “isn’t strong enough and doesn’t cover the range of harms”. ‘NHS resource planning’ (incidentally, the task that controversial data-mining company Palantir is supporting) is something the government has repeatedly said that the NHSX app’s data could be used for. Fussey speculates on whether, given the loose framing, that could perhaps include predictive policing – “whether there will be spikes in violence and demand on hospitals on a Friday night”.
An ethics board was recently set up for the app, but it can’t substitute for legal oversight. Fussey makes a comparison with the Investigatory Powers Commissioner’s Office which, although its independence and efficacy can be debated, was set up to authorise surveillance warrants and provide some oversight of how the system functions. Flaccid suggestions that the ICO or the Biometrics Commissioner’s office could do the same for the tracing app have been made, but nothing is set in stone and the app is set to be rolled out nationwide as early as next week.
“It’s completely sensible to have a large degree of scepticism over the promises that are made unless they’re independently verified,” says Fussey, pointing out that the app’s source code contradicted NHSX’s repeated promise that no third-party trackers would be included in the app. In fact, Google and Microsoft analytics were embedded in the software, something that the DHSC claims will be removed for the final product. There’s scant reassurance, too, in promises that the data won’t be passed on to third parties. The online database ‘They sold it anyway’ details all of the times that NHS Digital breached contracts by selling patient data.
The coronavirus crisis has been exploited as a way to embolden the security industrial complex all over the world. Spyware companies like NSO Group – currently embroiled in three human rights lawsuits – have sought to reap the benefit of the crisis – gaining legitimacy by rolling out Covid-19 tracking technology. In many countries, the architecture of surveillance has been fortified and security agencies have been tasked with formulating a response.
In the UK, it was announced that Tom Hurd, director general of the government’s Office for Security and Counter-Terrorism, has been tasked with setting up a Joint Biosecurity Centre and assessing the coronavirus risk level. Earlier in the crisis, the former head of MI5, Jonathan Evans, proselytised in the Times about the utility of “digital and data-driven surveillance techniques” in getting things back to normal. The analogy between the pandemic and terrorism is increasingly trotted out, but coronavirus is a public health – not a national security – crisis.
The growth of the security industry has even more pernicious implications for the current global healthcare crisis. A number of privacy experts assert that it has directly drained funds from the public purse – funds that could otherwise have been spent strengthening the public healthcare system to sustain shocks like Covid-19. “From a structural point of view, there has been a long-time trade-off between surveillance and public health in the UK,” says Corrigan. “Though a pandemic has been top of the government’s risk register for years, the relative expenditure on preparing for a public health crisis, like Covid-19, has been a pittance compared to anti-terrorist, security and surveillance capability, infrastructure and laws.”
Corrigan says this comes from the assumption that “mass invasion of privacy is the cure for terrorism […] Yet, after at least 20 years of mass surveillance, the problems it purports to solve are worse than ever”. Security expert Bruce Schneier calls this “security theatre” – the appearance of security without any material increase in safety.
The tracing app risks falling into the same category, with many signs that it won’t work as planned. However, all of this isn’t to say a truly privacy-preserving, public-health-geared app with strong legal safeguards isn’t possible. It’s just that the UK government has neither the appetite nor the will to make it happen.