The Prime Minister keeps talking about artificial intelligence and big data. Hurrah, you might say. Parliament is finally catching up with the real world.
But as is always the case, the devil is in the detail.
In our Science & Technology Select Committee report, released yesterday, we call on the government to be much bolder in ensuring the accountability and transparency of algorithmic decision making. We stress that this bold approach must apply especially where algorithms are used in the public sector, and where such decisions can have a significant impact on citizens’ rights.
The recent breast cancer screening tragedy in the NHS – where recall letters never reached patients who could have been saved by timely breast cancer screening – was probably caused by a simple algorithm. Presumably, it was a calculation based on a patient’s last screen and when a notification ought to be sent in time for the next one. In the world of artificial intelligence – where computers make and learn from their own decisions – this breast cancer algorithm is dumb. If people died because of the failure of a dumb algorithm, the risks with machine learning algorithms are stark. This shouldn’t stop us from modernising our NHS, but it’s an early warning about what’s at risk, and the care that needs to be taken.
In our related report on the use of genomics in the NHS, the Science and Technology Select Committee also heard evidence of how much work and investment is needed before the NHS can effectively use the data it holds.
Sir John Bell, author of the government’s life sciences industrial strategy, made it clear that NHS data is a global gold mine. No other country in the world has such a comprehensive data set, covering the whole of our population over such a long period of time. But here’s the problem: the data is in a mess. It isn’t in a standard format and it’s held by different organisations in different locations. And anyone who notices the continued use of pagers and fax machines in their local hospital will see how retro our NHS is.
So what’s the answer? The answer from Margot James MP, Digital Minister, and Google DeepMind, is that the private sector is best placed to help. In DeepMind’s case, NHS data was shared with the company on the basis that it would pay for its staff to clean the data up and make it usable. DeepMind would then also develop and own the resulting algorithm.
In my questions to DeepMind, I asked whether they felt they should have paid something to access NHS data. I was told that value was derived from their services in helping the NHS use its data to better help patients. In my subsequent questions to the minister, I received the same answer. My suggestion – that the NHS employs data scientists to clean up data, and data managers to get the best commercial value for access to NHS data – was rejected.
The artificial intelligence drive in the NHS is welcome, but we must recognise that this intelligence comes from the machine learning capabilities of algorithms processing huge amounts of data. Data that belongs to NHS patients. Without their data there is no intelligence. So when patient data in the NHS is handed over to commercial companies, it should be made very clear that we expect to share in any profits made off the back of using an NHS data-derived algorithm anywhere in the world.
And for those of us who take a cautious approach to the role of profit-making private companies in the NHS, the potential creep of such businesses into providing the cutting-edge technologies that deliver NHS services is a trend we ought to monitor closely.
If the Prime Minister is taking this issue seriously, she should hand data policy functions back to the Government Digital Service in the Cabinet Office, so that it has a cross-departmental remit with appropriate impact. We shouldn’t be making different decisions in different ways in different departments. And – in line with my rejected amendments to the Data Protection Bill – the Government should put the new Centre for Data Ethics and Innovation on a statutory footing in the Cabinet Office too.
And finally, I welcomed news this weekend from Matt Hancock, the Digital and Culture Secretary, that he’s pursuing the idea of age verification of children online, with a view to regulating advertising that is targeted at them. I also called for this during the passage of the Data Protection Bill, but again it was rejected. I really don’t mind if the Conservative Party seeks to claim the credit for this stuff; I just want to make sure that we get it right. I hope the team at Digital, Culture, Media and Sport takes a more cross-party approach in the future.
But this is not just about MPs. The select committees, parliament as a whole and especially those outside parliament have a huge amount of expertise and insight on these issues. I welcome the progress being made by this government, and its laudable ambitions to better regulate the online world and to transform our public services using technology. But I call on it to listen, and to act with haste but with appropriate caution, to make sure we get this right for the British people first time round.
Darren Jones is the Labour MP for Bristol North West and is a member of the Science & Technology and EU Scrutiny Select Committees. He’s the co-chair of the Parliamentary Information, Communication and Technology Forum and the Parliamentary Commission on Technology Ethics. He tweets @darrenpjones.