The Covid-19 pandemic is causing untold misery across the world. Healthcare professionals and other key workers are doing their utmost to help people through this uniquely difficult time. Their efforts, however, are being undermined by the increasing spread of harmful misinformation online. Conspiracy theories about the origin of the virus, alongside false cures, are encouraging people to damage their own health and endanger those around them.
This misinformation flows through the same networks that spread lies during elections and undermine the public’s faith in democracy. The roots of this crisis stretch back long before the current virus and result from a decade of wilful inaction. Facebook, Google and others have allowed these counterfeit truths to flourish on their platforms. At the same time, the government has failed to get to grips with the pace of technological change and the transformative challenges and opportunities it presents.
The gavotte played out between successive governments and the technology industry has gone on far too long, and its effects are beginning to undermine us all.
Technology is not a force of nature and online platforms are not inherently ungovernable. They can and should be bound by the restraints that we apply to the rest of society. The government’s proposals for a ‘duty of care’ to tackle online harms offer a positive way forward, but the Department for Digital, Culture, Media and Sport (DCMS) has taken so long to add detail to these proposals that the seriousness of this government’s continued commitment has to be questioned. The promised draft bill should be introduced now. On the current timescale we could be waiting until 2024. This is clearly not good enough. Today, the Committee on Democracy and Digital Technologies, which I chaired, has published its report including our proposals for what the bill should include.
A first step would be to hold platforms accountable for the content their algorithms promote. The sheer scale of any platform is no excuse for its failure to act against content spreading rapidly through the network or via creators with particularly large audiences. Platforms such as YouTube pay out significant sums of money to those with large followings, some of whom publish hate speech and promote conspiracy theories.
These companies argue that they have no control over the content their systems recommend, but that simply isn’t true. Platforms make any number of algorithmic design decisions without adequately testing their effects. Our committee heard evidence of Google allegedly suggesting to its users that Muslims do not pay council tax, along with credible accusations of YouTube demonetising videos produced by LGBT+ users.
Platforms should be obliged to audit the decisions they make when altering their algorithms to ensure that they are not harmful to particularly vulnerable users.
In attempting to head off regulation, platforms describe this indifference to harm as the best way to protect people’s freedom of expression online. But this fails to reflect the reality that these corporations prioritise cost reduction over enforcing even their own rules.
We heard descriptions of posts being taken down arbitrarily, and perceptions of platform bias. If they are serious about protecting free expression online, platforms must be far more transparent about what content they do take down – and why. There is clearly a growing need for a ‘digital ombudsman’ to whom individuals can appeal if they believe platforms have acted unjustly towards them.
Citizens should also be empowered to assert their rights online. Digital media literacy must be firmly embedded in the curriculum to ensure that, from a young age, children are taught how platforms work, and are equipped to become active digital citizens. The system at present is rather like showing a child how to ride a bicycle but forgetting to teach them the Highway Code. There must be requirements on companies to design platforms that increase, rather than further confuse, user understanding and choice.
Our select committee, of peers of all parties and none, is unanimous that the government should immediately push through legislation, so that digital technology supports our democracy instead of undermining it. Doing so would help restore public trust at a time when all our energies should be devoted to tackling an unprecedented range of complex challenges.
David Puttnam is chair of the House of Lords Committee on Democracy and Digital Technologies