Dan Kitwood/Getty Images

Jim Killock

Executive director, Open Rights Group

What could possibly go wrong with Theresa May’s plans to start punishing social media firms?

Theresa May has found new support for her Digital Charter plans to punish social media companies when they fail to remove abusive content. The Committee on Standards in Public Life is today explaining to Parliament how and why these companies should “do more” under the threat of sanctions.

Sounds fair enough – what could possibly go wrong?

Quite a lot actually. Let’s remember, the core complaint here is that these private companies don’t want to invest time and money finding and removing bullying remarks, hate speech and extremism.

The contention is that if the government threatens them with fines for failing to remove content within, say, 24 hours, then the companies will do more to remove it.

I think that’s pretty clearly true. If a company faces a fine, then every complaint will be dealt with, so that the fines are avoided. The problem is that platforms will do it on the cheap, so that they also don’t have to spend millions of pounds employing people to check properly.

Companies apply rules and criteria, rather than make careful judgements, when removing material. This is so that human reviewers can make decisions in seconds, rather than minutes. Rules can potentially be automated, so that computers do some of this work.

Machines can be asked to look for patterns. If, say, people from a certain part of town tend to make dodgy posts, that pattern might “help” get posts removed. Machines correlate, but a correlation is pretty much the same as a prejudice when the result is a decision about an individual.

Machines in any case are a long way from understanding context. If a comedian says “I love ISIS”, this is different from an extremist saying the same thing. A news outlet showing an ISIS video clip is different from the original video being posted by a propagandist. A machine can’t easily tell the difference, except by falling back on correlation: in other words, by applying a prejudice.

Perhaps the worst aspect is that companies’ terms and conditions are much tighter than the law. If failure to enforce a platform’s terms and conditions is the standard on which it is punished, we really are in immense trouble, yet that seems to be the intention. Should it really be the case that Facebook can be fined if a breastfeeding picture that includes a nipple isn’t removed within 24 hours?

In any case, everyone deserves due process. It’s vital that people who want to contest a removal decision can get to a court. There’s no sign this has been understood yet. Without courts as a backstop, we really will be handing justice over to corporations.