Social media firms should face prosecution if they fail to remove illegal content from their platforms, an influential ethics board has said.
In a report published today, the Committee on Standards in Public Life lends its support to Theresa May’s plans to punish tech giants for hosting extremist and abusive material.
Its chairman, Lord Bew, also urged the Prime Minister to consider introducing a law that would make it an offence to intimidate parliamentary candidates and campaigners.
“We cannot get to a point where people are put off standing, retreat from debate, and even fear for their lives as a result of their engagement in politics,” he said.
But Open Rights Group’s executive director, Jim Killock, warned that the measures could turn Facebook CEO Mark Zuckerberg into a “national censor”.
“Facebook and Twitter will censor legal material because they are scared of fines. They are the worst people to judge right and wrong. Theresa May is in danger of removing people’s right to a proper legal decision.”
Under EU legislation, social media firms are not liable for the content uploaded to their platforms because they are defined as “hosts” rather than publishers.
But the committee urged May to use Britain’s departure from the EU as an opportunity to shift liability for illegal content onto the tech giants.
Responding to the recommendations, Twitter’s policy chief Nick Pickles said “abuse and harassment have no place” on the platform, and that it is now taking action on ten times as many abusive accounts as it was 12 months ago.
Facebook said it was “making significant investments in hiring more people who understand the issues around candidate safety”, and that it would “continue to work with organisations such as The Fawcett Society and Reclaim The Internet to improve how we tackle this kind of abuse”.
Google did not immediately respond to a request for comment.
In June, Max Hill QC, the UK’s terror watchdog, criticised plans to fine tech firms that fail to remove extremist content from their platforms.
“In Germany, there was a proposal for very heavy fines to be levied against tech companies whenever they fail to take down extreme content,” he told the BBC. “Is that absolutely necessary? I’m not sure that it is.”
He added that social media firms are already working with police forces to address the issue: “I’ve sat with the relevant police unit when they identify extreme content. I’ve seen them communicating with tech companies and I’ve seen the cooperation that flows from that. It’s a question of the bulk of the material rather than a lack of cooperation in dealing with it.”
In October, a new law came into force in Germany that introduced fines of up to €50m for social media companies that regularly fail to swiftly remove illegal content.