Oscar Williams

News editor

Theresa May challenges tech firms to take down terrorist content within two hours

Theresa May is joining forces with the leaders of France and Italy to challenge tech firms to remove terrorist content from their platforms within two hours.

She will tell the UN general assembly today that social networks and search engines must “go further and faster” in tackling online extremism.

“Ultimately it is not just the terrorists themselves who we need to defeat. It is the extremist ideologies that fuel them. It is the ideologies that preach hatred, sow division and undermine our common humanity,” May will say.

Kent Walker, general counsel for Google, is representing the search giant at a meeting of tech firms and politicians hosted by May today. He told the Radio 4 Today Programme this morning that companies would not be able to tackle the problem alone.

“Machine-learning has improved but we are not all the way there yet,” he said. “We need people and we need feedback from trusted government sources and from our users to identify and remove some of the most problematic content out there.”

May has previously threatened to fine social media firms that fail to swiftly remove illegal content, a policy spearheaded by politicians in Germany.

From October, social media companies and search engines in Germany will face fines of up to €50m if they regularly fail to remove illegal content from their platforms.

But the deadlines are more generous than those proposed by May: content that is manifestly illegal must be removed within 24 hours, while material that is not clearly unlawful must be assessed within seven days.

The UK’s terror watchdog, Max Hill, cast doubt upon the necessity of fines in an interview with the Today Programme in June.

“In Germany, there was a proposal for very heavy fines to be levied against tech companies whenever they fail to take down extreme content. Is that absolutely necessary? I’m not sure that it is,” he said.

Hill noted that social media firms such as Facebook, Twitter and Google, which owns YouTube, are already working closely with police forces to address the issue.

Meanwhile, civil liberties groups have questioned whether the proposals strike the right balance between protecting freedom of expression and ensuring firms take their legal responsibilities seriously.

Jim Killock, executive director of Open Rights Group, said firms had a responsibility to take down illegal content, but that there were limits to automating removals.

“Mistakes will inevitably be made – by removing the wrong content and by missing extremist material,” he said. “Given the global reach of these companies, automated takedowns will have a wide-reaching effect on the content we see, although not necessarily on the spread of extremist ideas as terrorists will switch to using other platforms.”

He added: “There needs to be transparency about where takedown requests come from and what is being taken down automatically; there also need to be clear routes for appeal when mistakes are made.”

Ministers are due to assess how much progress tech companies have made in tackling extremist content at a meeting of G7 leaders on 20 October.