The EU has unveiled plans to ramp up the pressure on social media sites to crack down on hate speech.
Commissioners have drawn up a list of guidelines urging firms to use automation to prevent the reposting of removed content, and to take down flagged content more quickly. They plan to assess firms’ efforts to combat illegal content until May next year.
In a statement, the EU said it would consider whether “legislative measures” are needed to ensure the “swift and proactive detection and removal of illegal content online”. Such measures would likely entail the introduction of significant fines.
“The situation is not sustainable: in more than 28 per cent of cases, it takes more than one week for online platforms to take down illegal content,” said Mariya Gabriel, commissioner for the digital economy and society.
“Today we provide a clear signal to platforms to act more responsibly. This is key for citizens and the development of platforms.”
Recent research by the European Commission found that social media firms were failing to honour their commitments to remove hate speech within a day. Just 40 per cent of offending content was removed by the deadline.
The threat of legislative action comes after British, French and Italian leaders, speaking at the UN, called for firms to remove illegal content from their platforms within two hours.
The deadlines are much stricter than those passed by German politicians earlier this year. From this month, obviously criminal content must be removed within 24 hours, while material that is not clearly unlawful must be assessed within seven days.
Repeated calls for firms to automatically remove content have drawn the ire of free speech campaigners and politicians. Jim Killock, executive director of Open Rights Group, said firms had a responsibility to take down illegal content, but that there were limitations to automating removals.
“Mistakes will inevitably be made – by removing the wrong content and by missing extremist material,” he said. “Given the global reach of these companies, automated takedowns will have a wide-reaching effect on the content we see, although not necessarily on the spread of extremist ideas as terrorists will switch to using other platforms.”
He added: “There needs to be transparency about where takedown requests come from and what is being taken down automatically; there also need to be clear routes for appeal when mistakes are made.”