The European Commission is drawing up plans to fine social media companies that fail to remove extremist propaganda from their platforms.
The proposals, first reported by the Financial Times, could require firms such as Facebook, Twitter and YouTube to remove content within an hour of an alert from law enforcement.
It marks a shift in direction for the Commission. In March, Brussels introduced a voluntary system in which social media companies were asked to remove terrorist content within 60 minutes. But a spokesperson for the Commission told NS Tech that while the results have “been positive”, progress is “not sufficient”.
“We outlined a clear set of recommendations in March on illegal content and have been carefully assessing the progress made since,” the spokesperson added. “The Commission intends to present a legislative proposal this autumn to step up the fight against terrorist content online.”
The question of how to manage the proliferation of terrorist content is a matter of ongoing political debate. In Germany, tech companies face €50m fines if they persistently fail to take down illegal content. Speaking to the New Statesman earlier this month, the chair of parliament’s fake news inquiry, Damian Collins, backed the introduction of similar measures in the UK.
But some experts have cast doubt on the necessity of fines. In an interview with Radio 4’s Today Programme last year, Max Hill, the UK’s terror watchdog, said: “I’ve sat with the relevant police unit when they identify extreme content. I’ve seen them communicating with tech companies and I’ve seen the cooperation that flows from that. It’s a question of the bulk of the material rather than a lack of cooperation in dealing with it.”
Some critics of the German penalties also claim they fail to strike the right balance between protecting freedom of expression and ensuring firms take their legal responsibilities seriously. Ed Johnson-Williams, a campaigner for the Open Rights Group, told NS Tech last year that the deadlines and fines could pressure companies into removing content “when its legality is unclear”. “Meanwhile there are no incentives to ensure the companies properly analyse the context of content and leave legal content online,” he warned.
Once the proposals have been presented this autumn, the European Parliament, as well as representatives of individual member states, will vote on whether to pass the new rules into law. If they come into effect, they would represent just the latest in a series of measures taken by the EU to tame big tech.
Facebook removed 1.9 million pieces of Isis and al-Qaeda content in the first quarter of the year. Twitter and Google, YouTube’s parent company, have been contacted for comment.