Germany has started enforcing a controversial new law that imposes fines on tech giants that fail to remove illegal content from their platforms.
Sites that do not act on reports of hate speech, fake news or illegal material within 24 hours could face fines of up to €50m under the new law.
The legislation, dubbed NetzDG, was passed into law in June and came into force in October. Firms were given until the end of the year to comply.
Facebook has reportedly hired hundreds of staff in Germany to handle reports about content that breaches the new law.
The law has already come under fire from human rights campaigners, industry and the UK’s terror watchdog, at a time when Theresa May is considering similar measures for the UK.
Critics claim the penalties fail to strike the right balance between protecting freedom of expression and ensuring firms take their legal responsibilities seriously.
Ed Johnson-Williams, a campaigner for the Open Rights Group, told NS Tech the deadlines and fines could pressure companies into removing content “when its legality is unclear”.
“Meanwhile there are no incentives to ensure the companies properly analyse the context of content and leave legal content online,” he added.
Alexander Rabe, a board member of Eco, which represents the German internet industry, has also expressed concerns about the deadlines.
“It takes time to define if a complaint’s content is really illegal or not,” Rabe told the BBC.
The necessity of financial penalties was called into question in June by Max Hill, the UK’s terror laws watchdog, after May announced that the UK and France could follow Germany’s lead by introducing fines for social media firms.
“In Germany, there was a proposal for very heavy fines to be levied against tech companies whenever they fail to take down extreme content. Is that absolutely necessary? I’m not sure that it is,” Hill, formerly one of the country’s leading prosecutors of terrorists, told the Today Programme.
“I’ve sat with the relevant police unit when they identify extreme content. I’ve seen them communicating with tech companies and I’ve seen the cooperation that flows from that. It’s a question of the bulk of the material rather than a lack of cooperation in dealing with it.”