
Laurie Clarke

Reporter

How did 5G conspiracy theorists become arsonists?

Like mobile phones, 3G and 4G before it, 5G has become a nexus of paranoid, brain-frying health scare delusions. Since at least 2018, fanciful claims have circulated that the mobile technology causes flocks of birds to plummet to Earth; represents a vast mind control experiment linked up through crackling radio waves; and causes a baffling array of health problems including Alzheimer’s, cancer and autism. Coronavirus too has triggered a cascade of bogus half-truths and malicious lies, but the intensifying global pandemic has birthed a new conspiracy crossover genre blending 5G and Covid-19 fears. Spurred on by these wild claims, Brits have begun staging arson attacks on 5G phone masts. 

It’s hard to know exactly when the fluid, fast-moving theories, dragging along a multitude of conspiratorial flotsam, became specific, militant calls for action. The conspiracy spheres of 5G and Covid-19 appear to have first been hooked together by Kris Van Kerckhoven, a doctor in Belgium, who speculated on a link between the two in a Belgian newspaper published on 22 January.

Since then, the theories have multiplied and mutated, but fact-checking charity Full Fact has distilled them into two main strands. One suggests 5G suppresses the immune system; the other claims the virus somehow uses the network’s radio waves to accelerate its spread. More recent (and even more outlandish) theories suggest that the radio waves can spontaneously drag oxygen from the lungs and boil blood in the veins. All are unfounded.

But that hasn’t stopped them sparking the imagination of millions online. Facebook has removed some of the groups calling for arson attacks on 5G masts in the UK – including the subtly named 5G TOWER FIRE COMP – at the behest of users. However, one of the biggest anti-5G UK groups remains, and has added more than 20,000 members to its ranks over the past week, according to analysis by First Draft News, a project that fights online disinformation. The group features posts identifying 5G masts with location data, under which users comment “you know what to do” and post a frenzied series of fire emojis. An even more militant offshoot of the group carries the suffix #ItsWar. This militaristic framing has been adopted widely, with other local group names proclaiming areas “under attack” from 5G technology.

A widely shared video features a man with a broad Liverpudlian accent going to inspect a 5G mast and declaring his intention to burn it down, before cutting to close-range footage, shot later that evening, of the tower in flames. Another popular video shows a woman interrogating engineers installing 5G equipment, asking why they are complicit in “murdering people”.

“A lot of the admins of the UK Facebook groups are the same,” says Alastair Reid, deputy editor of First Draft News. “You see the same faces popping up, and some of them are connected to other individuals and organisations in Europe.” One of these figures is John Kuhles, a Dutch ‘UFO researcher’ and alternative news figure who runs a number of websites and Facebook pages, including the UK’s biggest anti-5G page. 

But although some people are clearly attempting to influence the public debate, it’s trickier to say whether paid-for, inauthentic sock puppet campaigns are getting in on the action. That is because social media platforms tend to keep quiet about such activity, carrying out their own investigations and publicising the results only later.

Twitter researcher Marc Owen Jones downloaded around 22,000 Twitter interactions from 1-4 April and searched for anyone mentioning the terms “5G” and “Corona”. He found the most interactions around Wiz Khalifa (one of a number of celebrities who have taken it upon themselves to spread these conspiracies) tweeting “Corona? 5g? Or both?” Another popular interaction came from @deepstateexpose, an account with more than 300,000 followers that preaches that 5G is the cause of coronavirus. He found that right-wing, pro-Trump accounts – for example, those featuring the words MAGA and KAG in their bios – made up a large proportion of those pushing these types of theories.
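For readers curious how this kind of analysis works in practice, the sketch below shows one simple way to filter a downloaded batch of tweets for posts mentioning both terms and rank the accounts with the most engagement. It is an illustration only: the file name, column names and engagement measure are assumptions for the sake of the example, not Jones’s actual dataset or method.

```python
# A minimal sketch of the keyword analysis described above: filter a downloaded
# set of tweets for posts containing both terms, then rank authors by a rough
# engagement score. "tweets.csv" and its columns ("author", "text", "retweets",
# "likes") are assumptions for illustration only.
import csv
from collections import Counter

KEYWORDS = ("5g", "corona")  # both terms must appear in a tweet


def matching_tweets(path):
    """Yield rows whose text mentions every keyword (case-insensitive)."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = row["text"].lower()
            if all(word in text for word in KEYWORDS):
                yield row


def top_accounts_by_engagement(path, n=10):
    """Return the n accounts with the highest retweet + like totals."""
    engagement = Counter()
    for row in matching_tweets(path):
        engagement[row["author"]] += int(row["retweets"]) + int(row["likes"])
    return engagement.most_common(n)


if __name__ == "__main__":
    for author, score in top_accounts_by_engagement("tweets.csv"):
        print(f"{author}: {score}")
```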

Research group EU vs Disinfo has also found evidence that mainstream, Russian state-funded, pro-Kremlin channels such as Sputnik and TV Zvezda have been involved in spreading a number of coronavirus myths (that the culprit is China, or perhaps the UK; that Bill Gates or George Soros is involved). A separate New York Times report from last year warned that Russian disinformation campaigns were exploiting 5G health fears, likely in an effort to stall the technology’s roll-out so that Russia could catch up on it.

Should we be surprised that online conspiracy theories have provoked real-life criminal action in the UK? “With conspiracy theories in general, they seem to bloom in a time of crisis – with political change, or a terrorist attack,” says Daniel Jolley, a social psychologist at Northumbria University. “They really come to the forefront when we feel uncertain and we feel threatened.” Conspiracy theories also play to confirmation bias – if you believe that 5G is dangerous and will cause deaths, and deaths suddenly start occurring, it’s tempting to ascribe the outcome to your already-held beliefs.

“We know that conspiracy theorising has been related to hostility and the desire to protect oneself and one’s group,” says Jolley. “Those who believe in conspiracies are more accepting of violence against those who are perceived to be conspiring.” Of course, this doesn’t mean everyone who believes in the 5G conspiracy would go and set a phone mast alight. “Maybe you’re already predisposed to aggression; maybe you are high in authoritarianism, and that may make you more susceptible to want to go and take action,” says Jolley. Conspiracy beliefs have provoked violence before: in December 2016, a man fired a military-style assault rifle inside a popular Washington pizzeria in the belief, based on the “pizzagate” conspiracy theory, that he was rescuing children trapped in a sex-slave ring.

Conspiracy theories that spread on public channels are visible to researchers, but the spread of misinformation across private WhatsApp groups is far harder to track. “It’s like a black box container – it’s hard to be able to accurately assess and see what kind of activity is happening there,” says Reid. Slough MP Tanmanjeet Singh Dhesi tweeted that he’d “received countless conspiracy theory WhatsApp messages linking 5G to coronavirus”, while a great number of tweets bemoan the fact that older relatives, especially mums, dads and aunties, are being hoodwinked into believing 5G conspiracies about coronavirus by viral messages and voice notes shared on WhatsApp. “My dad just sent me a video on WhatsApp that the cause of Coronavirus is 5G… God please give me patience,” reads one tweet. AFP FactCheck, the fact-checking arm of Agence France-Presse, has identified and debunked 140 different myths circulating on WhatsApp.

“Messenger apps have played a role in propagating misinformation for some time now,” says Lisa-Maria Neudert, research assistant at the Oxford Internet Institute. “In a Messenger group chat, you often know the other group members, which frequently comes with trust. You may not trust the conspiracy theory a stranger shared, but if your dear friend from school shared a bogus story, you might be more inclined to read and believe it.” This may explain the auntie effect: research indicates that older people can be more susceptible to misinformation spread online, because a radical shift in how news is distributed has left many of them without online literacy skills.

WhatsApp has taken some steps to address this. In 2018 it began tagging forwarded messages as “forwarded”, which it claims has reduced the number of forwarded messages by 25 per cent. It also capped the maximum group size at 256 members and, in an effort to quell Covid-19 disinformation, has introduced new restrictions. Now, if a user receives a frequently forwarded message – one that has been forwarded more than five times – they can only forward it on to one chat at a time; the previous limit, introduced in 2019, was five. In an article for the Columbia Journalism Review, researchers proposed further ways for WhatsApp to control the spread of fake posts without breaking encryption, including temporarily restricting virality features for users suspected of spreading misinformation. The researchers hypothesised that, despite not being able to see the content of messages, WhatsApp may be able to trace the source of viral messages and flag them as suspicious through other data it tracks.
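To make the forwarding rule concrete, here is a minimal sketch of how such a limit could work, assuming a simple per-message forward counter. It models the policy described above – ordinary messages can go to up to five chats at once, “frequently forwarded” ones to only one – but it is not WhatsApp’s actual implementation, and the names and structure are assumptions.

```python
# Simplified model of the forwarding limit described above; the thresholds and
# names are assumptions for illustration, not WhatsApp's real code.
from dataclasses import dataclass

FREQUENT_THRESHOLD = 5   # more forwards than this marks a message "frequently forwarded"
NORMAL_CHAT_LIMIT = 5    # ordinary forwards: up to five chats at once (the 2019 limit)
FREQUENT_CHAT_LIMIT = 1  # frequently forwarded messages: one chat at a time


@dataclass
class Message:
    text: str
    forward_count: int = 0

    @property
    def frequently_forwarded(self) -> bool:
        return self.forward_count > FREQUENT_THRESHOLD


def forward(message: Message, chats: list[str]) -> list[str]:
    """Return the chats the message is actually delivered to under the limit."""
    limit = FREQUENT_CHAT_LIMIT if message.frequently_forwarded else NORMAL_CHAT_LIMIT
    delivered = chats[:limit]
    message.forward_count += 1
    return delivered


if __name__ == "__main__":
    viral = Message("5G causes it!!", forward_count=7)
    # Only the first chat receives the frequently forwarded message.
    print(forward(viral, ["family", "school friends", "neighbours"]))  # -> ['family']
```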

In light of the recent arson attacks on 5G masts, the UK government has said that it will be working with social media companies over the next week to discuss how to prevent the spread of this kind of misinformation, potentially putting pressure on these platforms’ commitment to free speech. YouTube has already taken steps to stop recommending this content to users, a tactic it claims decreases viewership by 75 per cent. In a statement, Facebook said: “Under our existing policies against harmful misinformation, we are starting to remove false claims which link COVID-19 to 5G technology and could lead to physical harm.” Twitter said: “We will continue to take action on accounts that violate our rules, including content in relation to unverifiable claims which incite social unrest, widespread panic or large-scale disorder.”  

But the way these platforms reward more emotive and sensationalist content means it will be hard to curb the spread of clickable yet entirely false posts. “Unless that is fundamentally addressed, it’s always going to be inherent to social media,” says Reid.

There are other factors to consider in how misinformation gains a foothold. Jolley points to a study that found that the conspiratorially minded are likely to shun health products endorsed by the government, but will embrace the same product if it’s approved by someone considered an underdog. “The source really matters and if it’s a similar-minded, trusted messenger, or somebody perceived to be low in power, that could really change the same piece of information,” he says.

This indicates the difficulty of addressing the problem solely by tackling the abundance of misinformation out there. Of course, people shouldn’t be able to stumble down fake news rabbit holes so easily, but for the conspiracy-minded or converts to a particular theory, the clamp-down on information can just serve as confirmation that they’re onto something. “If there wasn’t a connection, then why would they want to hide it?” reads a typical post I came across.

This article was updated on 8 April to reflect WhatsApp’s new restrictions on forwarded messages in response to Covid-19 disinformation.