Home Affairs Committee grills social media giants

The Home Affairs Committee met in Parliament yesterday and heard from senior spokespeople from social media companies including Google (and therefore YouTube, which it owns), Facebook and Twitter.

The idea, as widely reported in today’s papers (for example the Guardian), was to hold the companies to account for what appears on their networks. Chairing the committee, Yvette Cooper pulled no punches at the outset, pointing out that Nazi images and pro-Nazi videos were still on YouTube in spite of the company’s avowed stance against hate content. Her question was economical in the extreme: “Why?”

The full video of the committee meeting is available here, and the detail plus follow-ups are all over the press. For the IT professional it raises a question: whether on public social media or on a company’s own network, who is responsible for the content, the poster or the network owner? And if a company uses a third party for its chat functions, perhaps Microsoft’s Yammer, Slack, Convo or any of the other alternatives, who should actually decide what’s acceptable?

Social media and shading

Ironically, in the light of the committee’s opening gambit, the problem may not be pro-Nazi content. It is almost universally reviled, very few people actually want it, and mechanisms for its removal are in place. Although removal depends on users reporting the material rather than proactive detection, you can be certain that someone, somewhere, is working on ways to keep it out of view.
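None of these companies publishes its moderation pipeline, but the report-then-review model described above can be sketched roughly as follows. This is a purely illustrative sketch; the class names, the threshold and the decision callback are all invented for the example.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a report-driven takedown queue.
# Real platforms' pipelines are not public; names and thresholds are invented.

REPORT_THRESHOLD = 3  # number of user reports before a human moderator sees the item


@dataclass
class Post:
    post_id: str
    author: str
    text: str
    reports: int = 0
    removed: bool = False


@dataclass
class ModerationQueue:
    pending: list = field(default_factory=list)

    def report(self, post: Post) -> None:
        """Record a user report; queue the post for review once enough arrive."""
        post.reports += 1
        if post.reports >= REPORT_THRESHOLD and post not in self.pending:
            self.pending.append(post)

    def review(self, decide) -> None:
        """A human moderator applies an editorial decision to each queued post."""
        while self.pending:
            post = self.pending.pop(0)
            post.removed = decide(post)
```

The weakness the paragraph above points to is built into this shape: nothing happens until enough users report a post, so objectionable material stays visible in the meantime.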

Network administrators, on social media and elsewhere, have a more difficult job when one person regards something as extreme and another disagrees. The rise of Donald Trump (which is becoming a bit of an old chestnut, but it’s a valid example) illustrates this well. Calls to restrict the movement of people from Muslim countries would, only a couple of years ago, have appeared extreme. Now, even if someone regards the policy as unreasonable, there is a legitimate reason to discuss it.

Somewhere, someone has to decide what’s reasonable and what isn’t: an editorial decision the networks are reluctant to take, as they see themselves as open forums rather than controlled environments. The problem, increasingly, is how to sustain this stance during an epidemic of “fake news” (and a US president who uses that term to describe anything he doesn’t like) and heightened tension online.

The answer may eventually be that the Internet can’t be as free and easy as had been originally envisaged. Certainly the tone of yesterday’s debate suggests the UK government, at least, would welcome firmer controls.