
Bernie Hogan

Senior research fellow, Oxford Internet Institute

Was Facebook’s decision to cut off third-party developers genuinely virtuous?

Many of us who didn’t grow up with Facebook remember that magical moment when the site first recommended a friend we hadn’t spoken to in years. It was all people talked about when joining the site. Facebook achieved its market dominance at a time when the world was digital but not quite connected. Fifteen years ago, we still didn’t use real names for personal matters online and certainly wouldn’t get into a stranger’s car that we hailed from the internet (Uber) to go on a date with another stranger met on the internet (Tinder). We had the web, but not the social utility to connect the real people behind so much of this connectivity.

Enter Facebook. What is Facebook exactly? A media company, a data warehouse, or, as the site itself used to say on its front page, a social utility? Indeed, it started out as a new sort of phone book for the internet age. Its homepage in 2005 said “Facebook is an online directory that connects people through social networks at colleges”. A year later, the homepage stated “Facebook is a social utility that connects you with the people around you”. Facebook was value added to one’s existing social life.

In the early days not only did Facebook call itself a social utility, but it seemed like it would be one. In 2004, it wasn’t clear exactly what was going to happen when Zuckerberg and his cofounders stumbled upon the winning formula for collecting and representing people and their social connections. Yet over time Facebook demonstrated that it had the edge over local competitors. Facebook had an austere, consistent, ‘internet blue’ aesthetic. Facebook pioneered the algorithmically curated newsfeed. Facebook allowed apps to connect to it, making identity management easier on the web.

Nowadays, the word ‘utility’ has been all but scrubbed from Facebook’s press material. It’s not because Facebook stopped being a utility, but because utilities get regulated in America in ways that media companies don’t. And Facebook does not want the same fate as the big regulated monopolies before it: the Bell System in the 80s and AOL at the turn of the millennium. In 2001 the FCC made a regulatory decision that AOL Instant Messenger had to be interoperable with other messaging programs because of its market dominance. You could use Apple’s iChat to speak with someone on AIM, for example. Without this decision, we might all be using an outdated, ‘too-big-to-fail’ AOL instead of Facebook.

If we think of Facebook as a utility, one that may hold a near-monopoly on being the place where people use their ‘authentic identity’ with friends (Facebook’s words, definitely not mine), then regulation becomes critical. And what should be regulated is not the ‘privacy’ of information, but its distribution and its use. We aren’t trying to hide things from our friends by posting them on Facebook; we are trying to make use of our content to forge and maintain connections with friends and family.

For a while it seemed Facebook thought it couldn’t do it alone. In 2006 Facebook opened up its API so third parties could innovate on this data in ways that Facebook itself would neither imagine nor have the resources to pursue. I was one of those third parties. I visualized people’s social networks with them and learned whether this made the site more useful (it does).
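The kind of third-party analysis described above can be sketched in a few lines. This is purely illustrative: the data is invented and nothing here calls any real Facebook endpoint. The core trick in ego-network visualization is to remove the focal person (the ‘ego’) and see which clusters of friends remain connected; each cluster roughly corresponds to a social context.

```python
# Illustrative sketch of ego-network clustering, as a third-party app
# might have done it. All names and data are invented for this example.

def ego_clusters(friendships, ego):
    """Remove the ego, then return the connected components among the
    ego's friends. Each component is roughly one social context."""
    friends = set(friendships.get(ego, ()))
    # Adjacency restricted to the ego's friends, with the ego removed.
    adj = {f: friends & set(friendships.get(f, ())) for f in friends}
    seen, clusters = set(), []
    for start in friends:
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:  # depth-first search within the friends-only graph
            node = stack.pop()
            if node in component:
                continue
            component.add(node)
            stack.extend(adj[node] - component)
        seen |= component
        clusters.append(component)
    return clusters

# Invented example: Alice's friends fall into two separate contexts,
# because bob-carol and dan-erin know each other only through Alice.
graph = {
    "alice": ["bob", "carol", "dan", "erin"],
    "bob": ["alice", "carol"],
    "carol": ["alice", "bob"],
    "dan": ["alice", "erin"],
    "erin": ["alice", "dan"],
}
print(ego_clusters(graph, "alice"))  # two clusters: {bob, carol} and {dan, erin}
```

Rendered as a picture, those clusters are exactly the separate ‘islands’ of friends people found so compelling to see.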

More recently, Facebook learned that doing it alone gave it more control, less risk, and more profit. So for Facebook, closing the means for legitimate third-party data access was another winning decision, one that upstarts could not afford to make. Bear in mind, Twitter still allows such access. LinkedIn does too, but you must pay considerably for it.

Recall that Cambridge Analytica’s harvesting of user data was not, by itself, the issue for users or lawmakers. It’s that the data was then fed back into a broken social utility to allegedly manipulate people psychologically, using the very features (events, newsfeed ads, groups) that are now the centrepieces of all the social data that Facebook houses. Cambridge Analytica was using Facebook the way it was intended (read: designed).

By focusing on the work of Cambridge Analytica, we are actually doing the opposite of empowering users. This is the social media equivalent of Naomi Klein’s idea of the ‘shock doctrine’. Klein describes how extreme events enable policymakers to ram through otherwise unpopular policies. Facebook has now trumpeted its 2015 decision to cut off third-party developers as a virtuous one, because some bad actors might have used that data in the wrong way (which, ironically, appears to have been through deceptive Facebook ads).

But Facebook has yet to show us how to use this social data in the right way. Instead of letting us collaboratively edit our data, instead of allowing good actors the ability to use the social graph in creative ways, instead of fostering data portability, instead of being the host for an ecosystem of interesting friendship-powered ideas, Facebook is clamping down. It is privatising your social network so that it can better target advertisements to you.

It seems then that Facebook has failed us by leveraging the work of third-party innovations during its growth phase, as well as the optimism of users seeking to connect to their own social networks. Executives took advantage of these innovations and this optimism on the premise that Facebook was just a social utility. Then in the 2010s, they privatised this utility by locking down access while learning how best to advertise an endless stream of psychologically tuned and demographically targeted content. They effectively turned a social utility into little more than a marketing and media company with a messaging service, while framing it as a community. Now, you and your network get what Facebook wants you to get. And if you want to stay connected to your social network, you better 👍Like It.

Bernie Hogan is a senior research fellow at the Oxford Internet Institute. He has published peer-reviewed articles on Facebook, used data collected from the site and given talks at Facebook HQ in Menlo Park. Lately, however, he is working primarily on Network Canvas, a tool for self-reported network data collection in sensitive and privacy-aware contexts, such as the reporting of risky sexual contacts and drug use.