This general election has been described by some commentators as ‘the most digital election ever’. It’s certainly one of the weirdest. So far this election the public has had to deal with BorisWave, dead squirrels, fake fact checkers and astroturf groups pretending to be Greta Thunberg. Open Rights Group (ORG) has seen the unprecedented (we think) use of personal data by the main political parties in their quest to maximise vote share and minimise spending inefficiency. It is not a question of who is better or worse; on this issue they are all drinking the Kool-Aid.
Some have described the rapid adoption of new campaigning techniques as an arms race. To me, it is more like watching a conga line; eventually, all political parties copy the one that appears to be ahead of the curve, moving through methodologies faster and faster, without stopping to look where they are going. The political parties are drunk on data. Eventually, someone will seriously misstep, and trust in our elections will come crashing down. A lot of it, by the way, might amount to unlawful activity, particularly where personal data is concerned.
Open Rights Group thought it might be useful to ask some members of the public what they thought about these new trends in campaigning. We commissioned a poll of 28 marginal constituencies to gauge views on three digitally enabled campaigning practices: the use of targeted adverts based on analysis of people’s personal data (‘political micro-targeting’), online ads that are seen only by the recipient (‘dark ads’), and the spending of money in election campaigns without declaring where the money originated (‘dark money’). We also asked about support for various policy remedies. We picked marginal constituencies because we thought they were the most likely to be on the receiving end of these activities, and therefore more likely to have an informed opinion.
The results were stark. A majority of those we polled were aware of political micro-targeting (63 per cent). This level of awareness extended to 75 per cent amongst young people. By contrast, there was fairly little public awareness of dark ads (47 per cent), or dark money (48 per cent). This suggests that some sections of the public are slowly waking up to some of the issues raised by the Cambridge Analytica scandal, although others remain in the dark.
We then asked people whether these activities should be allowed during election campaigns. While a small majority of those polled thought that political micro-targeting and dark ads should not be allowed during elections (58 per cent and 54 per cent respectively), a more substantial majority (74 per cent) thought that dark money should not be allowed. One explanation for why this might be is that dodgy donors and tax avoidance are more familiar concepts for older voters than data use and abuse, making them more likely to hold a strong opinion. This was reflected in the demographic breakdown of the results: the older you were, the more likely you were to oppose dark money.
What was particularly striking was the policy remedy that people most supported to address these issues. We proposed a number of policies: new regulation, more transparency, better ways to verify information, tougher punishments (including criminal convictions and increased personal liability), and the ever popular ‘digital literacy’. All policies had the support of over 70 per cent of those polled. The single most popular policy, however, was tougher punishments (82 per cent), which had a strong level of support across demographics. Transparency was very slightly less popular (81 per cent), although the difference between these two results is within the margin of error.
You can draw some striking conclusions from this last round of questions. The high level of support across the board for a range of policy interventions suggests that the public are expecting reform in this area. The fact that tougher punishments and transparency are among the most popular, more so than complex regulation or digital literacy, is also significant. It suggests that people want to be empowered to make critical judgements about political communication for themselves. People value agency, rather than being told what they can and can’t politically engage with. Regulators clearly agree with them – the Electoral Commission has been calling for digital imprints on campaigning materials since 2003. The support for tougher punishments suggests that the public wants regulators to be more empowered under the law.
Data-driven political campaigning has been the canary in the coal mine for a wider discussion about what an online regulator and regulation might look like. A variety of increasingly complex proposals have been put forward across civil society, government and academia. It has become a talking shop. A quick win, however, might be to give the existing regulators more teeth. The Electoral Commission needs to be able to monitor spending in real time, to share information with other regulators such as the ICO (and vice versa), and to serve meaningful fines on campaigns that break regulations, so that fines are not priced in as the ‘cost of doing business’. I imagine that most people would be surprised to learn that the Electoral Commission cannot already do these things. These powers are all low-hanging fruit: effective interventions that could be achieved without more navel gazing by policy makers.
After the election, achieving these goals quickly should be a legislative priority. Strengthening enforcement powers is reasonably simple, uncontroversial, and largely avoids the thorny issue of regulating political speech. Otherwise, political campaigners will continue to adopt these techniques, to the detriment of our fundamental rights. Political parties have been drunk on data for a while now. It’s time to turn the lights on and tell them that the party is over.
Pascal Crowe is data and democracy project officer at the Open Rights Group