The A-level results algorithm that downgraded almost 40 per cent of students’ predicted grades could be challenged in court, say a number of experts.
Data law experts have suggested that students might be able to challenge the algorithm under GDPR – which stipulates under Article 22 that individuals have the right not to be subject to a wholly automated decision that has a significant impact on their lives.
Ofqual has denied that the A-level results were calculated solely by automated decision-making. In a “privacy impact statement”, Ofqual writes that because teachers supplied students’ predicted grades and rankings, the process involved human as well as automated elements.
However, experts have rejected Ofqual’s assessment. “This is an absurd interpretation of [Article 22],” tweeted Reuben Binns, associate professor of human-centred computing at Oxford University. “The fact that teachers determine rank order is not sufficient, because rank order is an input to the algorithm, but the algo makes the final decision. Reviewing centres is not same as reviewing individual student grades.”
Ofqual’s privacy impact statement says “some have referred to this as an algorithm or a mathematical procedure, but no artificial intelligence is used”, asserting instead that the model is a “critical tool to enable standards in qualifications to be maintained and for fairness between centres […] to be achieved”.
The automated element was the standardisation model that generated students’ grades based on a number of inputs including the centre’s historical grade distribution, the past attainment of other students sitting the same subject at the centre, the prior attainment of students entering the subject this year through the centre compared to previous years, and “the national value-added relationship mapping the historical relationship between candidates’ prior attainment and their final grade in the subject”.
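Ofqual has not published its code, but the core of the approach – combining teacher-supplied rank order with a centre’s historical grade distribution – can be illustrated with a minimal, purely hypothetical sketch. The function name, inputs, and rounding behaviour below are assumptions for illustration; the real model also drew on prior attainment and the national value-added relationship described above.

```python
# Hypothetical sketch of a standardisation-style step: distribute a
# centre's historical grade distribution across this year's cohort,
# ordered by the teacher-supplied ranking. All names are illustrative.

def standardise(ranked_students, historical_distribution):
    """ranked_students: list of student IDs, best first.
    historical_distribution: list of (grade, share) pairs, best grade
    first, with shares summing to 1.0 (e.g. past years' proportions).
    Returns a list of (student, grade) pairs."""
    cohort_size = len(ranked_students)
    grades = []
    position = 0
    for grade, share in historical_distribution:
        # Number of students this grade should cover, by historical share.
        count = round(share * cohort_size)
        for student in ranked_students[position:position + count]:
            grades.append((student, grade))
        position += count
    # Any students left over through rounding receive the lowest grade.
    lowest_grade = historical_distribution[-1][0]
    for student in ranked_students[position:]:
        grades.append((student, lowest_grade))
    return grades
```

The sketch makes the experts’ objection concrete: the teacher ranking is only an input, while the grade each individual receives is fixed by the centre’s past distribution – a strong student at a historically weak centre cannot exceed the grades that centre produced before.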
As another example of the human elements involved, Ofqual notes that exam board officers checked the model’s outcomes for individual subjects and could have rejected any that appeared anomalous.
Jon Baines, data protection advisor at UK law firm Mishcon de Reya, writes in a blogpost that “the ‘process of standardisation’ is, it seems, an algorithmic one, whereby a decision is reached by automated processing”. In response to Ofqual’s privacy impact assessment, Baines writes: “Whether this means that the Ofqual standardisation model did not involve ‘solely’ automated decision making will no doubt be determined in the various legal challenges which are apparently currently being mounted.”
Another way of legally challenging the algorithm could be on the grounds that previous pupils’ scores were used to predict those for the 2020 cohort.
The Norwegian Data Protection Authority has signalled an intention to order the International Baccalaureate Organisation (IB) to change its approach to scoring on the basis that using historical data to calculate students’ grades isn’t fair. The DPA says that the grades don’t reflect students’ individual achievement, and instead aim to predict what the pupils might have received had they sat the exam – something that’s impossible to do.
A follow-up blogpost from Mishcon de Reya suggests this could be a fruitful way to challenge the UK’s algorithm too. It points to a 1993 UK legal case which ruled that credit score company CCN (now Experian) could no longer use historical data about the credit ratings of people who had previously lived at a particular address to calculate the credit score of the current resident. The judge effectively ruled that “the principle of ‘fairness of processing’ should generally mean that third party data could no longer be used to make decisions about [the person in question]”.
The blogpost argues that the A-level results algorithm conflicts with this ruling, because it partly relied on schools’ historical performance data to calculate the present cohort’s results, meaning that pupils attending an underperforming school were hamstrung by the performance of previous students. “That’s simply not allowed,” argues the post. “And there is a good argument for saying that it hasn’t been for over 28 years.”
The ICO (the UK’s data watchdog) published a statement on the A-level exam results on 14 August that said:
“The GDPR places strict restrictions on organisations making solely automated decisions that have a legal or similarly significant effect on individuals. The law also requires the processing to be fair, even where decisions are not automated.”
However, the ICO also suggested that the A-level results might not qualify as such a case, writing: “Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades.”
Experts immediately denounced this response, saying that the ICO should be proactively investigating the situation, rather than passively accepting Ofqual’s assertion that solely automated decision making didn’t take place.
Digital rights advocacy group Foxglove is gearing up to stage legal action against Ofqual on behalf of sixth form student Curtis Parfitt-Ford. Foxglove director Cori Crider says that Ofqual’s purpose is to measure individual achievement, but that is not what the algorithm did – offering another route to legally challenging the scoring algorithm.
Ofqual has so far published only “a simplified version” of a Data Protection Impact Assessment (DPIA). A DPIA is legally required for high-risk data processing, a category the A-level results algorithm would fall into. Experts such as Pat Walshe of Privacy Matters are calling for the full document to be published, as it should show how Ofqual assessed the risks to students.
Education secretary Gavin Williamson said in an interview on Saturday that there would be “no U-turn, no change” on the A-level results. However, mounting pressure on Number 10 to follow Scotland and reject the algorithmically calculated scores in favour of teachers’ predicted grades has raised the possibility that the UK government might change tack in an announcement later today.