Lee Castleton remembers to the last penny the shortfall that appeared on his post office terminal on New Year's Eve 2003 – £1,103.68. A week later another loss emerged, this time £4,230. Then another and another. By March, the sub-postmaster was £25,000 short. "I knew from the second loss that it wasn't a mistake on my part," Castleton said.
With no way to query the Post Office's back-office systems, he called and emailed the IT helpline and its managers – 91 times. Yet all he received were instructions to perform checks he had done dozens of times, and after a few bland reassurances, his superiors stopped responding altogether.
An engineer by training, then briefly a stockbroker, Castleton had bought a post office in the northern English seaside town of Bridlington, hoping to provide a lifestyle his young family would enjoy. Instead, a High Court judgment put him out of business, ordering him to pay the Post Office the £321,000 it had spent suing him for an illusory debt. Bankruptcy ruled out a return to the stock market, so he fended for himself as a jobbing engineer, sometimes sleeping in his car, in a hand-to-mouth struggle to meet the mortgage payments on the family's flat above their now-defunct post office.
How a bug-ridden computer system led Britain's state-owned Post Office to prosecute more than 700 sub-postmasters for thefts they did not commit, and to bankrupt others, is now the subject of a public inquiry.
The episode adds to a growing global backlash over the human damage automated processes can cause. In the United States, a group of White House science advisers is calling for a "bill of rights" to guard against the injustices caused by artificial intelligence.
Much of this focuses on how AI-powered algorithms can amplify societal biases, such as female job seekers being sidelined in male-dominated fields, or Black defendants being rated by AI tools as higher risks of recidivism and receiving harsher prison sentences from judges. Yet digital injustice is not limited to AI, nor is it a new phenomenon. In 2011, the UK government apologised to the relatives of two Royal Air Force pilots blamed for the fatal 1994 Chinook helicopter crash, which campaigners claimed faulty software may have caused.
All of this raises the question of how truth is established in disputes between people's word and the supposed reliability of computers.
"When patients are injured, staff are often blamed," writes Harold Thimbleby, emeritus professor of science at Swansea University. People overlook the other suspect in the room: faulty technology, or hidden human-computer interaction. His new book, Fix IT, describes such a case. During a routine investigation, a hospital discovered a discrepancy between measurements automatically uploaded to a database from clinical devices and the paper notes kept by nurses. Believing that computers do not lie, the hospital accused its nurses of creating fraudulent patient records, and several were put on trial. Yet three weeks in, an IT support engineer from the device vendor caused the trial to collapse when he revealed, under cross-examination, that he had "tidied up" the poorly maintained database, deleting records.
In the Post Office scandal, a mix of faulty software, improper disclosure and lies, aided and abetted by a legal presumption that computers operate reliably, ruined hundreds of lives, said Paul Marshall, a lawyer who acted, pro bono, for three convicted post office workers in the Court of Appeal. For more than a decade, judges and juries trusted the word of Post Office witnesses that its Horizon accounting system, supplied by the computer specialist Fujitsu, was reliable, and inferred that sub-postmasters and sub-postmistresses must have stolen the money it recorded as missing. Yet in 2019, the disclosure of logs of errors known to have affected Horizon, and which had existed from the start, led a more curious judge to conclude that the system was "not robust at all".
The presumption of computer reliability requires anyone who challenges digital evidence to show that the computer is untrustworthy. This can be "extremely difficult" when defendants lack computer literacy and access to the systems, says Martyn Thomas, professor emeritus of computer science at Gresham College. Computers can also misbehave while appearing to work perfectly. It was a "Catch-22" that trapped post office workers, Marshall says. "They had no reason to question what the computer was spitting out, because they didn't know how it worked or its propensity to fail, and the Post Office didn't say so."
Asking the right questions is also important when email evidence is challenged. In 2018, Peter Duffy, a consultant urologist, won a constructive dismissal claim against University Hospitals of Morecambe Bay NHS Foundation Trust (UHMBT). He then published a book alleging misconduct over a patient's death, prompting NHS England and NHS Improvement to commission an external investigation.
The 2020/21 investigation surfaced two emails, purportedly sent by Duffy in 2014 as the patient deteriorated. Duffy says the emails were fabricated. Nonetheless, once they entered the record, he became implicated in the patient's poor care.
In a statement, Aaron Cummins, chief executive of UHMBT, said “two separate independent external reviews” for the investigation “found no evidence that the emails in question were tampered with and no evidence that they were not sent from Mr Duffy’s NHS hospital email account.”
Yet during Duffy's 2018 employment tribunal case, a judge ordered the trust to search for any correspondence regarding the patient's death. None of the trust's digital searches produced the disputed emails. Nor do the emails appear in information gathered by two internal NHS inquiries into the patient's death, or in responses to Freedom of Information requests made by the deceased patient's family and by Duffy himself.
"How can an organisation's cybersecurity assessment today authenticate emails purportedly sent six years earlier, not acknowledged or acted on by their recipients, and at odds with contemporaneous clinical notes and the recollections of the bereaved family?" Duffy asks.
Without commenting on Duffy's particular case, Thimbleby says that when digital searches have been carried out and a court has been told that there are no more emails to be found, "you cannot assume authenticity". There must be strong evidence that the emails existed, "like backups".
From banking apps to algorithms that inform hiring choices, computer-controlled systems have entered our daily lives in countless small ways since the first Post Office lawsuits. Yet as the reach of technology has advanced, the same cannot be said for the law's ability to deal with its failures. "You can become a lawyer without knowing anything about electronic evidence, although it is part of almost every court case," says Stephen Mason, co-author of Electronic Evidence and Electronic Signatures, a lawyers' handbook. "It really matters," Marshall said, citing the jailing of sub-postmistress Seema Misra for the alleged theft of money that the Post Office's Horizon system showed was missing. "Four times before three different judges," Marshall says, Misra requested disclosure of Horizon's error records, and was refused. A decade later, those error records led to her conviction being overturned.
In a 2020 paper submitted to the Ministry of Justice, Marshall and several co-authors recommend revising the legal presumption of computer reliability. Starting from the premise that all computer software contains bugs, they aim to forestall injustices without cluttering courtrooms with speculative challenges, such as motorists demanding software investigations into speed cameras.
The paper recommends that organisations relying on computer-generated evidence be required, as standard procedure, to disclose their systems' error logs and known vulnerabilities. For well-run organisations this should be straightforward, says Thomas of Gresham College; otherwise "the onus should be on [the organisations] to show that it was not the computer that made things go wrong".
While companies often hire computer consultants to provide expert advice in court cases, individuals can rarely afford expert fees. To reduce this inequity, Swansea University's Thimbleby suggests setting up independent computer panels to advise judges on when digital evidence can reasonably be relied on and when it cannot. "In the world I envision, people could say 'this is clearly an IT problem and we have the right to use the IT panel' and the panel would take an informed view," he says.
Had such a system been in place when the Post Office brought its actions, the Castletons might have lived a very different life. Now a factory engineer working night shifts, rather than a businessman, Castleton says he came up against a company that would not bend. "I felt like I was drowning and no one was doing anything to save me. I was just insignificant."