The Risks of Artificial Intelligence on English Common Law


On the 23rd of May, I attended Frederick Ayinde v London Borough of Haringey and Hamad Al-Haroun v Qatar National Bank QPSC[1] at the Royal Courts of Justice. The judges presiding over the matters were Dame Victoria Sharp, the President of the King’s Bench Division, and Mr Justice Johnson.

Ayinde[2] and Al-Haroun[3] exposed a growing threat to English common law. How are the courts to deal with non-existent cases generated by Artificial Intelligence?

English common law is built on precedent

English common law is built on centuries of judicial decisions. The doctrine of judicial precedent (stare decisis) binds judges to follow the decisions made by higher courts and courts of equal authority. Judicial precedent ensures consistency and predictable outcomes. It enables a lawyer to advise their client with a high degree of certainty regarding the merits of their matter.

Ayinde case

In Ayinde, a homeless man challenged Haringey Borough Council's refusal to provide him with temporary accommodation. His barrister, Ms Sarah Forey, prepared his submission for judicial review. Ms Forey was a pupil barrister when instructed in Ayinde. In her submission, she cited five case authorities that did not exist. Mr Justice Ritchie noted that the cases could not be found in any law report[4].

Al-Haroun case

Al-Haroun v Qatar National Bank involved an £89 million claim. Mr Al-Haroun, the claimant, submitted a witness statement citing 45 non-existent case authorities. He admitted to using AI. However, his solicitor, Mr Abid Hussain, had also submitted a witness statement in support of his client's claim, and it too cited non-existent authorities. Mr Hussain claimed he had relied on his client's "research" and that he was embarrassed and horrified to find that the cases were false.

Why does Artificial Intelligence make up cases? 

Large language models, such as ChatGPT, work by predicting the most statistically likely next word in a sequence rather than retrieving facts. Simply put, AI hallucinates because it is constantly making guesses. When prompting an AI tool, the generated response may appear plausible, but it is not necessarily the truth. So, if you ask an AI tool about a legal problem, the tool does not consult a legal database; instead, it predicts text that appears to be real based on patterns in its training data.
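The prediction loop described above can be sketched with a toy example. This is a deliberately crude illustration of next-word prediction in general, not ChatGPT's actual architecture, and the "case names" below are invented for the demonstration:

```python
import random

# A toy "model" that only knows which word tends to follow which,
# learned from a handful of made-up, real-looking case names.
# It has no database of actual cases, so it can only stitch
# together plausible-sounding sequences.
training_citations = [
    "Smith v Jones Borough Council",
    "Brown v National Bank",
    "Smith v National Council",
    "Jones v Borough Bank",
]

# Count which word follows which across the training text.
transitions = {}
for citation in training_citations:
    words = citation.split()
    for current, nxt in zip(words, words[1:]):
        transitions.setdefault(current, []).append(nxt)

def generate(start, max_words=5, seed=0):
    """Repeatedly predict the next word, like an LLM's sampling loop."""
    random.seed(seed)
    words = [start]
    for _ in range(max_words):
        options = transitions.get(words[-1])
        if not options:
            break  # no known continuation: stop
        words.append(random.choice(options))
    return " ".join(words)

# The result looks like a citation, but it was never checked against
# any list of real cases -- it is prediction, not retrieval.
print(generate("Smith"))
```

The generated string can easily be a combination that appears in none of the training examples, yet reads just as plausibly as the genuine ones. That, in miniature, is why a fluent answer from a language model is no guarantee that the authority it cites exists.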

AI tools are not yet designed to reliably say, "I don't know". If a concept is not in its training data, the AI will generate a plausible-sounding answer instead of admitting ignorance. Even when a user is wrong, the AI may affirm the user's false thesis. In Al-Haroun, the claimant asked an AI tool for legal references, and the tool invented them because no existing cases supported his argument.

In a statement that may resolve the debate over whether AI will replace lawyers, Dame Victoria Sharp stated that "large language models such as ChatGPT are not capable of conducting reliable legal research"[5].

The path forward

Dame Victoria Sharp and Mr Justice Johnson found that the threshold for contempt had been met in Ms Forey's case. The court held that there were two possible explanations: either Ms Forey deliberately used false citations, or she used an AI tool and was not being truthful about it[6]. However, the court decided not to initiate contempt proceedings. Instead, it referred Ms Forey to the Bar Standards Board.

In Al-Haroun, the court referred Mr Hussain and his firm to the Solicitors Regulation Authority. Dame Victoria Sharp found Mr Hussain’s explanation extraordinary. She stated that a solicitor should never outsource legal research to a client, especially one who is not legally trained.

The court warned barristers' chambers and law firms to expect the court to inquire whether they have fulfilled their leadership responsibilities in future cases.

Conclusion 

Judicial precedent relies on the authenticity of cited case authorities. When Artificial Intelligence invents a legal precedent, it threatens to erode trust in the entire system. The lesson from these cases is clear: AI cannot replace diligent legal research. Technology may assist research, but it is no substitute for human judgment and expertise. AI tools can enhance efficiency, but legal practitioners must be aware of their limitations.

References

[1] R (on the application of Frederick Ayinde) v Haringey London Borough Council; Al-Haroun v Qatar National Bank QPSC and another company [2025] EWHC 1383 (Admin).

[2] R (on the application of Frederick Ayinde) v Haringey London Borough Council [2025] EWHC 1040 (Admin).

[3] Al-Haroun v Qatar National Bank QPSC and another company.

[4] R (on the application of Frederick Ayinde) v Haringey London Borough Council [2025] EWHC 1040 (Admin) at [65].

[5] R (on the application of Frederick Ayinde) v Haringey London Borough Council; Al-Haroun v Qatar National Bank QPSC and another company [2025] EWHC 1383 (Admin) at [6].

[6] R (on the application of Frederick Ayinde) v Haringey London Borough Council; Al-Haroun v Qatar National Bank QPSC and another company [2025] EWHC 1383 (Admin) at [68].


Thomas Barry

Thomas is a final-year Law student at The Open University. After leaving the British Army, Thomas took his first steps into education by completing an OU Certificate of Higher Education in Law and French, earning a distinction which gave him the confidence to pursue a full law degree.

Thomas’ journey took a remarkable turn when he secured the prestigious Freshfields Stephen Lawrence Scholarship, aimed at increasing diversity in corporate law. Selected from 3,000 applicants, he gained invaluable experience in the legal field, ultimately deciding to specialise in corporate law. His hard work paid off when he landed a training contract with global law firm Freshfields LLP.

Thomas credits the OU for opening doors to opportunities he never imagined, encouraging others to take the leap into higher education.