Local Government Lawyer

Lisa Edmunds considers the risks, responsibilities and judicial guidance in relation to parties’ use of AI in the Family Courts.

In the family courts, there has been a noticeable increase in the use of artificial intelligence (AI) tools by parties. Unrepresented parents are increasingly turning to generative AI tools such as ChatGPT to assist with drafting documents for submission to the court. This presents a relatively new challenge for the family courts, where AI has become the tool of choice for litigants in person navigating emotionally difficult and legally complex disputes, and more reported cases on the issue are likely to follow.

AI use by litigants in person: Recent appellate authority

The recent Court of Appeal judgment in D (a child) (recusal) [2025] illustrates how AI tools are being used by parties and the courts’ evolving stance on the issue. There, a mother acting in person submitted an extensive skeleton argument, prepared with the assistance of AI, setting out grounds of appeal relating to the recusal of a judge. The document included a mixture of appropriate case references, misapplied authorities, and citations that did not exist at all. The court recorded that the mother accepted she had used AI in her preparation, and that some of the erroneous citations resulted from this.

Judicial sympathy and limits on AI reliance

Lord Justice Baker, with whom Cobb LJ and Miles LJ agreed, expressed sympathy for unrepresented parents who turn to AI for assistance. As Baker LJ observed, “It is entirely understandable that litigants in person should resort to artificial intelligence for help.” This recognition reflects the practical realities faced by many parents involved in family proceedings where professional legal assistance is unavailable. However, Baker LJ was equally clear that such sympathy does not diminish the responsibility borne by parties to ensure that all material placed before the court is accurate and reliable. At its most serious, reliance on AI-generated content containing ‘hallucinated’ or inaccurate legal authorities risks misleading the court and generating additional emotional and financial costs for all parties, as further judicial time must be spent scrutinising these submissions (Baker LJ, at para. 83).

Litigants in person and the post-LASPO landscape

From a policy perspective, this case illustrates the broader systemic pressures on the family justice system in England and Wales. Access to legal aid remains restricted, and many parents are unable to afford private representation. For parents facing unfamiliar legal terrain in disputes concerning their children, it is unsurprising that they seek assistance from accessible tools promising to simplify legal research and drafting.

The growing reliance on technology to bridge these gaps reflects the environment following the enactment of the Legal Aid, Sentencing and Punishment of Offenders Act 2012 (LASPO), which substantially curtailed the availability of publicly funded family law advice and representation. In response to this unmet need, digital tools are increasingly being used by litigants to facilitate access to justice, outside regulatory frameworks (Larkin A, ‘AI and Automation in Practice: Insights from Experience, Solutions to Client Pain Points and Adding “Value”’ (2025) 55 Family Law 1369).

Procedural responsibility and accuracy in family proceedings

However, judicial sympathy sits alongside a clear caveat: litigants in person cannot absolve themselves of the responsibility to present accurate material to the court simply because they are representing themselves. The procedural rules place responsibilities on all parties, legally trained or otherwise.

While AI tools can offer a measure of practical support for parties representing themselves, their use presents a real risk to procedural integrity where inaccurate or fabricated material is placed before the court.

Children’s welfare and the impact of AI-generated submissions

In family proceedings, these risks cannot be viewed in isolation from the welfare principle: delays and misdirected argument caused by flawed AI-generated documents can ultimately impact children’s welfare. Recent research demonstrates AI is no longer a peripheral concern in family law, but one that increasingly intersects with safeguarding and children’s wellbeing.

The challenge for the courts is therefore not whether AI should be outrightly prohibited, but how its use should be managed, ensuring that technological assistance does not undermine the child-focused administration of justice.

Judicial responsibility: AI use and vigilance

The risks associated with AI use are clearest where AI models produce plausible but fabricated authorities. These risks were acknowledged in the updated Judicial Guidance on Artificial Intelligence published in October 2025. The guidance warns judicial office holders of the phenomenon described as an “AI hallucination” and urges judges to remain vigilant when engaging with AI-generated material. The concerns raised in D (a child) reflect broader institutional efforts by the judiciary to address the implications of AI across the justice system: to understand common AI terminology, to identify AI-generated material, and to assess the risks it poses.

The future of AI regulation in the Family Courts

Although the guidance is directed at judicial office holders, it acknowledges the growing presence of AI-generated material within court proceedings. This raises further questions as to whether procedural reform or supplementary guidance will be required to address the use of AI by parties directly, particularly litigants in person.

As AI use becomes increasingly embedded in legal practice, the family courts must continue to develop proportionate responses that recognise both the realities of access to justice and the need to safeguard the reliability and fairness of the legal process.

Lisa Edmunds is Head of Chambers at Unit Chambers.