In recent months we at Tees Law have observed an increasing number of clients attempting to rely on legal guidance produced by artificial intelligence (“AI”) tools such as ChatGPT, rather than on advice provided by qualified solicitors. While generative AI has its uses, there is a growing concern that misunderstanding its capabilities and limitations can have serious legal and practical consequences. Strikingly, even those within the profession have been caught out using AI when preparing for trial, resulting in fabricated and inaccurate citations of case law.
AI tools are not a substitute for professional legal advice. This article explains why, and why clients should be cautious before acting on AI-generated “advice”.
The nature of legal advice
Legal advice is not simply a statement of legal rules; it is the application of those rules to the specific circumstances of a client’s situation. Solicitors undertake careful fact-finding, consider the full context of a matter and the client’s objectives, and apply current law and procedure before arriving at their advice.
AI tools like ChatGPT produce responses by analysing patterns in data they were trained on, but they have no ability to verify, understand, or interpret your unique circumstances in the way a solicitor does. They lack the investigative judgment of a trained lawyer, and they cannot tailor advice to the nuances of your case.
Outdated or inaccurate information
One of the most fundamental limitations of AI tools is that their underlying training data is static and may not reflect the most recent legal developments. Laws, regulations and case law change frequently, and this is a key reason why relying on AI can be dangerous.
Unlike solicitors who are regulated and required to maintain up-to-date knowledge, AI models may provide information based on law that is now obsolete. In legal matters where accuracy is paramount, even slight errors can lead to missed deadlines, incorrect filings, or flawed strategic decisions.
Lack of professional accountability
Solicitors operate under strict professional and ethical obligations. We are regulated by professional bodies, bound by codes of conduct, and protected by professional indemnity insurance.
In contrast, AI tools carry no professional accountability. There is no solicitor-client privilege, no professional indemnity, and no regulator you can turn to if the advice is wrong or harmful. Relying on unregulated outputs erodes the safeguards that the legal system is designed to provide.
Confidentiality and data risks
Confidentiality is a cornerstone of legal representation. Solicitors are legally obliged to protect client information, and communications are generally afforded legal privilege.
AI interfaces are not subject to these protections. What you type into an AI tool is not covered by legal privilege, may be stored or processed by third parties, and could, in some circumstances, be disclosed or used in ways you did not intend. This risks exposing sensitive facts or strategies relating to your matter that were intended to remain confidential.
AI hallucinations and “false confidence”
A particularly insidious risk with generative AI is the phenomenon known as “hallucination”, where the model produces plausible-sounding but completely fabricated or incorrect information, such as non-existent case law or misquoted statutory provisions. Because AI outputs are often presented confidently and clearly, clients may be misled into placing unwarranted trust in them.
Chowdhury Rahman, a barrister, was named by The Guardian as one of those found to have used AI to assist in preparing a case before the Upper Tribunal. As a result, Mr Rahman cited case law that either did not exist or was not relevant to the facts of the case. This is a prime example of the risk posed by these “hallucinations”.
Strategic and practical limitations
Even where AI can summarise legal concepts or explain terminology, it cannot perform the practical functions of a lawyer. AI cannot:
- gather evidence and interview witnesses;
- represent you in court or tribunals;
- negotiate with opposing parties;
- file documents with legal authorities; or
- adapt strategy to real-time developments in your case.
These tasks require professional judgment, experience and an understanding of procedural and tactical implications that AI simply does not have.
So, can you use AI for legal advice?
AI tools have their place when used responsibly, for example, to help with preliminary research, to understand general legal concepts, or to prepare questions for your solicitor. However, they should never be relied upon as a substitute for qualified legal advice when it comes to making legal decisions or acting in legal proceedings.
Expert legal and financial advice from Tees
Tees’ expert financial and wealth advisory team work hand in hand with our legal advisers to ensure a joined-up approach to achieving your desired outcomes. Get in touch with us today, and we will help you understand the full picture.

