Jason Boleman | November 3, 2025
As the artificial intelligence age continues to blossom and lawyers find new ways to adapt the technology to their practices, concerns persist about its possible pitfalls.
It is a situation that whistleblower attorney Thad Guyer is well aware of.
In 2024, Guyer faced possible sanctions after filings in Iovino v. Michael Stapleton Associates Ltd. contained case citations and quotations that neither the court nor the opposing party could locate.
U.S. District Court Judge Thomas T. Cullen wrote at the time that the filings represented “ChatGPT run amok.”
“This was not a situation of hallucinated cases,” Guyer said. “That’s the death knell for a lawyer. Both of the cases that were raised in my case were real cases, but they were miscited and misquoted.”
Guyer explained to Virginia Lawyers Weekly that the problem arose after the paid cite checker he used produced a misquotation that he did not catch in time.
“I didn’t check the quotations, where the quotation marks were put in the cases, and that caused the problems I ended up having,” Guyer said.
Ultimately, Guyer was not sanctioned but received a private reprimand from the Virginia State Bar for the misquoted cases.
“I think people should know that just because you get past the Rule 11 sanctions doesn’t mean that it’s over for you, and that if you want the system to treat you with good faith, you have to very quickly, right from the outset, admit you made a mistake,” Guyer said.
That “ChatGPT run amok” case was among the first examples in Virginia case law of an attorney catching heat for erroneous filings created by artificial intelligence.
Sometimes, such filings are “AI hallucinations,” a growing phenomenon where litigants file documents citing court cases that do not exist. As the use of AI continues to grow and the tools become more ubiquitous in American life, Virginia attorneys say it is crucial to avoid running afoul of ethical rules by filing inaccurate documents.
“At its core, generative AI is a very educated guess on letter and word order,” said Annandale attorney Reid Trautz, who chairs the Virginia State Bar Special Committee on Technology and the Future Practice of Law. “Sometimes it comes back with an answer that’s not real and not based on anything.”
Richmond attorney Cullen Seltzer, who presents CLE on AI and has spoken to college classes about AI and the practice of law, said AI is among the most transformational technologies he has seen in his career.
“I don’t think there’s been any technological innovation in my more than 30 years of practice that has unfolded over three years and been so transformative of our society writ large and our industry writ small,” he said.
But by the same token, Seltzer said, “we’ve all seen the proliferation of misuse of the technology, and lawyers being very embarrassed at having been caught out citing hallucinated cases.”
The problem of AI hallucinations was brought under the microscope in late October when two federal judges faced inquiries from the Senate Judiciary Committee after using AI to draft inaccurate court documents, including fictional quotes and litigants.
“The judges signed the decisions, and therefore those judges are certifying that these cases are real,” Guyer said. “But the reality is, no federal judge has the time to be able to go through it. So, everybody is going to rely on somebody else to do these validations, and that’s where this whole system is going to break down.”
Seltzer noted that by nature, AI tools can create products that appear perfect on their first attempt.
“The real risk with AI is that it will generate what kind of looks like a finished product in the first iteration,” he said. “We’re all busy, you can very easily find yourself pressed for time, and the robot has given you about 10 pages that look pretty good.”
Trautz advised thinking of generative AI in the same way an attorney may think of the work done by a young law clerk or assistant.
“We have to check behind on their work. We can’t trust them to know everything,” Trautz said.
Many of the examples in Virginia case law of AI hallucinations have involved pro se litigants who use the technology to help themselves generate legal filings.
That situation creates a pitfall for those litigants, as well as for clients who use the technology, because, as lawyers caution, they lack the specialized knowledge to recognize an AI hallucination or inaccuracy.
Through his work on the special committee, Trautz said the bar “may have a role along with the judiciary to try to warn clients away from doing this.”
“Otherwise, they are going to be filing briefs and documents that are just going to confound the court and clog the system for all of us,” he said. “So, we may have a role in educating consumers about the proper use of AI, and generally that’s not [using] AI alone, it’s AI in conjunction with a lawyer.”
“The public sometimes thinks that these large language models can replace a lawyer,” Trautz said. “But they’re getting into trouble when they’re filing pro se because they’re not checking the case citations.”
Seltzer said that while pro se litigants may be more vulnerable to AI hallucinations, the problem is not unique to them.
“It’s the same way that you can show me architectural renderings of a building that might look all fancy, and I think that looks pretty good, but a trained engineer would look at it and say ‘well, that’s not a building,’” he said.
The issue feeds a broader conversation about the future of the profession, with some concerned that what AI can do in the future may render parts of the practice of law obsolete.
Despite the sea change affecting the profession, Trautz expressed confidence that the need for qualified legal professionals will persist.
“The quote I hear most often is that AI is not going to replace lawyers, but lawyers who use AI will replace lawyers who don’t, and I think that’s the direction we are headed,” he said.
As part of his private reprimand, Guyer was required to attend two hours of AI ethics classes that he said got him so interested in the new technology that he listened to more than 20 hours of CLE on the topic.
“I’m fixated on them now, and I’m putting together lesson plans for young interns that I have,” he said.
Guyer said the technology is “unbelievable in coming up with legal arguments” and has sought to integrate it even more into his practice.
“I told Judge Cullen at the time of my ‘ChatGPT run amok’ that AI was generating 75% of my work product, but I told him my goal was to get to 90% of my work product, and that’s where I am now,” Guyer said.
He said he comes up with the arguments and context, but that ultimately AI types 90% of the written copy he produces.
“AI is not going to take your job. Lawyers who use AI are going to take your job,” Guyer said. “This is the motto throughout, and it’s true.”
This embrace of AI mirrors the profession at large, which is seeing consumer demand for AI proficiency rise as well.
“There will be no Fortune 500 company out there whose general counsel is going to give their work to a law firm that doesn’t use AI, because the law firm that doesn’t do AI is not financially competitive,” Guyer said.
Trautz said the special committee is studying how the practice of law will evolve, adding that AI is “a technological force that I have not seen in my 38 years of being a lawyer.”
“This is the biggest, strongest technological revolution that I have seen, and it is and will continue to impact the legal profession and the courts as well,” he said.
Referencing proposed Legal Ethics Opinion 1901, which addresses the reasonableness of a fee in conjunction with time spent on a matter, Seltzer said the profession will have to adapt as it embarks on the new AI era.
“The world is starting to expect, very quickly, to realize the benefits and promises of this tech,” he said. “And so there will be extrinsic pressure on lawyers to give the market what it wants.”