A federal court in Utah dismissed a long-running False Claims Act suit against our clients—a leading anesthesiology practice and four of its physicians—following the government’s decision to exercise its rarely used authority to seek dismissal of claims asserted by a qui tam relator.
Over several years of litigation, our team built a strong evidentiary record showing that the relator's claims lacked merit, including by securing testimony from a government agency that contradicted the relator's allegations that our clients improperly billed Medicare and Medicaid for anesthesia services. Then, in July 2025, our team uncovered AI-generated hallucinations in a report submitted by one of the relator's experts, including fabricated government testimony, fake citations, and made-up industry publications, and we obtained testimony from the expert acknowledging his use of ChatGPT. After we brought these issues to the court's attention, the relator withdrew the expert. Our team then moved for sanctions, seeking disqualification of both the relator and his counsel.
While the motion for sanctions was pending, the Department of Justice moved to intervene in the case to seek dismissal, explaining to the court that continued litigation was “not in the government’s interests.” The court granted the government’s motion, resulting in a complete victory for our clients.
The case was featured in the Law360 article "FCA Suit Tainted By Expert's AI 'Hallucination' Gets Dismissed," published on September 30 (subscription required), and in the October 3 installment of Law360's Legal Lions Of The Week.