Judge rules that AI needs to be disclosed in the courtroom

AI’s hallucinations have legal consequences.

The Future. After AI invented several cases cited in a lawyer’s recent federal filing, a Texas judge has instituted a framework that holds lawyers accountable when they use the tech for legal work. While the rule only applies to his courtroom, we wouldn’t be surprised if other judges across the country adopt similar guidelines.

Case fiction

  • Last week, lawyer Steven Schwartz used ChatGPT to compile legal research for a federal filing… but it turns out the AI totally fabricated six cases that it cited.
  • On the heels of that goof, Texas federal judge Brantley Starr has issued the “Mandatory Certification Regarding Generative Artificial Intelligence.”
  • The requirement for his courtroom boils down to this: filings can’t be drafted by generative AI unless its use is disclosed and a human has checked the output for accuracy.

As TechCrunch notes, AI experts have been pushing the tech as a useful tool for summarizing cases in legal work. With Judge Starr’s new framework, lawyers had better make sure they can vouch for everything in their filings… or face the consequences.

David Vendrell

Born and raised a stone’s throw away from the Everglades, David left the Florida swamp for the California desert. Over-caffeinated, he stares at his computer too long, writing either the TFP newsletter or screenplays. He is repped by Anonymous Content.

TOGETHER WITH CANVA

No design skills needed! 🪄✨

Canva Pro is the design software that makes design simple, convenient, and reliable. Create what you need in no time! It’s jam-packed with time-saving tools that make anyone look like a professional designer.

Create amazing content quickly with Canva