Judge rules that AI use must be disclosed in the courtroom

AI’s hallucinations have legal consequences.

The Future. After AI made up a number of cases in a lawyer’s recent federal filing, a Texas judge has instituted a rule requiring accountability when the tech is used for legal work. While the rule applies only to his courtroom, we wouldn’t be surprised if other judges across the country adopt similar guidelines.

Case fiction

  • Last week, lawyer Steven Schwartz used ChatGPT to compile legal research for a federal filing… but it turns out the AI completely fabricated six of the cases it cited.
  • On the heels of that goof, Texas federal judge Brantley Starr has issued the “Mandatory Certification Regarding Generative Artificial Intelligence.”
  • The requirement for his courtroom boils down to “generative AI can’t draft legal findings, but if it has been used, it needs to be disclosed and also checked by a human.”

As TechCrunch notes, AI experts have been pushing the tech as a useful tool for summarizing cases for legal use. With Judge Starr’s new framework, lawyers had better ensure they can explain everything in their filings… or face the consequences.

