Judge bans ChatGPT from courtroom after lawyer's mishap

The attorney used ChatGPT to "supplement" his legal briefing, but the popular AI tool provided him with several cases that were completely made up.

After an attorney who used artificial intelligence presented false information in federal court last week, a federal judge is taking a stance he hopes will prevent it from happening again.

Attorney Steven Schwartz used ChatGPT to "supplement" his legal briefing, but the popular AI tool provided him with several cases that were completely made up. Schwartz apologized, saying he "greatly regrets" the mishap, but U.S. District Judge Brantley Starr is taking steps to keep the same thing from happening in his courtroom.

The judge said any lawyer presenting a case before him at the U.S. District Court for the Northern District of Texas must confirm that no part of their filing was drafted by generative AI, such as ChatGPT. If it was, the party must confirm it was fact-checked "by a human being."

"These platforms in their current states are prone to hallucinations and bias," the judge's order reads. "On hallucinations, they make stuff up—even quotes and citations. Another issue is reliability or bias. While attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients, generative artificial intelligence is the product of programming devised by humans who did not have to swear such an oath."


Starr acknowledged that there are appropriate uses for AI in court, such as form divorces, discovery requests or finding errors in documents, but said legal briefing is not one of them because these tools can produce fabricated quotes and citations, making them unreliable.

"Unbound by any sense of duty, honor, or justice, such programs act according to computer code rather than conviction, based on programming rather than principle," the order reads. "Any party believing a platform has the requisite accuracy and reliability for legal briefing may move for leave and explain why." 

The judge added that any party wanting to use AI in a legal briefing must now file a certificate on the docket attesting that the material was checked for accuracy by a human "using print reporters or traditional legal databases."

