US Chief Justice Urges ‘Caution And Humility’ With AI


US chief justice John G Roberts Jr focuses on benefits and risks of AI in the legal system, says human judges will not disappear soon

US Supreme Court Chief Justice John G Roberts Jr on Sunday devoted his annual year-end report to the ambiguous role artificial intelligence (AI) may play in the legal system, urging “caution and humility” in its use.

The report comes days after court papers unsealed last week revealed that Michael Cohen, Donald Trump’s former lawyer, had mistakenly provided false case citations generated by Google’s Bard generative AI that made their way into an official court filing.

Referring to such episodes, Roberts wrote that “any use of AI requires caution and humility”.

He wrote: “One of AI’s prominent applications made headlines this year for a shortcoming known as ‘hallucination,’ which caused the lawyers using the application to submit briefs with citations to nonexistent cases. (Always a bad idea.)”


‘Caution and humility’

In another, similar incident that was widely reported, a New York lawyer faced disciplinary action after mistakenly submitting a brief that contained references to nonexistent cases that had been fabricated by OpenAI’s ChatGPT.

Roberts noted that AI systems can “earn B’s on law school assignments and even pass the bar exam” and that “legal research may soon be unimaginable without it”.

He said AI had “great potential” to increase access to key information for lawyers and non-specialists alike, but also risks “invading privacy interests and dehumanising the law”.

Mentioning bankruptcy forms, he said some applications could streamline legal filings and save money, smoothing out “any mismatch between available resources and urgent needs in our court system”.

‘Fairness gap’

But he said that human judgements on matters such as flight risk, recidivism and other discretionary decisions are perceived by the public to be fairer than those made by AI.

This “human-AI fairness gap” reflects the view “that human adjudications, for all of their flaws, are fairer than whatever the machine spits out”, he wrote.

Legal determinations require an assessment of factors such as the sincerity of a defendant’s allocution at a sentencing, he wrote.

“Nuance matters: Much can turn on a shaking hand, a quivering voice, a change of inflection, a bead of sweat, a moment’s hesitation, a fleeting break in eye contact. And most people still trust humans more than machines to perceive and draw the right inferences from these clues,” he wrote, adding that appellate judges would “not soon be supplanted”.

Nuances and gray areas

“Many appellate decisions turn on whether a lower court has abused its discretion, a standard that by its nature involves fact-specific gray areas,” Roberts wrote, while others focus on “open questions” about how the law should develop in new areas.

By contrast, AI is based “largely on existing information”, which he said can “inform but not make such decisions”.