Indian High Court judge Anoop Chitkara has ruled on thousands of cases. But when he refused bail to a man accused of assault and murder, he turned to ChatGPT to help justify his reasoning.
He is among a growing number of judges using artificial intelligence (AI) chatbots to assist them in rulings. Supporters say the technology can streamline court processes; critics warn it risks bias and injustice.
"AI cannot replace a judge... However, it has immense potential as an aid in judicial processes," said Chitkara.
"The knowledge revolution has started, and these AI platforms have in certain situations demonstrated their capabilities to instantaneously transform queries into outstanding results."
Chatbots like ChatGPT and Google's Bard are software applications designed to mimic human conversation in response to users' questions.
Chitkara said he did not rely on ChatGPT to help decide his ruling in the 2020 case at the Punjab and Haryana High Court.
However, he wondered if he was relying too heavily on his "consistent view" that allegations involving an unusually high level of cruelty should count against granting bail and asked ChatGPT to summarise case law on the issue.
The use of AI in the criminal justice system is growing quickly worldwide, from the popular DoNotPay chatbot lawyer mobile app to robot judges in Estonia adjudicating small claims and AI judges in Chinese courts.
In the Colombian Caribbean city of Cartagena, judge Juan Manuel Padilla also turned to ChatGPT for help in a lawsuit in which an autistic boy's parents were suing his healthcare provider for treatment costs and expenses.
He asked the chatbot several legal questions, such as whether an autistic child is exempt from fees for therapy, and included the details in his ruling, which sided with the child.
But chatbots' reliability is questionable, said several legal and tech experts.
"Some judges are trying to find a way to make the job faster — but they don't always know the limits or risks," said Juan David Gutierrez, professor of public policy and data at Universidad del Rosario, Bogota.
"ChatGPT can make up laws and rulings that don't exist. In my view, it shouldn't be used for anything important."
There have been numerous examples of chatbots getting information wrong or making up plausible but incorrect answers — which have been dubbed "hallucinations" — such as inventing fictional articles and academic papers.
Better technology promises a way to alleviate the huge backlogs clogging some legal systems.
India had more than 40 million cases pending in lower courts last year, while Brazil had 26 million new lawsuits filed in 2020 alone — more than 6,000 per judge.
But AI risks over-simplifying complex problems and could raise unrealistic expectations of tech's capabilities, wrote Dona Mathew and Urvashi Aneja from the research collective Digital Futures Lab in a recent report.
There are also concerns over privacy violations and exploitation of judicial data for profit.
"With biased and incomplete datasets, no legal remedies and accountability safeguards... These changes can lead to systematic harms like threats to judicial independence and stagnation of legal principles," they wrote.
Raquel Guerrero, a lawyer for three journalists in Bolivia accused of posting photos of a victim of violence without her permission, expressed concerns when the court consulted ChatGPT during an online hearing in April.
Guerrero said the complainant gave permission for the photos to be shared online, but later denied she had done so.
Constitutional judges asked ChatGPT about any "legitimate public interest" for journalists posting online photos of a "woman showing parts of her body" without her consent.
ChatGPT answered it was a "violation of the person's privacy and dignity". The judges ordered the photos to be removed from social media.
The court record said ChatGPT did not replace decisions made by jurists, but that it could be used as additional support to be able to "clarify certain concepts".
But, Guerrero said, the chatbot's use in the hearing was "arbitrary" and a "disaster".
"It can't be used as if it's a calculator that takes away the obligation of judges to use reason and to apply justice and to apply it correctly," Guerrero said, adding that she was considering filing a complaint against the judges for using the chatbot.
The writers are from the Reuters news agency