The popular AI chatbot was blamed for the lawyers’ submission of fictitious legal research in one of the most bizarre uses of the tech so far.
Judge P Kevin Castel said they acted in bad faith. But he credited their apologies and remedial steps taken in explaining why harsher sanctions were not necessary to ensure they or others won’t again let artificial intelligence tools prompt them to produce fake legal history in their arguments.
“Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance,” Castel wrote.
“But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings”.
The judge said the lawyers and their firm, Levidow, Levidow & Oberman, PC, “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question”.
In a statement, the law firm said it would comply with Castel’s order, but added: “We respectfully disagree with the finding that anyone at our firm acted in bad faith.
“We have already apologized to the Court and our client. We continue to believe that in the face of what even the Court acknowledged was an unprecedented situation, we made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth”.
The firm said it was considering whether to appeal.
‘Gibberish’ and ‘nonsensical’
Castel said the bad faith lay in the attorneys' failure to respond properly to the judge and their legal adversaries after it emerged that six legal cases cited in support of their March 1 written arguments did not exist.
The judge cited “shifting and contradictory explanations” offered by lawyer Steven A Schwartz. He said lawyer Peter LoDuca lied about being on holiday and was dishonest about confirming the truth of statements submitted to Castel.
At a hearing earlier this month, Schwartz said he used the artificial intelligence-powered chatbot to help him find legal precedents supporting a client’s case against the Colombian airline Avianca for an injury incurred on a 2019 flight.
Microsoft has invested some $1 billion (€918.5 million) in OpenAI, the company behind ChatGPT.
The chatbot, which generates essay-like answers to prompts from users, suggested several cases involving aviation mishaps that Schwartz hadn’t been able to find through usual methods used at his law firm.
Several of those cases weren't real, misidentified judges, or involved airlines that didn't exist.
The judge said the fake decisions generated by the chatbot "have some traits that are superficially consistent with actual judicial decisions," but other portions contained "gibberish" and were "nonsensical".
In a separate written opinion, the judge tossed out the underlying aviation claim, saying the statute of limitations had expired.
Lawyers for Schwartz and LoDuca did not immediately respond to a request from the Associated Press for comment.