A New York attorney is facing a court hearing after his firm used ChatGPT, an AI tool, for legal research. After a court filing was found to reference legal cases that did not exist, the judge called the situation "unprecedented" and indicated the court was dealing with it. The attorney who used the tool told the court he was "unaware that its content could be false".
ChatGPT generates original text on request, but comes with a warning that it can "produce inaccurate information". In the original case, a man sued an airline over an alleged personal injury. His legal team filed a brief that cited several earlier court cases in an attempt to show, through precedent, why the case should proceed.
However, the airline's attorneys later wrote to the judge to say they could not locate several of the cases cited in the brief. In an order requiring the man's legal team to respond, Judge Castel wrote that "six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations".
Over the course of several filings, it emerged that the research had not been done by the plaintiff's attorney, Peter LoDuca, but by a colleague at the same law firm. Steven A. Schwartz, a lawyer with more than 30 years of experience, had used ChatGPT to look for previous cases similar to the one at hand.
Mr. Schwartz made clear in his written statement that Mr. LoDuca was not involved in the research and had no knowledge of how it was carried out. Mr. Schwartz added that he "greatly regrets" having relied on the chatbot for his legal research, saying he was "unaware that its content could be false" and had never used one before.
He has vowed never again to "supplement" his legal research with AI "without absolute verification of its authenticity". Screenshots attached to the filing appear to capture a dialogue between Mr. Schwartz and ChatGPT.
One message asks, "Is Varghese a real case?", referring to Varghese v. China Southern Airlines Co Ltd, one of the cases that no other lawyer was able to locate. After ChatGPT replies that it is, "S" asks: "What is your source".
After "double checking", ChatGPT replies that the case is real and can be found on legal reference databases such as LexisNexis and Westlaw. It maintains that the other cases it provided to Mr. Schwartz are also genuine. Both lawyers, who work for the firm Levidow, Levidow & Oberman, have been ordered to explain their conduct at a hearing on 8 June.
Since its launch in November 2022, ChatGPT has been used by millions of people. It can answer questions in natural, human-sounding language and mimic a variety of writing styles, drawing on the internet as it existed in 2021 as its database. Concerns have been raised about the potential risks of artificial intelligence (AI), including the spread of misinformation and bias.