A New York lawyer has been ordered to explain why he should not be disciplined after using ChatGPT for legal research.
The original case involved a man suing an airline over an alleged personal injury. His legal team, made up of lawyers from the firm Levidow, Levidow & Oberman, submitted a court document on his behalf listing previous legal cases intended to show, by way of precedent, why his case should be allowed to proceed.
However, upon closer inspection, the airline’s lawyers said they could not find several of the cases cited in the brief.
“Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations,” Judge Castel wrote in an order demanding the man’s legal team explain itself.
Despite ChatGPT displaying a disclaimer that it can produce inaccurate information, the lawyer who used the tool told the court he was “unaware that its content could be false”.
Mr Schwartz, a lawyer at the firm who prepared the document, said that he “greatly regrets” using the chatbot, which he had never used before.
He has vowed never to use AI to “supplement” his legal research in future “without absolute verification of its authenticity”.
The case serves as a warning that although AI tools like ChatGPT are useful for gathering information, they should be used with caution, especially in professional settings.
While ChatGPT can provide convincing, human-like responses, its training data only extends to 2021, so some of its information may be outdated, and, as this case demonstrates, it can also generate material that is entirely fabricated.