New York Attorney Criticized for Using ChatGPT in Lawsuit, 'Bogus' Claims Made

Key Takeaways
  • New York attorney criticized for using AI model, ChatGPT, in Avianca Airlines lawsuit
  • Judge finds fake quotes and citations from ChatGPT in case documentation
  • Incident raises concerns about AI integration, human oversight, and due diligence

New York Attorney Faces Criticism for Using AI Language Model in Lawsuit Against Avianca Airlines

A New York attorney, Steven Schwartz, is facing criticism for using ChatGPT, an AI language model, for legal research in a lawsuit against Avianca Airlines. The case involves a passenger, Robert Mata, who claims to have been injured by a serving cart during a flight in 2019. However, inconsistencies and factual errors in the case documentation caught the attention of the judge presiding over the case.

Schwartz has admitted to using ChatGPT for his legal research, stating that it was his first time using the AI tool and that he was unaware it could produce false content. In an affidavit, he expressed regret for relying on the AI model without verifying the authenticity of its citations.

ChatGPT Submissions Raise Concerns Over Reliability and Human Oversight

The judge described six of the submitted cases as "bogus judicial decisions" containing fake quotes and citations. Some of the referenced cases turned out not to exist, and one filing contained mixed-up docket numbers. The findings raised concerns about the reliability of ChatGPT's output and the need for human oversight and verification.

The incident has sparked a broader discussion about the integration of AI tools, particularly ChatGPT, across industries. While ChatGPT's capabilities are advancing rapidly, doubts remain about its ability to fully replace human workers. Syed Ghazanfer, a blockchain developer, expressed support for ChatGPT but noted that it lacks the communication skills needed to fully understand and meet complex requirements.

As the case unfolds, the incident serves as a cautionary tale highlighting the importance of human involvement and due diligence when utilizing AI tools like ChatGPT in professional settings. 

