Law firm's attempt to validate fees with ChatGPT backfires dramatically. This serves as a warning for legal practitioners integrating technology into their practice.


Using ChatGPT to Justify Fees

In a recent legal debacle, New York-based firm Cuddy Law drew a sharp rebuke from a federal judge after citing OpenAI's ChatGPT to justify its fees.

The firm had represented the plaintiff in a lawsuit against the New York City Department of Education (DOE), alleging that the department failed to provide the plaintiff's child with a free appropriate public education, and it won the case.

The child had previously been diagnosed with attention deficit hyperactivity disorder (ADHD), language disorder, developmental coordination disorder, and acute stress disorder.

Despite ChatGPT-4's well-documented propensity for errors, Cuddy Law cited the chatbot to support the fees it sought after prevailing. Judge Paul Engelmayer was not persuaded: he not only rejected a large portion of the requested fees but also criticized the firm for relying on ChatGPT at all.

Judge Engelmayer warned that it is risky to treat ChatGPT as an authority because the tool cannot distinguish real information from fabricated information, and he pointed to other cases in which lawyers got into trouble after submitting court filings built on material the chatbot had invented.

The episode highlights how risky it can be to lean on AI tools like ChatGPT in legal work. Even high-profile figures such as Michael Cohen, Donald Trump's former lawyer, have been caught passing along AI-generated citations to cases that never existed.

Law Firm's Defense

Cuddy Law defended its use of ChatGPT, arguing that the chatbot served only as a secondary source to cross-check its fees, consistent with typical legal billing practice in which the losing party covers these costs.

Firm representative Benjamin Kopp told The Register that the firm's use of AI was fundamentally different from fabricating court documents; ChatGPT, he said, served only as an additional reference point.

Despite this, Judge Engelmayer remained skeptical, noting the law firm's failure to disclose the specifics of ChatGPT's fee assessment or whether any data used was artificially generated.

The judge found Cuddy Law's request for $113,484.62 in fees unacceptable and advised the firm to refrain from relying on the AI tool in future fee applications.

In his 34-page ruling, the judge rejected the notion that ChatGPT's conclusions could serve as a reliable indicator of appropriate billing rates, particularly in niche areas of law.

He also questioned the transparency of Cuddy Law's fee submission, noting that the firm did not explain what inputs it fed to ChatGPT or whether the data involved was synthetic.

These concerns underscore broader issues surrounding the accountability and accuracy of AI-generated information within legal proceedings.


Cuddy Law ultimately saw its award cut sharply, receiving $53,050.13, less than half of its initial request.

This case serves as a warning for legal practitioners seeking to integrate technology into their work. While AI tools like ChatGPT hold promise for legal research and analysis, the dangers of relying on them without proper validation are clear.

Judge Engelmayer's firm stance in rejecting ChatGPT's findings underscores the importance of transparency, thoroughness, and responsible utilization of AI in the legal profession. 

