ARTICLE
29 November 2023

AI Should Assist Trial Lawyers, Not Replace Their Judgment

Axinn Veltrop & Harkrider



This article discusses how artificial intelligence should not be relied upon to replace the judgment of a trial lawyer and how it can instead be used to assist diligent lawyers in considering alternative paths and sources of information to make the best possible decisions for clients.

In a commentary posted on Oct. 24, 2023, on Law.com, the author warned of the danger posed by artificial intelligence (AI) displacing the judgment and experience of the trial lawyer: "The introduction of AI into the core of a trial, where the stakes are high and justice hangs in the balance, can only be described as a danger to the legal profession." The trial in question was the criminal trial of the hip-hop artist Pras Michel, a member of the group Fugees. Following his conviction, Michel filed a motion for a new trial based on ineffective assistance of counsel.

According to the commentary, Michel's trial counsel relied on an AI-generated argument as a crucial part of his closing argument. Citing the analysis from Politico on the motion for a new trial, Michel's new counsel contended that trial counsel's closing made "frivolous claims, misunderstanding the requisite elements, muddling the schemes, and willfully ignoring critical vulnerabilities in the government's case."

Without doubt, this warning is well-taken—but the core question has to focus on how the AI was used. It is hard to imagine that any lawyer would blindly adopt an AI-generated closing. AI may be helpful as a tool, but a trial lawyer's closing is personal. At the end of the case, the trial lawyer is synthesizing the entirety of the admitted evidence and talking directly to the jurors.

The Michel trial judge is allowing an evidentiary hearing to determine whether the criminal defendant had been denied effective assistance of counsel, a very high standard to meet. But even if this AI argument were raised by an unsuccessful litigant in a civil case suing for malpractice, there would be multiple hurdles, including an inquiry into the platform used, the inputs, and the experience and diligence of the trial lawyer. Moreover, there would have to be some showing that the AI-generated portion of the closing was wrong or harmful to the defense, and that, but for its insertion, the malpractice plaintiff would have prevailed.

The threshold question would have to be whether the AI-generated portion was blindly plugged into the closing or was reviewed and edited by the trial lawyer exercising his or her knowledge of the facts and law applicable to the case. In many jurisdictions, an attorney-judgment rule protects trial lawyers from malpractice liability when they choose one of several reasonable courses of action, even if the course chosen was not the best one. In other words, was AI "replacing" the lawyer, or was the lawyer using his or her best judgment with some technological assistance?

To put this AI issue in context, partners regularly rely on associates to conduct computerized legal research based on word searches in LexisNexis. But before the litigation partner would put a case or a quote from a case into a brief, he or she would want to read the case to make sure that the case is on point, helps the claim or defense, and does not introduce an idea that harms the case.

An ABA panel recently came to the same conclusion, summarized in one article as follows: "Trusting a generative artificial intelligence program like ChatGPT to write legal briefs is like trusting a young law firm associate. Both need close supervision by more senior lawyers." (Lawyers must take responsibility for "supervising" AI, ABA News, Sept. 25, 2023.)

This point was made by Judge Castel in Mata v. Avianca, No. 1:22-cv-01461 (S.D.N.Y. June 22, 2023), in which the court imposed sanctions on the lawyers who submitted a brief with "fake cases." The decision, however, is not an indictment of the use of AI in litigation. Rather, the court expressly recognized that "if the matter had ended with respondents coming clean about their actions shortly after they received the defendant's March 15 brief questioning the existence of the cases, or after they reviewed the court's orders of April 11 and 12 requiring production of the cases, the record now would look quite different." The sanctions were imposed, not because of a mistake, but because the lawyers "continued to stand by the fake opinions after judicial orders called their existence into question."

As far as the broader issue of using AI as a tool, the Mata court noted that this was an acceptable use of a new technology to aid good lawyers: "In researching and drafting court submissions, good lawyers appropriately obtain assistance from junior lawyers, law students, contract lawyers, legal encyclopedias and databases such as Westlaw and LexisNexis. Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance."

AI should not be relied upon to replace the judgment of a trial lawyer, but it can be used to assist the diligent lawyer in considering alternative paths and sources of information to make the best decisions for his or her client. In this competitive world, clients demand—and deserve—efficiency from their lawyers, including the appropriate use of emerging technologies. AI is not a substitute for an experienced trial lawyer, but it should be one of the arrows in his or her quiver, to be employed effectively along with experience, skill, instinct, and judgment.

Previously published in the New York Law Journal

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
