Artificial Intelligence (“AI”) made legal and mainstream news in 2023. In a highly publicized and widely discussed case, Mata v. Avianca, Inc., the United States District Court for the Southern District of New York sanctioned attorneys for citing non-existent, fake cases generated by OpenAI’s ChatGPT. Despite Mata’s stark warning to the bar, AI-generated fake caselaw continues to appear in litigation nationwide.
In Matter of Samuel, the Kings County Surrogate’s Court confronted a lawyer’s careless use of AI in a contested probate proceeding. The objectant’s counsel submitted “fake caselaw resulting from Artificial Intelligence hallucinations” in reply papers on a summary judgment motion. Five of the six cases cited in the objectant’s reply papers were either erroneous or non-existent. The court held that counsel engaged in “frivolous” conduct under 22 NYCRR 130-1.1 by making material misstatements of caselaw to the Court.
Surrogate Graham was careful to point out that AI is not, in and of itself, the problem. While the court was “dubious” about attorneys using AI to prepare legal documents, it focused squarely on counsel’s failure to examine and scrutinize the ostensible authorities that AI cited in support of the objectant’s arguments. The court found that counsel had sufficient time to review and analyze the AI-generated reply papers and to conduct a simple cite check on reliable legal search engines, which would have revealed AI’s reliance on non-existent, fake caselaw. Counsel’s conduct, not AI, was the real problem.