New Question for Expert Witness: Who Drafted This Report, You Or Your Machine?

Zuckerman Spaeder LLP

A federal judge in Minnesota recently granted a motion to exclude an expert declaration explaining the dangers of AI deepfakes because the declaration itself contained AI-hallucinated citations.1 The case was a First Amendment challenge to a Minnesota statute prohibiting deepfakes with intent to influence elections, and the State tendered the expert’s declaration in defense of the statute. The judge noted “[t]he irony” in opposing AI-generated deepfakes with a declaration that contained AI-driven hallucinations, but she prudently rested her decision on the inability to trust an expert declaration submitted under penalty of perjury that was not adequately reviewed by the expert or the counsel who submitted it. The judge “add[ed her] voice to a growing chorus of courts around the country declaring the same message: verify AI-generated content in legal submissions!”2

That’s good advice as far as it goes,3 but a different issue emerges from a review of the expert’s supplemental declaration explaining in detail how his use of GPT-4o, a “generative AI tool referred to as a large language model,” led to the erroneous citations. That explanation revealed that portions of the report were drafted, at least in the first instance, by GPT-4o.4

The expert explained that he uses GPT-4o to generate research leads, summarize some relevant sources, and draft individual paragraphs of the declaration.5 After reviewing the research materials identified by GPT-4o and other techniques, he jots down bullet-point thoughts to include in the declaration. To draft the individual paragraphs, he feeds the substance of the bullet points to GPT-4o; e.g., “draft a short paragraph based on the following points: -deepfake videos are more likely to be believed, -they draw on multiple senses, - public figures depicted as doing/saying things they did not would exploit cognitive biases to believe video [cite].”6 It’s not clear whether AI’s contribution to the drafting was material to the final declaration, but let’s assume it was.
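For readers curious about the mechanics, the following is a minimal, hypothetical sketch of the kind of prompting workflow the supplemental declaration describes, written against the publicly available OpenAI Python client. The bullet-point text tracks the declaration’s example, but the code, variable names, and client configuration are illustrative assumptions, not material from the court record.

    # Hypothetical illustration only: a bare-bones version of the drafting
    # workflow described in the supplemental declaration, using the OpenAI
    # Python client. Variable names and client usage are assumptions; the
    # bullet points track the declaration's example prompt.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    bullet_points = [
        "deepfake videos are more likely to be believed",
        "they draw on multiple senses",
        # "[cite]" was intended as a note-to-self to add a citation later, but
        # a model can read it as a request to supply one -- and may invent it.
        "public figures depicted as doing/saying things they did not would "
        "exploit cognitive biases to believe video [cite]",
    ]

    prompt = "draft a short paragraph based on the following points: " + ", ".join(
        "-" + point for point in bullet_points
    )

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )

    # Any citation that appears in the draft must be independently verified
    # before it goes into a signed declaration.
    print(response.choices[0].message.content)

As footnote 6 explains, the stray “[cite]” placeholder, meant as a reminder to insert a citation later, was apparently what the model treated as an instruction to generate one.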

The Federal Rules of Civil Procedure generally require that a party designating an expert witness produce a written report that is “prepared and signed by the witness.”7 A frequent line of questioning in expert depositions focuses on whether testifying experts actually drafted their reports. Experts inevitably answer that they stand behind every word in the report, but often enough they acknowledge that staff members or counsel for the sponsoring party contributed substantively to the drafting. Although practices vary by jurisdiction, this line of questioning is often permitted in federal courts to explore the reliability of the expert’s opinion.8 The cases generally hold that an expert’s opinion and report will not be stricken so long as the expert “substantially participated” in preparing the report,9 with some decisions suggesting more is required.10 Although relatively few expert opinions are stricken altogether for “ghostwriting” transgressions, most courts seem to permit cross-examination of the expert with evidence that others participated in drafting the report, and they allow the finder-of-fact to consider that evidence in determining the weight to give the expert’s opinion.11

It is not yet clear whether courts will treat AI-generated expert reports the same way as those drafted in whole or in part by counsel. One could argue that AI-drafted or AI-assisted expert reports are still “prepared” by the expert, at least so long as the expert drafted the AI prompts and reviewed the outputs. Or one could argue that AI-drafted expert reports are even more problematic than lawyer-drafted reports because AI’s reasoning is opaque at best and fanciful at worst, as the hallucinated citations seem to reflect.

Courts and ethics authorities have been weighing whether to require disclosure of AI use by lawyers in preparing briefs and other filed papers. Quite a few judges (through standing orders) and some courts (through local rules) require disclosure of AI usage in preparing papers filed with the court.12 The proliferation of these disclosure requirements may have plateaued after the Court of Appeals for the Fifth Circuit declined to adopt a rule requiring disclosure. And ethics opinions have not reached consensus on whether disclosure is required as part of the duty of candor or some other rule of professional conduct, although all agree that lawyer verification of AI outputs is required.13

How a lawyer prepared a brief is typically not a fair topic for discovery. That’s not the case for testifying experts, who are required to prepare reports. And an expert’s report is supposed to reflect the true opinion of the expert, presumably formulated after personally reasoning through reliable evidence, whereas a lawyer’s argument does not necessarily reflect the lawyer’s true opinion. That distinction suggests disclosure of AI usage is more important for expert reports than for legal briefs and memoranda.

To be clear, it’s too early to predict how courts will treat the use of AI in drafting expert reports. At this point, however, lawyers need to consider whether to advise their own experts to avoid AI when drafting an expert report. It’s only a matter of time before someone tries to argue in closing that an expert’s testimony is unreliable because the expert’s opinion was dictated by a machine. And on the other side of the equation, lawyers should consider questioning an adverse expert on whether AI played a material role in preparing the expert’s report.

1 Kohls v. Ellison, No. 24-cv-3754, 2025 WL 66514 (D. Minn. Jan. 10, 2025).
2 Id. at *4; see also 2025 WL 66765 (D. Minn. Jan. 10, 2025) (denying motion for preliminary injunction on nonsubstantive grounds, without considering expert declaration).
3 Verification should mean not just confirming that the citations are real, but that the expert reviewed the material cited and made a genuine determination that the citation should be included.
4 See Kohls v. Ellison, D. Minn. Case No. 24-cv-03754-LMP-DLM, ECF No. 39 (Nov. 27, 2024) [hereafter, “Declaration”].
5 Declaration, supra n.4, at ¶¶ 8-9.
6 Declaration, supra n.4, at ¶ 11. The hallucination error evidently arose from including “[cite],” which was meant as a reminder to the professor to insert the correct citation, but which the AI program read as an instruction to generate a citation. Id. at ¶ 12.
7 Fed. R. Civ. P. 26(a)(2)(B).
8 See generally Fed. R. Civ. P. 26(b)(4)(C) (outlining work product protection for certain communications between party’s attorney and expert witness); Valley View Dev., Inc. v. United States, 721 F. Supp. 2d 1024, 1049 (D. Colo. 2010) (expert was sufficiently involved in drafting report to allow testimony, subject to cross-examination); Government Employees Ins. Co. v. Right Spinal Clinic, Inc., 608 F. Supp. 3d 1184 (M.D. Fla. 2022).
9 Government Employees, 608 F. Supp. 3d at 1187-88 (excluding expert’s testimony under substantial participation standard); Bekaert Corp. v. City of Dyersburg, 256 F.R.D. 573 (W.D. Tenn. 2009) (same); see also Fed. R. Civ. P. 26 Advisory Committee Notes (1993) (“Rule 26(a)(2)(B) does not preclude counsel from providing assistance to experts in preparing the reports, and indeed, with experts such as automobile mechanics, this assistance may be needed. Nevertheless, the report, which is intended to set forth the substance of the direct examination, should be written in a manner that reflects the testimony to be given by the witness and it must be signed by the witness.”); United States v. Kalymon, 541 F.3d 624, 638 (6th Cir. 2008) (citing Advisory Committee Note).
10 Government Employees, 608 F. Supp. 3d at 1188; see also Numatics, Inc. v. Balluff, Inc., 66 F. Supp. 3d 934, 944-45 (E.D. Mich. 2014) (criticizing counsel-drafted expert reports and ultimately barring expert’s testimony on that and other grounds).
11 See inMusic Brands, Inc. v. Roland Corp., No. 17-00010-MSM, 2023 WL 3944900, at *2 (D.R.I. June 12, 2023), report and recommendation adopted, 2024 WL 1341124 (D.R.I. Mar. 29, 2024); Wells v. BNSF Ry. Co., No. CV-21-97-GF-BMM, 2023 WL 7844690, at *4 (D. Mont. Nov. 15, 2023); Berkely*IEOR v. Teradata Ops., Inc., No. 17 C 7472, 2024 WL 5075416, at *5 (N.D. Ill. Jan. 25, 2024).
12 See, e.g., https://www.law360.com/pulse/ai-tracker (tracking orders and rules).
13 See, e.g., ABA Comm. on Ethics Formal Op. 512, Generative Artificial Intelligence Tools, at 10 (July 29, 2024); D.C. Comm. Ethics Op. 388, Attorneys’ Use of Generative Artificial Intelligence in Client Matters (April 2024).

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Zuckerman Spaeder LLP 2025
