Federal Rule of Evidence 901 governs the authentication of evidence in court. Per the rule, “[t]o satisfy the requirement of authenticating or identifying an item of evidence, the proponent must produce evidence sufficient to support a finding that the item is what the proponent claims it is.” Historically, this requirement could be satisfied, for example, through the testimony of a witness with knowledge, comparison with an authenticated item by an expert witness or trier of fact, or identification of distinctive characteristics. The advent of deepfakes, however, has generated debate over whether additional safeguards are needed to protect the authenticity of evidence.

On May 2, 2025, the U.S. Judicial Conference’s Advisory Committee on Evidence Rules considered proposals to amend the Federal Rules of Evidence to address the challenges posed by AI-generated evidence (see our prior post regarding the Committee’s proposed Rule 707 – Machine-Generated Evidence). Besides Rule 707, the Committee evaluated Rule 901(c), a new draft amendment that addresses deepfakes, i.e., altered or wholly fabricated AI-generated images, audio, or video that are difficult to discern from reality.

While recognizing the importance of detecting deepfakes to preserve the integrity of the judicial system, the Committee ultimately decided that a rule amendment was not necessary at this time, given the courts’ existing methods for evaluating authenticity and the limited instances of deepfakes in the courtroom to date. Nonetheless, as a precaution, the Committee proposed Rule 901(c) for future consideration should circumstances change.

Rule 901. Authenticating or Identifying Evidence

* * * * *

(c) Potentially Fabricated Evidence Created by Artificial Intelligence.

In the notes section to the draft amendment, the Committee explained its rationale for Rule 901(c)’s two-step process for evaluating deepfake claims. First, the opponent must submit sufficient information for a reasonable person to infer that the proffered evidence was fabricated. A mere assertion that the evidence is a deepfake is not sufficient. Provided that the opponent meets their burden, the Committee explained that the proponent “must prove authenticity under a higher evidentiary standard than the prima facie standard ordinarily applied under Rule 901.”

The Committee acknowledged that Rule 901(c) does not specifically combat another possible consequence of deepfakes: the risk that the mere possibility of deepfakes leads juries to distrust genuine evidence. The Committee, however, cited Rule 403 (the “prejudice rule”) and judges’ role as gatekeepers to curb attorney assertions that “you cannot believe anything that you see.”

Clearly, deepfakes present significant challenges in the courtroom and risk eroding public confidence in our judicial system. LMS will continue to monitor this evolving topic, the tools used by judges to verify evidence authenticity, and any associated amendments to the rules of evidence.
