AI evidence in jury trials: Authenticity, admissibility, and roles of the court and juries
As artificial intelligence (AI) technologies become more prevalent, state courts are likely to encounter AI-generated evidence, such as deepfakes, reconstructed videos, and synthetic documents. In this webinar, the TRI/NCSC AI Policy Consortium explores the evidentiary challenges posed by both acknowledged and unacknowledged AI-generated evidence in jury trials. Panelists examine existing legal frameworks for authentication and admissibility and provide practical guidance for judges and attorneys. Special attention is paid to the unique responsibilities of courts and juries in evaluating the authenticity and reliability of AI-generated evidence, as well as the potential need for updated jury instructions and judicial guidance.
Learning objectives:
1. Identify the distinctions between acknowledged (disclosed) and unacknowledged (potentially deceptive) AI-generated evidence and recognize examples (e.g., deepfake audio/video, synthetic documents) that may be presented to a jury.
2. Understand the existing rules of evidence regarding authentication (e.g., FRE 901, 902, 104) and explain how these rules apply to AI-generated evidence, including the role of extrinsic evidence and the allocation of responsibility between judges and juries.
3. Assess the challenges and risks AI-generated evidence presents to juries and explain how its presence in a case may affect jury instructions.
Moderator:
- Megan Carpenter, dean and professor of law, University of New Hampshire Franklin Pierce School of Law
Panelists:
- Maura R. Grossman, research professor and eDiscovery lawyer/consultant/expert/special master
- Jawwaad Johnson, director of the Center for Jury Studies and principal court management consultant, NCSC
- Judge Erica Yew, Santa Clara County (Calif.) Superior Court
Explore more
AI in criminal cases: Courts' role in preserving constitutional rights
Learn how AI is being used in the criminal justice system and what guardrails are necessary to preserve Fourth Amendment rights, due process, and equal protection. Discover practical approaches for courts to authenticate AI evidence, assess its reliability, and ensure that technological capabilities do not erode fundamental rights.
AI tools, self-represented litigants & the future of access to justice
Learn how courts and legal aid organizations are addressing critical questions about how self-represented litigants can use AI tools in constructive ways.
Evaluating acknowledged AI-generated evidence
Guidance on questions and areas of inquiry a trial court may use to inform its determination of whether to admit acknowledged AI-generated evidence.