How to talk to your court users about AI
Who should read this?
- Court clerks
- Law librarians
- Self-help staff
Why this guidance matters
Court users are increasingly turning to artificial intelligence (AI) tools for information and advice as they navigate their legal issues. This guidance provides a framework for discussing AI use with court users while emphasizing accuracy, responsibility, and compliance with court rules. Court staff should familiarize themselves with their court's policy on providing legal information vs. legal advice (also known as a safe harbor policy) and provide guidance consistent with that policy.
Tips for talking about AI
Welcome questions about AI
Treat questions about AI as an educational opportunity. Respond with curiosity to understand how court users may be using AI for their legal issues.
Acknowledge AI as a powerful tool with limits
Frame AI as one tool among many. Like any tool, AI must be used with care.
Explain why legal problems require extra caution when using AI
Handling a legal problem depends on precise, context-specific information. The information someone includes or leaves out, and the wording of a prompt, can significantly affect the results an AI tool produces.
Educate court users on the limits & risks of AI use
Explain to court users that while AI tools can be powerful, they have clear limits. Help them understand that relying solely on AI to handle a legal matter is risky for several reasons.
Explain the rules
When applicable, point court users to the court's AI policy or standing order governing AI usage for the public, including whether disclosure about AI use is required or optional.
Guide court users to trusted resources
Guide court users toward trusted, validated, and relevant resources such as website content, forms, or jurisdiction-specific self-help chatbots.
Provide assistance
Help court users correct or revise draft pleadings, consistent with your role and your jurisdiction's policy on providing legal information vs. legal advice.
Be a role model
Your example can help court users understand what safe AI use looks like. This includes not putting personally identifiable or confidential information into unapproved AI tools.
TRI/NCSC AI Policy Consortium for Law & Courts
The consortium conducts an intensive examination of the impact of technologies such as generative AI (GenAI), large language models, and other emerging and yet-to-be-developed tools.
Explore more
Guidance for implementing AI in courts
These two guides provide practical steps for integrating AI tools safely, effectively, and responsibly.
Principles & practices for AI use in courts
The National Center for State Courts (NCSC) provides various resources and guides for court management, ranging from implementing AI and cybersecurity basics to improving civil justice accessibility and expediting family cases through the use of triage tools.
AI in state courts
Explore updated guidance on AI in state courts, including court orders, rules, policies, and an interactive map.