Recent court cases show that asking an AI chatbot questions about your legal case could allow opposing parties to see your confidential trial strategy.
In most court cases, both criminal and civil, litigants have some obligation to share evidence in their possession with opposing parties. Common exceptions include communications between clients and attorneys and materials that represent those attorneys' work product on a case.

In a recent federal criminal case in New York, defendant Bradley Heppner posed questions about his case to the AI platform Claude and saved documents detailing the exchanges, which federal agents later seized while executing a search warrant. His attorneys argued the documents should be confidential because Heppner's prompts included information they had conveyed to him.

In February, a federal judge allowed prosecutors to access the Claude documents. The court ruled that neither attorney-client nor work-product privilege applied, because Heppner's attorneys never told him to submit information about his case to Claude, and because Claude's terms of service explicitly allow its maker, the AI company Anthropic, to collect data on the tool's inputs and outputs.

"Even assuming that Heppner intended to share these communications with his counsel and eventually did so, it is black-letter law that non-privileged communications are not somehow alchemically changed into privileged ones upon being shared with counsel," U.S. District Judge Jed Rakoff wrote.

The Heppner ruling appears to be the first to address what Coe said may be a common scenario: clients represented by counsel feeding private information from their attorneys into a third-party AI tool. While many law firms, including Dentons, use enterprise-level AI services designed to be safe for legal use, Coe has not yet seen platforms that allow clients to log on with their attorneys and contribute.

"Everything at this point is discoverable if you are not an attorney using a secure AI platform," Coe said. "But for a non-lawyer, there is not any protection."
Not every judge will reach the same conclusion. In a Michigan federal case, a judge in February declined to let a litigant demand access to the other party's AI chatbot communications in discovery. Significantly, the party using the AI tool in that case had no attorney and was self-represented, known in legal terminology as pro se, Perez said.

"That seems like the big difference (to the judge)," she said. "Because it was a pro se plaintiff, the court interpreted the mental impressions of an attorney prepared in anticipation of litigation (as work product) and interpreted the pro se litigant as being an attorney."

But Perez cautioned against taking away the lesson that anyone representing themselves can use AI tools with confidence that their messages will remain confidential.

"I think that will be interesting to see if other courts also interpret the work product doctrine with a pro se litigant in this same way," she said, adding, "I think there is a close question about waiver here and whether or not, by putting this information into a generative AI tool, you're waiving that protection."
Kathleen Law, president of the Iowa State Bar Association, said the profession is working to adapt to the impact of AI tools. At her suggestion, the association has launched an AI task force, which held its first meeting on March 2 and hopes to produce a report and recommendations later this year.

"Certainly (there are) ethical concerns of attorneys using AI and concerns that we need to bring up to our clients to advise our clients properly about their use of AI, which is in this (Heppner) case," she said. "We've been tasked with certain things to look at: different technologies, ethical considerations that we have to go by, how it can be used properly, and advice to attorneys to use that in the legal system, how it can be correctly and incorrectly used."

Those recommendations could range from producing brochures, websites and educational materials for members of the public to recommending changes to Iowa court rules. For now, though, Law said, anyone involved in court proceedings should tread very carefully when involving AI tools.

"They may not realize that something could be considered confidential or attorney-client privilege, or if they're disclosing a certain fact, they may not even be aware that could hurt their case in court," she said. "But at the same time, if someone can't afford an attorney, (it's) understandable if someone might think, 'Well, Claude's answer is better than nothing.' I think that's what's very troubling."