The privilege analysis doesn't change based on who's typing. It changes based on where the data goes.
US v. Heppner, 25 Cr. 503 (S.D.N.Y. Feb. 10, 2026)
Two days ago, Judge Rakoff granted a motion that should make every attorney using cloud-based legal AI sit up and pay attention.
The government argued that documents a defendant generated through Claude weren't protected by attorney-client privilege or work product doctrine. Judge Rakoff agreed.
The sequence: the government filed a motion arguing that AI-generated documents aren't privileged; Judge Rakoff granted it; and the legal tech community is only now realizing the implications.
The reasoning was straightforward, and that's what makes it dangerous:
No confidentiality. Anthropic's privacy policy permits collection of prompts and outputs, use for training, and disclosure to governmental authorities. The defendant voluntarily shared information with a platform whose own terms allow government access.
Retroactive privilege fails. Sending pre-existing non-privileged documents to counsel after the fact doesn't make them privileged.
Work product doesn't apply. The defendant's attorney didn't direct him to use Claude. Self-directed AI research isn't protected.
Wasn't this just a defendant doing his own research, without his lawyer's involvement? Yes. And the government's argument had nothing to do with that.
Read the motion again. The analysis turned on two things: the platform's terms, which permit training use and government disclosure, and the defendant's voluntary sharing of information with that third party.
That analysis doesn't change if it's an attorney doing the prompting. The platform's terms are the same regardless of who's sitting at the keyboard.
"Every single discovery request should now be seeking non-privileged AI usage."
"If privilege analysis turns on vendor data retention and disclosure rights, does this implicate every legal AI platform operating as a Remote Computing Service under the SCA? Potentially, yes."
The key variable isn't "AI." It's whether the system functions as a third-party repository with independent rights over the data.
"Are you using any cloud-based AI tools that could make our communications discoverable?"
After Heppner, sophisticated clients and opposing counsel will start asking. Corporate legal departments will add it to their outside counsel guidelines. Malpractice carriers will want to know.
What's your answer going to be?
Pull up every legal AI tool you use. Find the privacy policy. Search for "training," "government," and "disclosure." If any of those words appear with permissive language, you have a Heppner problem.
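If you want to make that audit systematic, the keyword search above can be sketched in a few lines of Python. The `flag_policy` helper and the sample policy text below are hypothetical illustrations, not part of any real tool; in practice you would paste in the actual policy text from each vendor:

```python
import re

# Terms the audit above flags as potential Heppner problems.
FLAG_TERMS = ["training", "government", "disclosure"]

def flag_policy(policy_text: str) -> dict[str, list[str]]:
    """Return each flagged term with the sentences that mention it."""
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", policy_text)
    hits: dict[str, list[str]] = {}
    for term in FLAG_TERMS:
        matched = [s for s in sentences if term in s.lower()]
        if matched:
            hits[term] = matched
    return hits

# Hypothetical policy language, loosely modeled on the terms described above.
sample = (
    "We may use your prompts and outputs for model training. "
    "Disclosure to governmental authorities may occur when required by law. "
    "You retain ownership of your content."
)

for term, sentences in flag_policy(sample).items():
    print(f"{term}: {len(sentences)} mention(s)")
```

A keyword hit isn't a legal conclusion; it just tells you which sentences to read closely for permissive language.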
Your data. Your hardware. Your control. No vendor ToS. No third-party training. No silent disclosures.
Mac-native. Local-first. Actually private.
Start Your Free Trial