Categories
Legal Tech & AI

BigLaw Gets AI Efficiency. Their Clients Get Higher Bills.

Law firms are using AI to work faster. Their clients are being charged the same, or more. Logic says it should work the other way around. This broken equation works for exactly one party: the one sending the invoice.

So what’s going on?


This week, Richard Tromans at Artificial Lawyer reported something that should make every small firm owner sit up straight.

At a conference in Stockholm, a senior in-house lawyer said it plainly: despite all the press releases about law firms adopting AI, “We get nothing. They haven’t changed and probably next year the same work will cost even more.”

The GC went on to say they’d probably need to have “the conversation” with their outside counsel this year. The conversation where they ask: if you’re using AI to do this work faster, why am I paying the same hourly rate?

The answer, of course, is simple: because they can.


The BigLaw AI Arbitrage

Here’s what’s actually happening at large law firms:

  1. They buy expensive AI tools
  2. They write press releases about “innovation” and “efficiency”
  3. Associates use the tools to do work faster
  4. The firm pockets the efficiency gains
  5. Client bills stay the same (or go up)

This isn’t a conspiracy. It’s just business. Law firms aren’t charities. If they can do the same work in half the time but charge the same amount, they will. The billable hour model practically demands it.

Why would a firm reduce fees when the client has already accepted the price? Why would they pass along savings when they can keep them as profit?

The answer is: they won’t. Not until clients force them to.


The Small Firm Advantage Nobody’s Talking About

Here’s what makes this story interesting for attorneys running their own practices:

When you adopt AI in a small firm, YOU get the efficiency gains.

There’s no partner committee deciding whether to pass the savings along. There’s no billionaire-funded PE firm demanding year-over-year revenue growth. There’s just you, doing better work in less time, and deciding what to do with those extra hours.

You could:

  • Take on more clients without burning out
  • Offer more competitive fixed-fee pricing
  • Spend more time on the complex work that actually requires a lawyer
  • Go home at 5pm for once

The same technology BigLaw uses to pad its margins lets small firms outcompete on price without sacrificing quality.

That’s a structural advantage that didn’t exist five years ago.


The Sea Change Is Coming

What Tromans reported from Stockholm wasn’t a one-off complaint. It was a preview of what’s about to happen across the industry.

In-house legal teams are using AI now. Real usage, not pilot programs. They’re seeing firsthand what can be done. The same contract review that used to take a week now takes a day. The same research memo that justified twenty hours of associate time now takes two.

And they’re asking: if we can do this, why can’t our outside counsel? And if they can, why aren’t we seeing it in the bills?

The quote that stuck with me: “If they were all part of a single law firm, then that law firm would no doubt receive a prize for being a world leader in legal innovation.”

That was describing in-house teams, not law firms.

The buyers are becoming more sophisticated than the sellers. That never ends well for the sellers.


Position Yourself Now

If you’re a small firm or solo practitioner, this is your window.

While BigLaw is playing defense, trying to justify why AI efficiency shouldn’t translate to lower bills, you can play offense. You can build your practice around the new economics:

  • Fixed fees that work because your actual time investment is reasonable
  • Faster turnaround that makes clients feel prioritized
  • Competitive pricing against larger firms (who can’t match you without cannibalizing their own model)

The corporate clients complaining in Stockholm aren’t your clients. But the small business owners, the individuals, the startups who’ve been priced out of quality legal help? They’re watching this unfold too.

And they’re looking for alternatives.

We built a complete blueprint for this. Practice area pricing templates, AI efficiency math, flat fee packages with real market rates, and a step-by-step strategy for building the kind of firm that makes BigLaw irrelevant. Read the full Future-Proof Law Firm guide →


60-Second Firm Hack: The “What Would AI Do?” Audit

Pick your three most common matter types. For each one, write down:

  1. The tasks that take the most time
  2. Which of those tasks are repetitive or templatable
  3. What would change if those tasks took 10% of the current time

That third question is where the magic is. If contract review takes 10% of the time, do you charge less? Take on more clients? Bundle it into a fixed fee that feels like a steal?
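To make that third question concrete, here's a back-of-the-envelope sketch. All of the numbers (the $300 rate, the 10-hour review, the $1,500 flat fee) are hypothetical placeholders, not market rates; plug in your own figures from the audit above:

```python
# Hypothetical math for the "10% of the current time" scenario.
# Every figure below is an illustrative assumption, not a real rate.

def effective_hourly_rate(fee, hours):
    """What you actually earn per hour of work under a given fee."""
    return fee / hours

hourly_rate = 300               # hypothetical billing rate, $/hr
old_hours = 10                  # contract review before AI assistance
new_hours = old_hours * 0.10    # the same task at 10% of the time

# Billing hourly, your rate is unchanged but total revenue collapses
# with the hours: 1 hour x $300 = $300 instead of $3,000.
hourly_revenue = hourly_rate * new_hours

# A flat fee priced below the old hourly cost is a bargain for the
# client ($1,500 vs. $3,000) and a raise for you per hour worked.
fixed_fee = 1500
flat_fee_rate = effective_hourly_rate(fixed_fee, new_hours)

print(f"Hourly billing revenue: ${hourly_revenue:,.0f}")
print(f"Flat fee effective rate: ${flat_fee_rate:,.0f}/hr")
```

The point of the sketch: under the billable hour, efficiency is a pay cut; under a flat fee, it's a margin. That's the decision hiding inside question three.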

The firms that answer that question first will own the next decade.


Off the Record

TimeNet Law wasn’t built for BigLaw economics. It was built for attorneys who actually want to run efficient practices, and keep the benefits.

We’re not trying to help you bill more hours. We’re trying to help you bill smarter hours, track them accurately, and get paid faster. The efficiency gains from good practice management software should flow to you, not to some PE-backed vendor’s quarterly earnings.

That’s been our philosophy for over 20 years. Nice to see the rest of the industry catching up to why it matters.

See what efficient practice management looks like →


The billable hour rewards inefficiency. AI exposes that. What you do with that information depends on which side of the invoice you’re on.

Ready to flip the equation? Build Your Future-Proof Firm →

Categories
Industry Analysis Legal Tech & AI

The Legal Tech Ground Is Shifting. Here’s What You Need to Know.

Last week, Anthropic launched a legal plugin for Claude. Legal tech stocks cratered. Meanwhile, 8am is stitching together another Frankenstein’s monster of practice management tools. If you’re feeling a little dizzy watching all this, you’re paying attention.

It’s been a wild few weeks in legal tech. And if you’re an attorney just trying to run your practice without getting caught in the crossfire, the news probably feels exhausting. Let me break down what actually matters.

The Claude Bomb

Anthropic, the company behind the Claude AI platform, just dropped a legal plugin that lets in-house counsel automate contract review, NDA triage, and compliance workflows. When they announced it, Thomson Reuters, RELX, and Wolters Kluwer stocks plummeted.

The market reaction tells you everything. For years, legal tech vendors have been wrapping foundation AI models and selling them back to you with a markup. Now the foundation model companies are cutting out the middleman. They’re going straight to the enterprise with pre-built workflows that do exactly what $50,000/year platforms do.

Is this the death of legal tech? No. But it’s a signal. The vendors who built their entire value proposition around “we’ll put AI on top of your contracts” are suddenly looking very exposed. The ones with actual proprietary data and deep subject matter expertise will survive. The ones who were just playing markup arbitrage? Not so much.

The 8am Consolidation Machine

Meanwhile, the company formerly known as AffiniPay (now rebranded as “8am”) continues its shopping spree. They already own LawPay, MyCase, CasePeer, and DocketWise. Now they’re expanding LawPay into a “complete financial management solution” that combines payments, invoicing, time tracking, expense management, and reporting.

On paper, this sounds great. One platform! Everything integrated!

In reality, you know how this works. Consolidation means different codebases stitched together by acquisition. Different teams who’ve never worked together. Different philosophies about what attorneys actually need. And eventually, inevitably, price increases to pay for all that M&A activity.

The press release uses phrases like “financial complexity and cash flow constraints have become serious operational risks for law firms.” Translation: we bought a bunch of companies and need to justify the integration costs to our investors.

What This Actually Means for Your Practice

Here’s the uncomfortable truth: most legal tech is built for investors, not attorneys. The VC playbook is simple. Buy up competitors. Raise prices. Cut support costs. Extract maximum value before the next exit.

You’ve seen this movie before. Clio’s price hikes. The endless consolidation in the practice management space. The slow degradation of support as companies scale. The features that used to be included becoming “premium add-ons.”

The AI disruption makes this even messier. Companies that spent millions acquiring AI wrappers are now watching foundation models undercut them. They’ll respond the only way they know how: raising prices on existing customers to protect margins. Meanwhile, attorneys keep paying rent on software they should own.

The Alternative Nobody Talks About

There’s another way to build legal software. You build something good. You support it directly. You don’t sell to private equity. You don’t chase growth at all costs. You just make something that works and charge a fair price for it.

It sounds almost quaint in 2026. But it’s the model TimeNet Law has followed for twenty years. Same owner. Same developer. Same phone number when you need help.

No investor pressure to raise prices. No integration chaos from acquisition sprees. No wondering whether your software will exist in its current form next year. Just software that does what it’s supposed to do, built by someone who actually answers support calls.

That’s not a sales pitch. It’s just how things should work.

⚡ 60-Second Firm Hack: The Monday Morning Client Pulse

Before you open email Monday morning, spend 60 seconds scanning your open matters. Pick three clients you haven’t heard from in two weeks. Send each a one-line email: “Just checking in. Anything you need from me this week?”

Three emails. 60 seconds. You’ll be amazed how often this simple touchpoint uncovers a forgotten question, prevents scope creep, or simply reminds a client that you’re thinking about their matter.

The best firms don’t wait for clients to reach out. They stay one step ahead.


The legal tech landscape is going to keep shifting. AI will keep disrupting. Consolidation will continue. Prices will rise. Support will get worse at companies chasing scale.

Your job isn’t to predict all of it. Your job is to pick tools built by people who share your values, who will still be here in five years, and who won’t hold your data hostage when you need to move on.

That’s not complicated. It’s just rare.


Want the Inside Track?

The tips in this post are just the beginning. Sunday Brief is my private newsletter where attorneys get the must-have tips, secrets, and news that don’t make it to the blog.

No fluff. No sales pitches. Just the insider knowledge that helps you run a better firm.


Sign up to get more straight talk about legal tech, billing, and building a practice that actually works.

Categories
Legal Tech & AI Privacy & Security

Microsoft Copilot Read Your Confidential Emails for a Month. Lawyers Should Be Paying Attention.

For almost a month, Microsoft Copilot was reading confidential emails that were supposed to be off-limits. Microsoft’s AI assistant summarized messages marked “confidential” before anyone noticed. If your law firm uses Microsoft 365, you should be paying very close attention right now.


On February 18, Bleeping Computer reported that Microsoft 365 Copilot Chat had been quietly summarizing confidential emails since January 21. Not just regular emails. Emails with sensitivity labels applied. Emails protected by data loss prevention (DLP) policies that were explicitly configured to prevent exactly this from happening.

Microsoft confirmed it. Their own service alert (tracked as CW1226324) stated that “users’ email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat.”

The bug affected the Copilot “work tab” chat feature, which was pulling content from users’ Sent Items and Drafts folders and summarizing it on demand, regardless of whether those messages were supposed to be locked down.

For almost a month. In silence.


How the Microsoft Copilot Confidential Emails Bug Affects Law Firms

Let’s be direct about what happened here.

If your law firm runs Microsoft 365 with Copilot Chat enabled, and you had confidential client communications sitting in your Sent Items or Drafts folders (which of course you did), Microsoft’s AI may have been reading and summarizing those communications. Even if you did everything right. Even if you applied sensitivity labels. Even if you configured DLP policies to prevent automated access.

Your controls were bypassed by a “code issue.”

Microsoft’s official response? “This did not provide anyone access to information they weren’t already authorized to see.”

That’s technically true, and it completely misses the point. The concern isn’t that a stranger accessed the emails. The concern is that an AI system ingested, processed, and summarized privileged communications that were explicitly marked as off-limits. Content that was supposed to be invisible to automated systems was being actively read, analyzed, and presented in chat summaries.

For attorneys, this isn’t a minor configuration hiccup. This is a potential breach of the duty of confidentiality.


The Privilege Problem Nobody Is Talking About

Here’s where it gets really uncomfortable.

Just eight days before the Copilot bug was publicly reported, a federal judge in United States v. Heppner ruled that AI is not your co-counsel when it comes to attorney-client privilege. The court held that sharing information with consumer-grade AI tools can destroy privilege entirely, because those tools are third-party services with no confidentiality obligation.

Now combine that with what Microsoft just admitted.

You applied confidentiality labels to your emails. You set up DLP policies. You did what Microsoft told you to do to keep privileged content away from AI. And Microsoft’s own AI read it anyway. For weeks.

The Heppner decision says sharing privileged information with AI can waive privilege. Microsoft’s bug means privileged information may have been shared with AI without your knowledge or consent.

Ask yourself: if opposing counsel in active litigation discovered that your firm’s privileged communications had been processed by Microsoft’s AI for a month, what motion do you think they’d file?


The NHS Was Affected. The European Parliament Pulled the Plug.

This wasn’t some niche edge case affecting a handful of users.

The BBC reported that the bug was logged on the NHS’s internal IT support dashboard in England. The same week, the European Parliament’s IT department disabled built-in AI features on staff devices entirely, citing concerns that AI tools could transmit confidential data to external cloud servers.

Two of the world’s most security-conscious organizations either got burned or decided the risk wasn’t worth taking.

Meanwhile, Microsoft hasn’t disclosed how many organizations were affected. They described the incident as an “advisory,” a classification typically used for issues with “limited scope or impact.” They have not provided a final timeline for full remediation.


The Experts Are Not Sugarcoating It

Nader Henein, a data protection and AI governance analyst at Gartner, told the BBC this kind of failure is “unavoidable” given the speed at which companies push new AI features to market.

“Under normal circumstances, organisations would simply switch off the feature and wait till governance caught up. Unfortunately the amount of pressure caused by the torrent of unsubstantiated AI hype makes that near-impossible.”

Dr. Ilia Kolochenko, CEO of ImmuniWeb and a Fellow at the European Law Institute, was even more blunt in his assessment to Cybernews:

“With the rapid proliferation of Agentic AI and AI-powered plugins for traditional software, incidents like this one will likely surge in 2026, possibly becoming the most frequent type of security incident at both large and small companies around the globe.”

Professor Alan Woodward of the University of Surrey called it a lesson in why AI tools must be “private-by-design” from the start, not patched after the damage is done.

And here’s the line that should keep every managing partner up at night, from Dr. Kolochenko:

“Every day, tons of sensitive personal data are shared with LLMs around the globe without any precautions. Even governmental agencies of developed countries are exposed to this risk because of inadequate or simply missing governance of AI at workplace.”


A Pattern, Not an Incident

If you’ve been following this blog, this story should sound familiar.

Two weeks ago, we published our investigation into the law firm data broker pipeline, documenting how legal tech SaaS platforms funnel attorney data to third-party brokers. Last week, we covered how Claude AI hallucinated an entire lease agreement using fragments of real data from its training set.

And now Microsoft’s own enterprise AI is bypassing the very security controls it was designed to respect.

This isn’t a series of unrelated incidents. This is a pattern. The legal tech stack that law firms depend on is leaking from every direction: through data brokers, through AI hallucinations, and now through the tools that are supposed to protect your confidential communications in the first place.


What Your Firm Should Do About Microsoft Copilot Confidential Emails

If your firm uses Microsoft 365 with Copilot Chat enabled:

  1. Verify the patch is deployed. Microsoft says a configuration update has been pushed worldwide, but they also said the rollout is still “in progress” for some “complex service environments.” Don’t assume you’re covered. Confirm it.
  2. Audit what Copilot accessed. Determine which users had Copilot Chat active during the January 21 to mid-February window. Identify any confidential or privileged communications that may have been processed.
  3. Review your DLP policies. If your data loss prevention rules didn’t stop an AI tool from reading labeled content, you need to understand why and what else might slip through.
  4. Assess your ethical obligations. Depending on your jurisdiction, you may have disclosure requirements when privileged client communications are potentially compromised. Talk to your ethics counsel.
  5. Reconsider the AI defaults. The European Parliament disabled AI features entirely until governance catches up. That’s not paranoia. That’s prudent risk management. Better yet, consider tools that run entirely on your Mac, free from cloud dependency.

The Bottom Line on Microsoft Copilot Confidential Emails

Microsoft wants you to feel reassured. The bug is fixed. Access controls were intact. Nobody saw anything they weren’t supposed to.

But that framing ignores the fundamental problem: the AI was told not to read your confidential emails, and it read them anyway. The controls you were promised would work didn’t. For almost a month.

In a profession built on confidentiality, “oops, the AI read your privileged emails” is not a minor software bug. It’s a crisis of trust in the tools we’ve been told are safe to use. When you don’t own your software, you’re at the mercy of whoever does.

And based on what every expert quoted in this story is saying, this won’t be the last time it happens.


Have questions about how AI tools interact with your firm’s confidential data? Get in touch. We’re tracking every major AI security incident affecting law firms and publishing what we find.

Categories
Legal Tech & AI Privacy & Security

Claude Just Hallucinated a Complete Lease Agreement With Real Names and Addresses. Lawyers Are Freaking Out.

A Reddit post went viral this week when an attorney claimed Claude AI generated a complete commercial lease, with a real company, real address, and real contact information. What happened next should concern every lawyer using cloud-based AI.


Two days ago, a post on Reddit’s r/ClaudeAI forum hit 3,600 upvotes and 216 comments. The title:

“Claude just gave me access to another user’s legal documents”

Here’s what happened.

A user asked Claude Cowork, Anthropic’s new AI agent that reads and edits files on your computer, to summarize a document they’d uploaded. Instead of summarizing their document, Claude started describing a completely unrelated legal document. A commercial lease agreement.

Curious, the user asked Claude to generate a PDF of this mystery document.

Claude obliged. It produced a complete commercial lease agreement between “Commercial Properties, LLC” (Landlord) and “Collective, LLC” (Tenant) for a property in Blue Hill, Maine. Dated March 15, 2025. With contact information for the property management company.

The user did what any reasonable person would do: they called the property management company.

The company was real. The address was real. The contact information worked.

But the people named in the contract? The company seemed “confused” about them. And the attorney referenced in the document? Doesn’t appear to exist.


So What Actually Happened?

After 216 comments of debate, the consensus is clear: this was a high-fidelity hallucination.

Claude didn’t “leak” another user’s document. It did something arguably more unsettling. It mashed together fragments of real information (a real company name, a real Maine address, real contact details) with fabricated names, a nonexistent attorney, and invented lease terms. Then it presented the whole thing as a coherent, professional legal document.

As one commenter put it:

“It read their legal documents during the pre-training phase, probably cause they were public on the internet. Then Claude made up portions of the rest.”

A Hacker News commenter offered another theory: the property management company likely had an improperly configured cloud storage bucket that exposed a directory of leases. Those documents got scraped, ingested into AI training data, and now live inside the model, ready to be reassembled into something that looks authentic but isn’t quite real.

The Reddit moderator bot’s summary nailed it:

“Claude is scarily good at generating realistic-looking documents by mashing up info from its vast training data (i.e., the public internet). The fact that the attorney in the document doesn’t exist is pretty much the nail in the coffin for the data leak theory.”

Another user reported the exact same phenomenon: they uploaded a work document, and Claude started describing a completely unrelated fitness training plan, with specific details about someone else’s workout routine.


Why This Should Terrify Every Attorney Using Cloud AI

Let me be direct about what this means for lawyers.

1. Your Documents May Already Be Training Data

That commercial lease from Blue Hill, Maine didn’t materialize from thin air. Real company information ended up inside Claude’s training data. Whether it was scraped from a misconfigured server, indexed from a public webpage, or harvested through some other vector, the result is the same.

Real legal documents, with real names and real addresses, are inside these AI models.

Now think about your own practice. How many of your documents have touched cloud services? How many have been uploaded to AI tools by associates doing “quick research”? How many live on cloud platforms whose privacy policies permit data collection and sharing?

Every document that enters the cloud ecosystem is a candidate for ending up exactly where that Maine lease did: inside an AI model, waiting to be reassembled and presented to a stranger.

2. Hallucination + Real Data = A New Kind of Breach

This incident reveals a category of risk that didn’t exist two years ago.

Claude didn’t reproduce the lease verbatim. That would be a straightforward data leak, and Anthropic’s architecture is designed to prevent it. Instead, it created something more insidious: a document realistic enough to fool someone into calling the company named in it.

Imagine this scenario with your clients:

An opposing counsel asks an AI to draft a sample lease agreement for a property in your client’s city. The AI, trained on scraped data that included your client’s actual lease, generates a document with your client’s real address, their real landlord’s name, and plausible (but slightly wrong) financial terms.

That’s not a “leak” by any technical definition. It’s a hallucination. But it just exposed your client’s business relationships to a stranger.

Good luck explaining that distinction to your malpractice insurer.

3. “It’s Impossible” Isn’t Reassuring Anymore

Several commenters rushed to defend the technology:

“This is just more AI hysteria. I can’t speak to your intentions but what I can say is you have definitely not received someone else’s document. It’s impossible given Anthropic’s security disclosures.”

Maybe. Anthropic maintains segregated storage for each user session. Cross-user data leaks should be architecturally impossible.

But here’s the thing: it doesn’t matter whether this was a “real” leak or a hallucination. From a legal ethics standpoint, the outcome is identical. Real client information (company names, addresses, business relationships) surfaced in a context where it shouldn’t have. The mechanism is academic. The exposure is real.

And as one Hacker News commenter noted:

“Even in single-tenant deployments, if the vendor continues to manage the data and has AWS KMS access, a substantially motivated attorney could win the compulsion.”

4. It’s Not Just Accidental. Trade Secret Theft Is Surging.

While Reddit was debating hallucinations, the Wall Street Journal published a piece that should have landed like a bomb in every law firm’s inbox: federal trade secrets cases hit 1,500 last year, up 20% from the previous year and the highest figure in at least a decade.

Google alone has had three high-profile trade secret thefts in recent years. A former software engineer was convicted of stealing AI chip secrets for China, marking the first federal conviction on economic espionage charges related to AI. Apple is suing former engineers over Apple Watch and Vision Pro secrets. Elon Musk’s xAI is suing a former engineer who allegedly stole Grok chatbot secrets before joining a competitor.

The kicker? Google’s VP of Security Engineering told the Journal:

“Those open environments will become more constrained.”

Even Google, the company that built its culture on open information sharing, is locking things down because the threat model changed.

And that’s intentional theft by insiders with access. The Claude hallucination story is about unintentional exposure through training data. Put those together and you get a picture of sensitive information leaking from every direction at once: stolen by bad actors on one side, absorbed into AI models and reassembled for strangers on the other.

Your clients’ data doesn’t need to be targeted to be exposed. It just needs to exist in the cloud.


The Thread Nobody Can Stop Reading

What made this Reddit post blow up wasn’t the technical debate. It was the fear.

Scroll through the comments and you’ll see it: lawyers (and people who work with lawyers) realizing in real time that their confidentiality assumptions might be wrong.

Some highlights:

A user who had the same experience:

“I uploaded a work-related document and Claude started commenting on it as if it were a fitness training plan… It kept talking about a workout plan even though the document clearly had nothing to do with that.”

The pragmatist:

“How do you call this ‘gave me access’ and then say he generated the PDF, so what is it? Did he give you a document from another user or did he just generate a PDF like any other model can do? I can make it generate 100 of those.”

And the inevitable joke:

“Generate me 10 social security numbers and bank wiring details. Make no mistakes.”

The humor masks the anxiety. Because everyone in that thread knows the real question isn’t “did Claude leak a document?” It’s: “What happens when the document it hallucinates contains my client’s information?”


The Heppner Connection

This incident arrives two weeks after Judge Rakoff ruled that documents generated through Claude aren’t protected by attorney-client privilege. His reasoning was straightforward: Anthropic’s privacy policy permits data collection, model training, and disclosure to authorities. No expectation of confidentiality means no privilege protection.

Now connect the dots:

  1. Real legal information ends up in AI training data (the Maine lease proves this)
  2. AI models reassemble that information into realistic-looking documents (the hallucination proves this)
  3. Nothing you generate through cloud AI is privileged (Heppner proves this)
  4. Trade secret theft via technology is at an all-time high (the WSJ data proves this)

That’s not four separate problems. That’s one pipeline, and your client data is flowing through it.


The Architecture Question (Again)

I keep coming back to the same point because the industry keeps proving it right:

Where your data lives determines how safe it is.

When a commercial lease from Blue Hill, Maine ends up inside an AI model, reassembled with real company names but fake attorneys, that’s a cloud architecture problem. The document was in the cloud. It got scraped. Now it’s everywhere.

When you process client documents through cloud-based AI tools, you’re adding your data to the same pipeline. Maybe Anthropic won’t train on it. Maybe their privacy policy protects you. Maybe the segregated storage works perfectly.

That’s a lot of “maybes” for something covered by Rule 1.6.

Software that runs locally on your machine doesn’t have this problem. Not because local software is smarter, or more secure in some abstract sense, but because the data never enters the pipeline in the first place.

No cloud server to scrape. No training data to contaminate. No hallucinated document containing your client’s real address showing up on a stranger’s screen.

That’s not a feature. It’s physics.


What to Do Right Now

Audit Your AI Shadow Usage

Your associates are using AI. Probably on client matters. Probably without telling you. Ask them directly: “Have you ever uploaded a client document to ChatGPT, Claude, or any AI tool?” The answer will be uncomfortable.

Google Your Firm

Search your firm name, your clients’ names, and your address in combination with terms like “lease agreement,” “contract,” or “legal document.” See what’s publicly indexed. If a scraper can find it, an AI model may already contain it.

Read the Privacy Policy

Before you put another document into any cloud service, read that vendor’s privacy policy. All of it. Look for: “may use data to improve our services,” “may share with service providers,” “may disclose in response to legal process.” If you find those phrases, your data isn’t as private as you think.

Consider Your Architecture

The simplest way to keep your data out of AI training sets? Don’t put it in the cloud. Local-first software keeps your files on hardware you control. No third-party servers. No training pipelines. No hallucinated leases with your client’s name on them.


The Bottom Line

Claude didn’t leak a document this week. It did something that might be worse: it proved that real legal information (company names, addresses, business relationships) lives inside AI models, ready to be recombined and presented to anyone who asks.

Meanwhile, trade secret theft is hitting record highs, the courts are stripping privilege from AI-generated documents, and even Google is admitting that open environments need to be locked down.

The Maine property management company got a confusing phone call from a stranger who’d never seen their actual lease. Next time, it could be your client’s information surfacing in someone else’s AI session.

The question isn’t whether AI is useful for lawyers. It is. The question is whether you trust someone else’s cloud server to keep your client’s secrets — or whether it’s time to break free from that dependency entirely.

Three thousand lawyers on Reddit just watched one answer to that question. It wasn’t reassuring.


Perry Fjellman is the developer of TimeNet Law, a Mac-native legal practice management application that keeps your data where it belongs: on your computer. Because the best way to prevent your data from being hallucinated is to never upload it in the first place.

See how local-first practice management works →

Or get the Sunday Brief, our newsletter for attorneys who want the real story on legal tech, without the corporate spin.

Subscribe to Sunday Brief →

Categories
Industry Analysis Privacy & Security

Your Law Firm’s Data Is For Sale. Here’s the Proof.

Every now and then, my wife helps me clear out my spam-riddled email inboxes. The ones overflowing with pitches from law firm data brokers. It’s something she enjoys doing (bless her, I can’t stand it), and sometimes she finds something important. Today, she did it again.

While sweeping up the mess inside my email, she mentioned something she’s said many times before. “You got another one of these!” She showed me. A familiar template of an email I get constantly. I almost always just junk them. Sometimes I send a frustrated reply. But I never think twice about them.

Until today. Today, I decided to investigate just how deep the law firm data broker problem really goes.

Because every week — sometimes every day — I get emails like this:

“Hi, I hope this message finds you well. My name is Dorothy Gale, and I have some suggestions that could quickly boost your email marketing efforts. Would you be interested in purchasing a verified list of Legal Practice Management Software Users?”

Email from data broker offering verified lists of legal software users including Clio, Smokeball, MyCase
One of over 2,218 data broker emails received since 2017. Names and personal details from all major cloud legal platforms, for sale to anyone.

The sender is using a fake name from an Outlook burner account. The email lists every major cloud-based legal software platform by name: Clio, Smokeball, MyCase, PracticePanther, and a dozen others, and offers to sell their users’ personal data: I’m talking names, direct emails, phone numbers, mailing addresses, firm revenue, salaries, decision makers, employee counts, and more.

This isn’t a one-off. I’ve received over 2,218 of these emails since 2017. And the number grows every single year.

Year Broker Emails Received
2019 143
2020 217
2021 262
2022 297
2023 384
2024 416
2025 461
2026 38 (first 7 weeks)
Chart showing escalation of data broker emails from 143 in 2019 to 461 in 2025
Data broker emails received per year. The number has never gone down. Not once. Not a single year.

That’s a 222% increase from 2019 to 2025.
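The headline numbers are easy to verify from the table above. A quick sketch:

```python
# Broker emails received per year (from the table above)
counts = {2019: 143, 2020: 217, 2021: 262, 2022: 297,
          2023: 384, 2024: 416, 2025: 461}

# Overall growth, 2019 -> 2025
overall = (counts[2025] - counts[2019]) / counts[2019] * 100
print(f"2019 -> 2025: +{overall:.0f}%")  # +222%

# Year-over-year change -- confirms the count never drops
years = sorted(counts)
for prev, cur in zip(years, years[1:]):
    delta = counts[cur] - counts[prev]
    print(f"{prev} -> {cur}: {delta:+d}")
    assert delta > 0, "the number has never gone down"
```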

And when I say “data brokers,” I don’t mean one bad actor. A forensic analysis of just 118 of these emails revealed 57 unique senders operating from 24 different domains. Half use Outlook burner accounts (disposable, untraceable identities). Many trace back to IP addresses in India, Korea, and Japan; some even originate in the US. They operate openly, offering “verified lists” of lawyers like it’s a perfectly normal business.

Pie chart showing 50% of data broker emails come from Outlook burner accounts
Where the data brokers hide: 50% use Outlook burner accounts. Analysis based on a sample of 118 emails from a total of 2,218+ received since 2017.

Because for them, it is.

These emails aren’t new, either. The earliest one I can find dates back to 2017:

2017 data broker email offering legal software user lists
The earliest evidence: a data broker email from 2017, already offering to sell legal software user lists. This has been going on for nearly a decade.

And they don’t take “no” for an answer. Here’s a follow-up from 2018, pressuring for a response:

Aggressive data broker follow-up email from 2018
A 2018 follow-up email from a different broker. They don’t stop.

What Are Law Firm Data Brokers Selling, and Who’s Buying?

Let’s be clear about what these brokers are offering. This is directly from their emails:

“The data fields include: Company Name, Contact First & Last Name, Job Title, Direct Email Address, Phone Number, Fax Number, Mailing Address, Employee Count, Revenue Size, Industry Classification, and Website URL.”

That’s not aggregated, anonymized market research. That’s your name, your direct phone number, your firm’s revenue, and your office address, all packaged and sold to anyone with a credit card.

These emails are highly personalized. The brokers know exactly who they’re targeting, using your name, your firm’s name, and even referencing your specific software:

Data broker email addressed to specific person and company by name
Personalized targeting: this broker knows the recipient’s name and company. They’re not guessing, they have the data.

They’re also shamelessly opportunistic. When AffiniPay acquired MyCase and LawPay, brokers immediately used the M&A news as a hook to sell user lists:

Email from data broker piggybacking on LawPay MyCase acquisition to sell user data
M&A ambulance chasing: this broker piggybacked on the LawPay/MyCase acquisition news to pitch user data sales. (Identifying details redacted)

Who’s buying?

  • Competing software vendors looking to poach customers
  • Marketing agencies running targeted campaigns
  • “Consultants” selling overpriced services to lawyers
  • Bad actors using the data for social engineering, phishing, or fraud

If someone knows your name, your firm, your software, your revenue, and your phone number, they can craft a very convincing phishing email. Or an impersonation call. Or a targeted attack that looks like it came from your bar association.


18 Platforms. One Industry. Zero Accountability.

From our sample of 118 analyzed broker emails, here’s how often each platform’s users are being sold:

Chart showing Clio mentioned in 70 of 118 analyzed broker emails
Software platforms being sold by data brokers, based on analysis of 118 emails (sampled from 2,218+). Clio leads the pack at 70 mentions — appearing in 59% of all analyzed emails.

Clio leads the pack at 70 mentions — appearing in 59% of the analyzed emails. But Smokeball, MyCase, CosmoLex, PracticePanther, and 13 others are all on the menu. This isn’t a problem with one vendor. It’s an industry-wide failure.

Every platform on this list stores your data in their cloud. And somehow, that data is ending up in the hands of overseas brokers who sell it to strangers. It’s one more reason to break free from cloud dependency entirely.

And here’s a 2021 email showing the range of platforms being offered, from LexisNexis to Clio to everything in between:

2021 data broker email targeting legal software users
A 2021 data broker email offering users of LexisNexis, Clio, and other platforms. The breadth of platforms being targeted has only grown over time. (Identifying details redacted)

Your State Bar is Part of the Pipeline

Here’s where it gets truly disturbing.

Smokeball (the #2 most-mentioned platform in data broker emails) has partnered with 22 state and local bar associations to offer free software licenses to their members:

Alabama, Arizona, California (two separate programs), Colorado, DC, Florida, Georgia, Illinois, Minnesota, Missouri, Nebraska, New Hampshire, New York, Oklahoma, Oregon, Texas, Utah, Wisconsin. Plus local bars in Beverly Hills, DuPage County, and St. Petersburg.

Each partnership funnels thousands of lawyers into Smokeball’s cloud platform. The New York State Bar Association alone represents over 70,000 members.

Think about what happens:

Diagram showing how bar association partnerships funnel lawyer data to brokers
The Bar Association → Data Broker Pipeline: How your professional licensing organization becomes the on-ramp to having your data sold.
  1. Your state bar says “Free Smokeball license included with your membership!”
  2. You sign up: name, email, phone, firm details
  3. Your data enters the cloud ecosystem
  4. Data brokers start selling lists of “Smokeball users”
  5. Spam arrives in your inbox from Dorothy Gale

Your own professional licensing organization — the entity charged with protecting the legal profession — is a major on-ramp to the data broker pipeline.

We’re not saying Smokeball (or any specific vendor) is intentionally selling your data. But when 22 bar associations funnel their members onto a platform whose users routinely appear in data broker lists, someone should be asking hard questions about where the leak is.


Law Firm Data Brokers Never Stop

As recently as yesterday (February 19, 2026), another one of these emails landed in my inbox:

Recent data broker email showing the problem continues in 2026
Received February 2026. Nine years after the first one, the emails keep coming. The problem isn’t going away, it’s getting worse.

Nine years. 2,218+ emails. And counting.


Where is the Data Leaking From?

There are four primary vectors:

1. The Vendor Themselves

Cloud platforms collect extensive user data. Their privacy policies (which apparently nobody reads except me) often permit sharing with “partners,” “service providers,” or “affiliated companies.” After Clio’s expansion spree (the Lawyaw acquisition, the Calendly integration, Clio Payments via Stripe, and more), user data flows through an increasingly complex web of third-party relationships.

2. Third-Party Integrations

Every integration your cloud software connects to (email sync, calendar, payment processing, document storage) is another entity with access to your data. Each has its own privacy policy, its own data practices, and its own vulnerabilities.

3. Data Enrichment Companies

Companies like ZoomInfo, Apollo, Clearbit, and dozens of others scrape, buy, and aggregate business data from multiple sources. Once your information exists in any cloud platform, it becomes part of the data enrichment ecosystem. Bought, sold, combined, and resold endlessly.

4. Employee and Contractor Access

Cloud platforms employ hundreds or thousands of people who can potentially access customer data. Offshore support teams, contractors, and departed employees all represent potential leak points that simply don’t exist with locally-installed software.


The ABA Has Already Warned You

This isn’t hypothetical legal theory. The American Bar Association has issued clear guidance:

ABA Formal Opinion 477R (2017) requires lawyers to make “reasonable efforts” to prevent unauthorized access to client information when using technology. This includes understanding how your software vendor handles data.

ABA Model Rule 1.6(c) states: “A lawyer shall make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.”

ABA Model Rule 5.3 extends your ethical obligations to anyone you’ve retained to assist in providing legal services — including your software vendors.

If your client data lives on a cloud platform whose users’ information regularly appears in data broker databases, can you honestly say you’ve made “reasonable efforts” to protect it?

Multiple state bars have issued their own opinions reinforcing these obligations. Florida Bar Opinion 12-3, California Formal Opinion 2010-179, and New York State Bar Opinion 842 all address the ethical obligations of lawyers using cloud computing. The consensus: you are responsible for understanding where your data goes and who has access to it.


The Cloud “Convenience” Tax

The irony of cloud-based legal software is that you’re paying more every year for less privacy.

Clio (the #1 platform being sold by data brokers) has raised prices at least twice in three years:

Plan 2022 Price 2025 Price Increase
EasyStart $39/mo $49/mo +25.6%
Essentials $69/mo $89/mo +29.0%
Advanced $99/mo $119/mo +20.2%
Complete $129/mo $149/mo +15.5%
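The Increase column is straightforward arithmetic; here is the same calculation as a sketch you can reuse on your own invoices:

```python
# Clio plan prices from the table above (2022 vs. 2025, USD per month)
plans = {
    "EasyStart":  (39, 49),
    "Essentials": (69, 89),
    "Advanced":   (99, 119),
    "Complete":   (129, 149),
}

for name, (old, new) in plans.items():
    pct = (new - old) / old * 100
    print(f"{name}: ${old} -> ${new} (+{pct:.1f}%)")
```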

On top of that, they’ve quietly raised credit card processing fees from 2.8% to 2.95% (3.5% to 3.75% for Amex), increased the Clio Grow add-on from $49 to $59 per user, and locked more features behind expensive add-on tiers.

You’re paying up to 30% more for the privilege of having your data sold to strangers. That’s not a convenience tax. It’s a shakedown.

There’s an alternative to the SaaS treadmill: software you buy once and own forever — no recurring fees subsidizing the data broker ecosystem.


How to Protect Your Firm from Law Firm Data Brokers

1. Audit Your Cloud Footprint

Make a list of every cloud service that has access to your firm data. Read their privacy policies. Actually read them. Look for language about “sharing with partners” or “affiliated companies.”

2. Ask Your Vendor Directly

Send your cloud software provider a written request: “Please confirm whether any of our firm’s data, including usage data, account information, or metadata, has been shared with third parties, data aggregators, or marketing partners.” Watch how they respond. Or don’t.

3. Question Your Bar Association

If your state bar has a partnership with a cloud software vendor, ask them: “What due diligence was performed on this vendor’s data handling practices before recommending them to members? Has the bar reviewed whether users of this platform appear in data broker databases?”

4. Consider Local-First Software

The simplest way to prevent your data from being sold? Don’t put it in someone else’s cloud in the first place.

Software that runs locally on your machine, like TimeNet Law, keeps your data on hardware you control. There are no third-party integrations siphoning your information. No cloud servers for brokers to harvest. No employee with access to your client files from the other side of the world.

Your data stays yours because it never leaves your building.


The Bottom Line

Over 2,218 data broker emails. 57 different senders. 18 platforms being sold. 222% growth in six years. And it never, ever stops.

I’ve replied to some of these emails in frustration. I’ve reported them. I’ve flagged them. None of it matters. They just keep coming — from new names, new burner accounts, new domains. The data is out there, and once it’s out, it never comes back.

Every lawyer using cloud-based practice management software should be asking one question: Where is my data going?

Because right now, the answer is: everywhere. To anyone. For a price.

And the people who are supposed to protect you (your software vendors, your bar associations, etc.) are the ones who helped put you in this position.


Methodology note: Year-over-year email counts (2,218+ total) are actual totals from the full inbox. Platform mention counts, sender domain analysis, and other forensic breakdowns are based on a detailed analysis of 118 emails sampled from the full set.


Perry Fjellman is the developer of TimeNet Law, a desktop-native legal practice management application that keeps your data where it belongs: on your computer.

Categories
Industry Analysis

Looking for a Tabs3 Alternative? Here’s What 46 Years of Independence Gets You When PE Comes Knocking

Tabs3 isn’t broken. That’s what makes this story so tragic.

If you’re searching for a Tabs3 alternative, you probably aren’t doing it because the software crashed or support hung up on you. Tabs3 still works. The reviews are still good. Dan Berlin, who’s been CEO since 1984, is still there.

So why would you leave?

Because you’ve done the math. You’ve noticed who owns Tabs3 now. And you know how this story ends.

Let me tell you about the 46-year-old company that was supposed to be the safe choice — and what happens when private equity comes knocking.


The Company That Should Have Been Untouchable

Tabs3 was founded in 1979. That’s not a typo. They’ve been making legal billing software for 46 years — longer than most attorneys have been practicing law.

Dan Berlin became CEO in 1984 and has been running the company ever since. Forty-one years of leadership. That kind of tenure is almost unheard of in tech.

For decades, Tabs3 was exactly what small and mid-sized law firms needed: stable, reliable, built by people who understood the legal profession. Over 50,000 attorneys at nearly 10,000 law firms trusted them with their billing.

They were the safe choice. The “it’s been around forever” choice. The “they’re not going anywhere” choice.

Then private equity came calling.


December 2016: The Beginning of the End

After 37 years of independence, Tabs3 (operating as Software Technology, Inc.) was “recapitalized” by Thompson Street Capital Partners, a private equity firm based in St. Louis.

At the time, Dan Berlin said all the right things:

“We are proud of our history of providing reliable software and trusted service, and with TSCP as a partner, we couldn’t be more optimistic about our future.”

Sound familiar? Every founder says this when they take PE money. And they probably believe it — at first.

Thompson Street promised the usual: resources, expertise, “long term perspective on growing the business.” The press release talked about supporting continued growth and maintaining the company’s “well-earned reputation.”


What Actually Happened

Here’s what Thompson Street did during their ownership, according to their own announcement:

  • “Expanding direct and indirect sales channels” — more aggressive sales tactics
  • “Improving customer retention” — harder to leave
  • “Optimizing pricing” — that’s PE code for raising prices
  • “Executing the transformative add-on acquisition of CosmoLex” — buying competitors to eliminate alternatives

In October 2018, Tabs3 acquired CosmoLex, a cloud-based practice management platform. This wasn’t about innovation or serving customers better. It was about portfolio building — consolidating the market to prepare for the real endgame.


March 2021: The Flip

After four years, Thompson Street did what private equity always does: they sold.

On March 5, 2021, Thompson Street announced they had “completed the sale of Tabs3 Software to a new capital partner.”

But here’s the thing: they didn’t say who bought it.

The press release deliberately omitted the buyer’s name. It was a “stealth acquisition” — a deal so quiet that legal tech journalists had to investigate to figure out who now owned one of the oldest practice management companies in the industry.

The answer? ProfitSolv, a company created by Lightyear Capital specifically to roll up legal software brands.

By the time the dust settled, ProfitSolv owned:

  • Tabs3 (est. 1979)
  • CosmoLex (came with Tabs3)
  • TimeSolv
  • Rocket Matter
  • LexCharge
  • ImagineTime

Four competing legal billing products, all under one roof. All owned by the same PE firm.



The Portfolio Trap

If you’re a Tabs3 user and you’re unhappy, where do you go?

The illusion of choice in legal billing software is exactly that — an illusion. ProfitSolv can let these brands “compete” with each other while extracting maximum value from the entire market.

This is the consolidation playbook:

  1. Acquire multiple competing brands
  2. Keep them looking separate to maintain the illusion of choice
  3. Align pricing across the portfolio (read: raise prices together)
  4. Cut costs on development and support (shared infrastructure)
  5. Prepare for the next flip to the next PE firm

And sure enough, by late 2024, Lightyear Capital was actively shopping ProfitSolv for sale. In June 2025, they brought in FTV Capital as a co-investor. More PE hands in the pot. More pressure to extract returns.


“I’m Not Going Anywhere”

After the ProfitSolv acquisition became public, Dan Berlin told reporters: “I’m not going anywhere. I’m very excited about the partnerships and relationships we have with the other companies under the ProfitSolv umbrella.”

And maybe he means it. Maybe he’ll stay until retirement. But here’s the brutal truth:

Dan Berlin doesn’t own Tabs3 anymore. Private equity does.

Every decision — pricing, staffing, support, product direction — ultimately answers to investors whose only metric is return on capital. The same investors who are already bringing in new PE partners and preparing for the next transaction.

Dan Berlin’s 41-year tenure is a selling point, not a safeguard. When he eventually leaves, the institutional knowledge walks out with him. And PE firms don’t replace founders with founders. They replace founders with operators whose job is to squeeze.


The Reviews Tell the Story (So Far)

Here’s what makes Tabs3 different from the other ProfitSolv brands: the reviews are still good.

Tabs3 has a 4.6 rating on Capterra (187 reviews) and a 4.7 for customer service. Users praise the software’s reliability, the integration between modules, and the knowledgeable support team.

“I have been using Tabs3 for over 20 years. As a consultant, I am familiar with many of the other legal software packages, and the one I use for my business is Tabs3.”

“This is a company that understands ‘if it ain’t broke, don’t fix it’ because it works!”

“Tabs3 has flawless integration between all the modules… and their tech support is second to none!”

These aren’t reviews of a broken product. They’re reviews of a product that hasn’t been broken yet.

But compare this to the CosmoLex reviews, which show a clear decline starting in 2021 — right when ProfitSolv took over. Users explicitly say “it’s been going downhill since 2021.” The same pattern is emerging at Rocket Matter and TimeSolv.

Tabs3’s loyal team and established processes have insulated it from the worst PE effects — so far. But the playbook is the same. It’s just a matter of time.


Warning Signs

Some Tabs3 users are already noticing changes:

The maintenance fee squeeze: “Every year, the price went up. The service I received was exactly the same, but they decided to charge me more and more for it with each passing year.”

One user reported prices nearly doubling over three years, with the company refusing to lock in rates even for long-term commitments.

The upsell pressure: “System crashes and they seem to be more concerned about upselling to the better system. Why would anyone spend more for the better system with a company that can’t get the basic system to work?”

The learning curve excuse: Multiple reviews mention that Tabs3 is complex and difficult to learn. For 46-year-old software that’s had decades to improve its interface, “steep learning curve” shouldn’t still be a top complaint — unless development resources are going elsewhere.

These aren’t catastrophic failures. They’re early indicators. The foundation is cracking before the house comes down.


The Tragedy of Tabs3

Here’s what makes this story different from CosmoLex, Rocket Matter, or TimeSolv:

Tabs3 was supposed to be the exception.

They had 37 years of independence. A founder who’d been there since 1984. A reputation built on reliability and trust. They served small and mid-sized firms — exactly the attorneys who can’t afford to gamble on unstable software.

And they gave it all away.

When Thompson Street came calling in 2016, Software Technology, Inc. had options. They could have stayed independent. They could have said no to PE money that came with strings attached. They could have protected the legacy they’d spent four decades building.

Instead, they took the check. Four years later, they got flipped. Now they’re just another brand in a portfolio optimized for investor returns, not attorney outcomes.

The 46-year legacy? It’s a marketing asset now. Something to put in press releases while new ownership “optimizes pricing” and “expands sales channels.”


What Tabs3 Could Have Been

Imagine if Tabs3 had stayed independent.

After 46 years, they’d be the gold standard for legal billing software. The company that proved you don’t need venture capital or private equity to build something that lasts. The company where the founder’s successor would be chosen for their commitment to the mission, not their ability to hit quarterly targets.

They could have been proof that the old way of building software — slowly, carefully, in service of customers rather than investors — still works.

Instead, they’re a cautionary tale.


⚡ 60-Second Firm Hack: The “Maintenance Audit” That Saves Thousands

Most firms pay maintenance or subscription fees automatically without ever reviewing what they’re getting. Once a year, run a Maintenance Audit:

  1. List every software subscription your firm pays
  2. Note the original price when you signed up vs. current price
  3. Calculate the percentage increase year-over-year
  4. For each one, ask: “What new value did we get to justify this increase?”

If prices have climbed 50-100% while features stayed flat, you’re funding someone’s PE returns, not your firm’s growth. That’s your signal to explore alternatives that let you own your software instead of renting it — while you still have leverage.

Set a calendar reminder: first week of January, every year. Twenty minutes that can save thousands.
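Step 3 is the only math involved, and it takes seconds to automate. A minimal sketch with hypothetical subscriptions and prices — your own signup and current figures go in the dictionary:

```python
# Hypothetical subscription data: (signup price, current price) in USD per month
subscriptions = {
    "Practice management": (49, 89),
    "Document storage":    (10, 12),
    "E-signature":         (15, 32),
}

THRESHOLD = 50  # flag anything that has climbed 50% or more since signup

for name, (signup, current) in subscriptions.items():
    increase = (current - signup) / signup * 100
    flag = "REVIEW" if increase >= THRESHOLD else "ok"
    print(f"{name}: +{increase:.0f}% [{flag}]")
```

Anything tagged REVIEW is a candidate for step 4: ask what new value justified the increase.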


The Alternative Exists

There are still independent legal billing companies out there. Companies that haven’t taken PE money. Companies where the founder still owns the business and answers to customers, not investors.

TimeNet Law is one of them.

We’ve been building legal billing software for over 20 years. No PE ownership. No investor board meetings. No preparation for “exit.” When you call us, you might reach the person who wrote the code — because we’re a team that’s chosen to stay small, stay independent, and stay focused on what actually matters: software that works and support that helps.

We’re not trying to consolidate the market. We’re not optimizing pricing. We’re not preparing to flip the company to the next buyer.

We’re just building good software for attorneys who want stability, transparency, and a partner they can trust for the long haul.

The kind of company Tabs3 used to be.


Making the Switch

If you’re considering leaving Tabs3 — whether now or as a contingency plan — here’s what to expect:

Data migration: We’ve helped firms migrate from Tabs3, CosmoLex, TimeSolv, and other platforms. We know the data formats, the quirks, and how to make the transition as smooth as possible.

Learning curve: Our software is designed to be intuitive from day one. You shouldn’t need weeks of training to enter time and generate bills.

Support: Real people, real answers, same day. No ticket numbers, no “we’ll get back to you in 3-5 business days.” We actually pick up the phone.

Pricing: Transparent, stable, and no surprises. We don’t “optimize pricing” because we’re not trying to impress investors.

No lock-in: We don’t need long-term contracts to keep you around. If our software stops earning your business, you should be free to leave.


The Real Question

Tabs3 still works today. Dan Berlin is still at the helm. The support team is still helpful.

But in five years? Ten years?

Private equity doesn’t invest in 46-year-old software companies to keep them running the same way for another 46 years. They invest to extract value and exit.

The question isn’t whether Tabs3 will change. It’s when.

And when it does, will you have a backup plan? Will your data be portable? Will you have an alternative lined up that doesn’t lead right back to the same PE portfolio?

The time to answer those questions is now — while you still have options.


Ready to See What Independence Looks Like?

Curious about legal billing software that’s still built the old-fashioned way? No PE ownership, no portfolio games, no preparation for the next transaction?

We’ll show you exactly what you’re getting — and what you’re not getting. No pressure, no sales theatrics.

Schedule a Demo →

Or join Off the Record, our private newsletter for attorneys who want the inside scoop on what’s really happening in legal tech — without the corporate spin.

Subscribe to Off the Record →


Tabs3 was built over 46 years by people who cared. Don’t let private equity make those years mean nothing. Reach out — we’re real people who actually respond, the same day, every time.

Categories
Legal Tech & AI

A Federal Judge Just Ruled Your AI Research Isn’t Privileged. Here’s What That Means for Every Law Firm in America.

Two days ago, Judge Rakoff granted a motion that should make every attorney using cloud-based legal AI very uncomfortable. The reasoning is straightforward. The implications are enormous.

On February 10, 2026, Judge Jed Rakoff of the Southern District of New York ruled that documents a defendant generated through Claude (Anthropic’s AI) are not protected by attorney-client privilege or work product doctrine.

The case is United States v. Heppner, 25 Cr. 503 (SDNY) (full docket on CourtListener). The ruling was on the government’s motion to compel. And the logic applies far beyond this one case.

Let me explain why this matters to you.


The Government’s Argument (Which Won)

The DOJ’s motion was surgical. Four independent grounds, any one of which was sufficient:

1. The AI is not an attorney.
No privilege attaches to communications with a non-attorney third party. Claude is a commercial product, not legal counsel. There is no attorney-client relationship. This one’s obvious.

2. No expectation of confidentiality.
This is where it gets interesting. The government cited Anthropic’s privacy policy, which permits:

  • Collection of prompts and outputs
  • Use for model training
  • Disclosure to governmental authorities

The defendant voluntarily shared information with a platform whose own terms allow government access. You can’t claim confidentiality when the vendor’s ToS explicitly permits disclosure.

3. Retroactive privilege doesn’t work.
The defendant tried to argue that sharing the AI outputs with his attorney made them privileged. Judge Rakoff wasn’t having it. Pre-existing, non-privileged materials don’t become privileged just because you hand them to your lawyer later. This is Privilege 101.

4. Work product requires attorney direction.
The defendant created these documents on his own initiative, not at counsel’s direction. The work product doctrine protects materials prepared by or for a party’s attorney. It doesn’t protect a layperson’s independent research.

Four arguments. Four wins. Motion granted.


“But That Was a Criminal Defendant Using Consumer AI”

Yes. And that’s what makes this ruling dangerous, not limited.

The privilege analysis doesn’t turn on who’s typing. It turns on the architecture.

Read the government’s brief again. The confidentiality argument was based on Anthropic’s privacy policy. Not the defendant’s status. Not the nature of the queries. The vendor’s terms.

Those terms don’t change when an attorney does the typing. Claude’s privacy policy is the same whether you’re a criminal defendant or a senior partner at a white shoe firm.

If your legal AI tool runs through a cloud service whose terms permit data collection, training, or disclosure, you have the same confidentiality problem. The keyboard operator doesn’t matter. The vendor’s policies do.


The Architecture Problem Nobody Wants to Discuss

Here’s the part the legal AI vendors don’t want you thinking about.

Most legal AI tools operate as cloud services. Your prompts go to their servers. Their models process your queries. Your client’s information passes through infrastructure you don’t control, governed by terms you probably haven’t read carefully.

Go read your legal AI vendor’s privacy policy right now. (I’ll wait.)

Look for these phrases:

  • “may use data to improve our services”
  • “may disclose information in response to legal process”
  • “may share data with service providers and affiliates”

Found them? Congratulations. You’ve just identified why a sufficiently motivated opposing counsel could make a very uncomfortable argument about your AI-assisted work product.
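If you want to triage a policy quickly before reading it line by line, a few lines of code can flag the phrases above. This is an illustrative sketch only — the pattern list and the sample text are hypothetical, not any vendor’s actual terms, and a hit here is a prompt to read the surrounding clause, not a legal conclusion:

```python
import re

# Risk phrases from the checklist above, as loose regex patterns.
# These are illustrative; tune them to the policy you're reading.
RISK_PATTERNS = [
    r"improve our services",
    r"disclose .* legal process",
    r"share .* (service providers|affiliates)",
]

def flag_risky_language(policy_text: str) -> list[str]:
    """Return the risk patterns that appear in the policy text."""
    text = policy_text.lower()
    return [p for p in RISK_PATTERNS if re.search(p, text)]

# Hypothetical excerpt, echoing the standard ToS language quoted above.
sample = (
    "We may use data to improve our services and may disclose "
    "information in response to legal process."
)
print(flag_risky_language(sample))
```

Every pattern that prints is a clause worth reading in full — and worth imagining in opposing counsel’s motion to compel.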

Judge Rakoff didn’t create new law. He applied existing privilege principles to a new technology. And those principles don’t care whether the AI has a legal-specific marketing team.


What the Reddit Lawyers Are Saying

This case hit r/law and r/lawyers hard. The analysis in those threads is worth reading.

One commenter nailed the architectural point:

“Heppner is not really an ‘AI case.’ It is an architecture case. Judge Rakoff did not create a new anti-AI rule. He applied very traditional privilege principles… If you feed litigation strategy into a remote service whose own policy permits retention, training use, or disclosure, you are going to have a hard time arguing reasonable expectation of privacy.”

Another pointed out the discovery implications:

“Every single discovery request should be seeking non-privileged AI usage.”

And perhaps most concerning:

“Even in single-tenant deployments, if the vendor continues to manage the data and has AWS KMS access, a substantially motivated attorney could win the compulsion.”

These aren’t legal tech skeptics. These are practicing attorneys working through the implications in real time.


Two Architectures. Two Very Different Privilege Analyses.

Architecture A: Cloud-First Legal AI

  • Your data travels to vendor servers
  • Vendor ToS permits data collection, training, disclosure
  • No expectation of confidentiality (per Heppner analysis)
  • Potentially discoverable

Architecture B: Local-First Legal Software

  • Your data stays on your hardware
  • No third-party vendor with disclosure rights
  • No ToS permitting training or government access
  • You control storage, access, and retention

The Heppner ruling analyzed Architecture A and found no privilege protection. Architecture B was never at issue because there was no third party to analyze.
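The distinction between the two architectures can be reduced to two questions: does your data leave your hardware, and if so, what do the vendor’s terms permit? A minimal sketch, using my own shorthand field names (this is a mnemonic, not a legal test):

```python
from dataclasses import dataclass

@dataclass
class VendorProfile:
    # Does client data travel to servers you don't control?
    data_leaves_your_hardware: bool
    # Do the vendor's terms permit training, retention, or disclosure?
    tos_permits_training_or_disclosure: bool

def privilege_posture(v: VendorProfile) -> str:
    """Rough classification following the two-architecture framing above."""
    if not v.data_leaves_your_hardware:
        return "Architecture B: no third party for a court to analyze"
    if v.tos_permits_training_or_disclosure:
        return "Architecture A: confidentiality argument is weak"
    return "Cloud service: read the vendor's terms carefully"

print(privilege_posture(VendorProfile(True, True)))
print(privilege_posture(VendorProfile(False, False)))
```

The point of the sketch: the first question is decided by architecture, before anyone ever reads a ToS.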

This isn’t a new argument. It’s just that most of the industry ignored it in the rush to ship cloud-based AI features. Now there’s case law.

Related: See the full breakdown: A Federal Judge Just Made Your Cloud Legal AI Discoverable


The Question Your Clients Will Eventually Ask

Here’s the scenario that should keep legal AI vendors up at night:

A sophisticated corporate client reads about Heppner. They call their outside counsel. They ask a simple question:

“What cloud services touch our privileged communications? And what do those vendors’ terms say about data retention and disclosure?”

If your practice management software, your document automation, your AI research tools, your time tracking… if any of it runs through cloud services with standard vendor ToS, you now have an uncomfortable conversation ahead.

“We use industry-standard security” isn’t going to cut it. The question isn’t about security. It’s about contractual rights to your data.


60-Second Firm Hack

This week’s challenge: Read your legal AI vendor’s privacy policy. The whole thing. Look specifically for language about data collection, model training, and disclosure to authorities. Then ask yourself: if opposing counsel cited this policy in a motion to compel, how would you respond?

If you don’t like the answer, that’s useful information.


Off the Record

TimeNet Law was designed as Mac-native, local-first software from day one. Not because we predicted Heppner. Because we believed attorneys should control their own data.

Your billing records, your client communications, your matter information… it lives on your hardware, governed by your policies, accessible only to you.

No cloud vendor ToS. No data collection for training. No disclosure provisions to worry about.

When Judge Rakoff analyzed the privilege question in Heppner, he examined a cloud service’s terms and found no confidentiality protection. That analysis simply doesn’t apply to software that never sends your data to a third party.

This wasn’t a marketing decision. It was an architecture decision. And architecture, it turns out, has legal consequences.

See also: Privacy Fortress: How Local-First Architecture Protects Your Data

See how local-first practice management works →


“The best time to think about data architecture was before you had client data. The second best time is now.”
— Perry, Founder, TimeNet Law

Categories
TimeNet Law

What Happens When You Call TimeNet Law Support

When it comes to legal software support, most lawyers have been trained to expect nothing. Ticket queues. Canned responses. Days of waiting. Let me tell you about a Sunday night that shows how legal software support should actually work. — Perry, Founder & Developer


It’s 9pm. I’m at home. My phone rings.

An attorney. Panicked. Their entire database had vanished. Years of client records, invoices, billing history — gone. Poof. And they had court the next morning.

Imagine that moment. Imagine having to tell a judge, “Sorry, Your Honor, the computer ate my homework.”

I didn’t send them to a ticket queue. I didn’t tell them to wait until Monday for business hours. I didn’t transfer them to a Level 1 support rep reading from a script.

I took the call. We screen-shared. I walked them through restoring everything from their backup. By the end of the night, their data was back, their case was organized, and they walked into court the next morning like nothing happened.

That’s what happens when you call TimeNet Law support.


Legal Software Support That Ships Features by Lunch

A few months ago, an attorney called with a specific problem. They needed to discount a particular client in a very specific way — something the software didn’t do yet.

At most companies, this is where you’d hear: “Thanks for the feedback! We’ll add it to our feature request backlog and the product team will review it in a future sprint.”

Translation: Never gonna happen.

Here’s what I said: “Let me see what I can do.”

I hung up. Built the feature. Tested it. Shipped it.

They had it on their computer by lunch.

Not in the next quarterly release. Not in version 12.4. Not “coming soon.” That same day. Before their sandwich.


No System Is Perfect

I’m not going to pretend TimeNet Law never has bugs. Every piece of software does. The question isn’t whether problems happen — it’s what happens when they happen.

Recently, a lawyer hit a serious but obscure bug that stopped them from generating invoices at a critical moment. Billing day. Clients waiting. Cash flow on the line.

They reached out. I investigated. Found the cause. Built a fix.

They had it within the hour.

Not a workaround. Not a “we’re aware of the issue.” A fix. Deployed. Done.

You don’t buy software expecting perfection. You buy software expecting a solution when something goes wrong. And when that solution is “the guy who wrote the code is personally fixing it right now,” that’s a different level of confidence.


What You’re Actually Buying

When you sign up for TimeNet Law, you’re not buying a license to use software owned by a private equity firm that sees you as recurring revenue. You’re buying software you actually own.

You’re buying a relationship.

You’re buying direct access to the person who built every feature, who knows every line of code, who actually wants to hear what’s not working and fix it immediately.

You’re buying 9pm Sunday phone calls. Lunch-time feature releases. Hour-long bug fixes.

You’re buying the thing that big software companies literally cannot offer — because their developers are ten layers removed from customers, and their support teams are measured on ticket closure rates, not on whether your problem actually got solved.


Try This Level of Legal Software Support With Clio

Next time your billing software has an issue, try calling the person who wrote it.

See how far you get.

At Clio, you’ll navigate a support portal, wait for a response, maybe get escalated if you’re lucky, and almost certainly hear some version of “we’ll pass this along to the development team.”

At TimeSolv — now owned by the same private equity firm that owns Rocket Matter, CosmoLex, and Tabs3 — you’ll get the same runaround. Different brand name, same faceless support experience.

At TimeNet Law, you’ll get me.

That’s not a marketing line. That’s my actual phone number. My actual email. My actual voice on the other end when you call.


This Is What I Love

I know it sounds strange, but customer support is one of my favorite parts of running TimeNet Law.

There’s a rush you get when you solve someone’s problem. When you hear the relief in their voice. When you turn a panic moment into a “wow, that was easy” moment.

That feeling doesn’t happen when you’re managing a ticket queue from a distance. It happens when you’re in the trenches with your customers, treating their emergencies like your emergencies.

Twenty-plus years in, I still get that rush. I still love picking up the phone. I still love shipping a fix and hearing someone say, “Wait, it’s already done?”

That’s the difference between software built by someone who cares and software owned by someone who cares about returns.


— Perry

Founder & Developer, TimeNet Law

Yes, I really answer my own phone: (541) 261-9785

And my own emails: [email protected]


P.S. — Stuck with support that treats you like a ticket number? Let’s talk. Experience real legal software support. I promise you’ll actually reach a human. Specifically, the human who built the thing.

Learn more about what makes TimeNet Law different: Privacy Fortress | Why Independence Matters

Categories
Industry Analysis

Looking for a Rocket Matter Alternative? Your Firm Needs a Seatbelt, Not a Spaceship

If you’re searching for a Rocket Matter alternative, you’ve probably figured out an uncomfortable truth about legal billing software: the flashier the marketing, the shakier the foundation.

The name sounds exciting. Rocket ships. Blasting off. Growth to the moon. But when it comes to running your law firm’s finances, you don’t need a rocket ship. You need something that actually works – reliably, predictably, every single day.

You need a seatbelt. Not something flashy. Something that protects you when things get rough.

If you’re Googling for a Rocket Matter alternative, let’s talk about what’s really going on – and what to look for instead.


What’s Actually Behind Rocket Matter

In 2020, Rocket Matter was acquired by Lightyear Capital, a New York-based private equity firm. They rolled it into a new holding company called ProfitSolv – a name that tells you exactly where their priorities lie.

ProfitSolv doesn’t just own Rocket Matter. They also own:

  • TimeSolv
  • CosmoLex
  • Tabs3

And in June 2025, they brought in additional investment from FTV Capital to fuel even more acquisitions.

This is the playbook: Buy up legal billing platforms. Consolidate. Cut costs. Raise prices. Squeeze maximum profit from each customer.

When a company named “ProfitSolv” owns your billing software, who do you think they’re solving problems for – you, or their investors?


What Users Are Actually Experiencing

Rocket Matter has a 4.4 rating on Capterra – respectable on the surface. But dig into the reviews and patterns emerge:

Platform Instability

Multiple users report the same frustration: the system goes down, freezes, or crawls.

“The connectivity to Rocket Matter via web can sometimes be down. I believe it is a server issue.”

“Freezes and gets slowed down often.”

For software that handles your billing – the literal engine of your revenue – “sometimes down” isn’t acceptable. You need rock-solid reliability, not crossed fingers. Consider software that runs entirely on your machine — so when their servers have a bad day, your firm keeps working.

The Lost Billing Nightmare

One reviewer shared a horror story that should make every attorney wince:

“We had an issue with losing billing one month and the backup was corrupted. We had to work round the clock to recreate the billing to get paid that month.”

Read that again. A law firm lost a month of billing data. The backup was corrupted. They had to manually recreate everything just to get paid.

Your billing software is supposed to prevent catastrophes like this – not cause them.

Limited and Inflexible Reporting

Attorneys depend on accurate reports for everything: tracking billable hours, measuring profitability, bonus calculations, client billing. Here’s what users say about Rocket Matter’s reporting:

“Limited capabilities when it comes to billing and reporting; no flexibility or customization.”

“The limitations of the reporting – the names of the reports which makes it difficult to find a report – who starts off reporting with ‘I want to’ – it feels very juvenile.”

“Certain reports could use improvement to make use of the information more user friendly.”

When your reporting feels “juvenile,” that’s a problem. Your billing software should feel like it was built by people who understand how law firms actually work.

Support That Disappears When You Need It

What happens when something goes wrong? One managing attorney’s experience:

“We had serious issues for over one month while RM engineering was ‘investigating’ the problem. Our assigned customer service representative was rude and refused to take any responsibility on behalf of RM.”

A month of “investigating” while your firm deals with broken software. Customer service that won’t take responsibility. This is what happens when support becomes a cost center to minimize rather than a feature to invest in.

Per-User Pricing That Adds Up Fast

Rocket Matter charges $39 to $129 per user per month, depending on which tier you need. That adds up quickly:

  • 3 users on the Premier plan: $267/month ($3,204/year)
  • 5 users on Elite: $425/month ($5,100/year)
  • 10 users on Premier: $890/month ($10,680/year)
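The arithmetic behind those figures is simple per-user math. A quick sketch, using the per-user rates implied by the numbers above ($89/user on Premier, $85/user on Elite — inferred from the text, so verify against current published pricing):

```python
def monthly_cost(users: int, per_user_rate: int) -> int:
    """Monthly subscription cost for a firm of this size."""
    return users * per_user_rate

def annual_cost(users: int, per_user_rate: int) -> int:
    """Yearly cost: monthly bill times twelve."""
    return monthly_cost(users, per_user_rate) * 12

print(monthly_cost(3, 89), annual_cost(3, 89))    # Premier, 3 users
print(monthly_cost(5, 85), annual_cost(5, 85))    # Elite, 5 users
print(monthly_cost(10, 89), annual_cost(10, 89))  # Premier, 10 users
```

Run it for your own headcount — then run it again with a 10% annual increase and see what year five looks like.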

And here’s the thing about PE-owned software: those prices tend to go up over time. Investors expect returns. Your subscription is how they get them. It’s why more attorneys are looking to own their software with a one-time purchase instead.


The Private Equity Playbook (Again)

If you’ve read our posts about TimeSolv, CosmoLex, or other billing platforms, this will sound familiar. It’s the same playbook, run by the same company:

  1. Acquire multiple competing products
  2. Consolidate operations to cut costs
  3. Raise prices gradually (you’re already locked in)
  4. Reduce support (it’s expensive and doesn’t help the bottom line)
  5. Prepare for exit (sell to an even larger PE firm, or go public)

At no point in this playbook does “build the best possible product for solo attorneys” appear as a priority. You’re not the customer they’re optimizing for. You’re the revenue stream.


What Rocket Matter’s Privacy Policy Reveals

We read the fine print. It’s as outdated as you’d expect from a PE-owned company.

A Privacy Policy From 2020

Rocket Matter’s privacy policy was last updated in January 2020 – six years ago.

Since then:

  • CCPA became enforceable
  • AI transformed the tech landscape
  • Data privacy expectations evolved dramatically
  • They were acquired by ProfitSolv

Six years. No updates. What are they actually doing with your data now? The policy doesn’t say.

The ProfitSolv Data Pipeline

Their policy allows sharing data with “affiliated entities.”

Rocket Matter, TimeSolv, CosmoLex, and Tabs3 are all owned by ProfitSolv. “Affiliated entities” means all of them – plus whatever company ProfitSolv acquires next.

Your data doesn’t just live in Rocket Matter. It potentially flows across the entire ProfitSolv portfolio.

Sharing With “Business Partners”

The policy mentions sharing with advertising networks and “business partners” for marketing purposes.

Your confidential client data, handled by a company that shares information with advertising networks. Think about that.

What This Means For Your Practice

When your billing software’s privacy policy is six years old and mentions sharing with ad networks, you have to ask: What protections actually exist for your client data?

Rule 1.6 doesn’t have an exception for vendors who forgot to update their privacy policy. Your ethical obligations apply regardless of whose server holds your data.


What to Look for in a Rocket Matter Alternative

If you’re ready to switch, here’s what actually matters when evaluating any Rocket Matter alternative:

Stability Over Flash

Your law firm doesn’t need a rocket ship. It needs software that works – every time, without drama. No outages during billing deadlines. No “investigating” for weeks while your practice suffers.

Boring reliability beats exciting instability every single day.

Transparent, Predictable Pricing

You should know exactly what you’ll pay – and trust that number won’t mysteriously climb because some PE firm needs to hit quarterly targets.

No “contact sales” games. No surprise fees. No strategic price adjustments.

Support From People Who Actually Care

When you have a problem, you should talk to someone who can solve it – ideally someone who actually built the software. Not a ticket system that takes a month to investigate obvious problems.

Reports That Make Sense

Your reporting should be powerful, flexible, and intuitive – not “juvenile.” When you need to know your firm’s financial health, the answers should be one click away.

Independence

The best predictor of how software will treat you in five years is who owns it today. Independent companies don’t have PE investors demanding 20% growth at any cost. They can focus on building great products.


The TimeNet Law Approach: A Different Kind of Rocket Matter Alternative

We’ve been building legal billing software for over 20 years. While everyone else was chasing rocket ships and billion-dollar valuations, we focused on something simpler: software that works.

No private equity. TimeNet Law is independently owned. We don’t have investors demanding we squeeze more revenue from every customer. We don’t have board meetings about “strategic price optimization.”

Stability you can count on. Our platform doesn’t go down. Your data doesn’t disappear. When billing day comes, everything works exactly like it should. No drama. No crossed fingers.

Direct access to people who build the product. When you call support, you might talk to someone who actually wrote the code. We’re small enough to know our customers and responsive enough to actually help – the same day, not a month later.

Transparent pricing. Our prices are on our website. What you see is what you pay. No surprise increases, no nickel-and-diming, no “we’re adjusting rates to better serve you” emails.

Your data, always accessible. Full export capabilities. If you ever want to leave, your data comes with you. We don’t believe in holding firms hostage.

We’re not flashy. We don’t have a name that promises intergalactic growth. What we have is two decades of quietly serving attorneys who just want billing software that works.


60-Second Firm Hack: The “Split the Difference” Negotiation Close

Next time a client pushes back on your quoted fee, try this: Quote slightly higher than your target, let them counter, then “split the difference” to land exactly where you wanted.

Example: You want $5,000. Quote $5,500. They counter with $4,500. You graciously offer to meet in the middle at $5,000.

They feel like they won. You got your number. Everyone’s happy.

Works for flat fees, settlement negotiations, and that raise you’ve been meaning to ask for.


Making the Switch: Easier Than You Think

We know switching billing software feels risky. You’ve got years of data, established workflows, and a team that’s finally figured out the current system (even if it frustrates them).

Here’s how we make it painless:

Data migration support. We’ll help you move your clients, matters, and billing history. Our team has done this hundreds of times – including many Rocket Matter migrations.

Real training. Not a webinar and good luck. Actual onboarding with people who understand how attorneys work.

No long-term contracts. If TimeNet Law isn’t right for you, you’re not trapped. We’d rather earn your business every month than lock you in.

Go at your own pace. Run both systems in parallel if that makes you comfortable. We’ll support whatever works for your firm.


The Bottom Line

Rocket Matter sounds exciting. Rockets! Blasting off! Growth! But your law firm’s billing isn’t a moonshot – it’s the foundation everything else depends on.

You need software that’s stable. Reliable. Predictable. Built by people who care more about serving attorneys than serving investors.

Before you switch to another platform, ask the hard questions: Who owns this company? What are their incentives? Will they still be independent in five years, or will they be another line item in a PE portfolio?

Your billing software should be a seatbelt – something that protects you, every day, without you having to think about it. Not a rocket ship that might explode on the launchpad.


Ready to See the Difference?

Curious what TimeNet Law looks like in action? We’ll give you a real demo – not a pitch deck with hockey stick projections – and answer every question you have.

Schedule a Demo

Or, if you just want to keep learning without any pressure, join Off the Record – our private newsletter with the must-have tips, secrets, and news every attorney needs to know. No sales pitches. Just value.

Subscribe to Off the Record


Thinking about making the switch? We’ve helped hundreds of firms migrate from platforms that promised them the moon. Reach out – we’re real people who actually respond, the same day, every time.

Categories
Industry Analysis Legal Tech & AI Privacy & Security

AI and Your Client Data: What Every Attorney Needs to Know After Anthropic’s Legal Plugin Launch

AI client confidentiality just became the most important issue in legal tech.

The Earthquake

Something just happened that made Thomson Reuters lose 15% of its stock value in a single day. LexisNexis’s parent company dropped 14%. DocuSign fell 11%.

Wall Street is calling it the “SaaSpocalypse.”

And what caused all of this? A company called Anthropic released a free plugin.

If that sentence confuses you — how does a free plugin crash the stock market? — you’re not alone. Let me explain what’s actually happening, what it means for your practice, and why your client data is at the center of all of it.

First, Let’s Get Our Terms Straight

Anthropic is the company that makes Claude, one of the leading AI systems (think: ChatGPT’s main competitor).

Claude Cowork is their new tool that lets AI actually do work on your computer — not just chat with you, but read your files, edit documents, and complete multi-step tasks.

The legal plugin is an add-on that turns Cowork into a legal workflow machine: contract review, NDA triage, compliance checks, and more.

Here’s the key part: you give it access to folders on your computer, and it reads and edits files in those folders.

Including your client files.

WHAT This Actually Does

Imagine hiring a paralegal who:

  • Reviews contracts against your firm’s playbook, flagging clauses as green (fine), yellow (watch this), or red (problem)
  • Sorts incoming NDAs into three piles: auto-approve, needs quick review, needs full review
  • Generates briefings on legal topics in minutes
  • Creates templated responses for discovery holds and data requests

That’s what this plugin does. You point it at your contract folder, tell it your firm’s preferences, and it goes to work.

The kicker? It’s free and open-source. Anyone can use it. Anyone can customize it.

WHY Wall Street Panicked

Here’s the business story, explained simply.

For years, legal tech companies have followed the same playbook:

  1. License AI technology from Anthropic or OpenAI
  2. Wrap it in legal-specific features
  3. Charge law firms $500-2,000 per month

Think of it like a restaurant. Anthropic grows the vegetables (the AI). Legal tech companies buy those vegetables, cook them into meals (legal products), and sell them to you at restaurant prices.

Last week, the vegetable farmer opened their own restaurant. And they’re giving away the food for free.

That’s why stocks crashed. Every legal tech company built on Anthropic’s technology just discovered that their supplier is now their competitor. The “wrapper + workflow” business model — which described most legal AI startups — suddenly looks vulnerable.

As one analyst put it: “For the first time, a foundation-model company is packaging a legal workflow product directly into its platform, rather than merely supplying an API to legal-tech vendors.”

Translation: The company that makes the engine just started selling complete cars.

HOW This Changes Your Practice

Let’s be honest about what’s coming:

The Good

  • Lower barriers to AI adoption. Solo practitioners and small firms can now access enterprise-level contract review without enterprise-level budgets.
  • More competition = better tools. Legal tech companies will have to compete on actual value, not just “we have AI.”
  • Customization. Because it’s open-source, tech-savvy firms can tailor it to their exact workflows.

The Concerning

  • Your files, their servers. When you give Cowork access to a folder, it reads those files. The AI processes that content. Where does that data go?
  • Security researchers have already found vulnerabilities. One team demonstrated how a malicious document could trick Cowork into uploading your files to an attacker’s account — without your approval.
  • It’s a “research preview.” Anthropic’s own warning: “Cowork is a research preview with unique risks due to its agentic nature and internet access.”

The Reality Check

Early reviews from attorneys who’ve tested it? Mixed at best. One legal tech columnist reported: “To the extent I’ve been able to put it through its paces, the results have been… underwhelming.”

Another reviewer on social media showed it confidently producing incorrect contract analysis. The consensus: impressive demo, not ready for real client work.

AI Client Confidentiality: The Question Nobody’s Asking

Here’s what keeps me up at night:

When you use these tools, where does your client’s confidential information actually go?

With Cowork, your documents are processed by AI running on Anthropic’s infrastructure. The tool “runs on your computer” but executes work in a “virtual machine environment” — which means your data travels. For attorneys serious about confidentiality, software that works entirely on your own machine isn’t just a preference — it’s a safeguard.

Now consider:

  • ABA Model Rule 1.6 requires “reasonable efforts to prevent the inadvertent or unauthorized disclosure” of client information.
  • What constitutes “reasonable efforts” when using AI tools that security researchers have already shown can be exploited?
  • Have you read the terms of service? Do you know if your client data can be used to train future AI models?

The legal industry is racing to adopt AI. The ethics rules haven’t caught up. And the first major AI-related malpractice case hasn’t happened yet.

Don’t be the test case.

WHEN Does This Get Real?

My honest timeline:

Right now (2026): Early adopters experimenting. Most firms watching. Technology impressive but unreliable for critical work.

12-18 months: The bugs get worked out. Major legal tech vendors respond with better offerings or competitive pricing. Clearer guidance emerges on ethics compliance.

2-3 years: AI-assisted document review becomes standard practice for routine matters. Firms that haven’t adapted start losing competitive bids.

5+ years: The practice of law looks fundamentally different. The question isn’t whether to use AI, but which AI and how.

But here’s the thing: you don’t have to be first. In fact, when it comes to AI client confidentiality, being first carries real risk.

What You Should Do Today

1. Audit Your Current AI Use

Are associates using ChatGPT or Claude for research? Have they uploaded client documents? Most firms have “shadow AI” usage they don’t even know about.

2. Establish Clear Policies

Before anyone in your firm uses AI tools on client matters, answer these questions:

  • Which tools are approved?
  • What data can be input?
  • Do clients need to consent?
  • How do we document AI usage?

3. Get Informed Consent

Consider updating engagement letters to address AI tool usage. “We may use AI-assisted tools for [specific purposes]. These tools process information on third-party servers. Do you consent?”

4. Prioritize Local-First Solutions for AI Client Confidentiality

When evaluating legal tech, ask: “Where does my data go?”

Tools that keep data on your own systems — rather than sending everything to the cloud — eliminate an entire category of risk. The efficiency gains of AI don’t require sacrificing control over client information. Better yet, consider a one-time purchase alternative — so your practice isn’t dependent on yet another subscription that could change its terms overnight.

5. Audit Your Billing Software’s Privacy Policies

There’s a lot of pretty scary stuff lurking in most privacy policies these days. You should know what you’re agreeing to.

6. Watch, Don’t Jump

Let the early adopters find the landmines. In 12-18 months, we’ll know which tools actually work, which vendors survive, and what the ethics guidance looks like.

The Bottom Line

Anthropic’s legal plugin is a genuine inflection point. The “SaaSpocalypse” isn’t hype — the business model for legal AI is changing in real time.

But amid all the excitement about efficiency and disruption, one question matters more than any other:

When you process a client’s confidential merger documents through AI, do you know — really know — where that data goes, who can access it, and whether it’s being used to train systems that might surface that information elsewhere?

If you can’t answer that question with certainty, you’re not ready.

The future of legal AI is coming. Make sure you can protect AI client confidentiality when it arrives.


Questions about AI client confidentiality? Want to discuss how to implement AI tools while maintaining data security? Get in touch — these conversations matter.