Anthropic recently released a legal plugin for Claude that handles contract review, NDA triage, and compliance workflows. The day it dropped, Thomson Reuters fell 16%, LegalZoom crashed 20%, and Wolters Kluwer lost 13%. Wall Street noticed. As a guy who spent 30 years practicing law, so did I.
I’ve thought for a long time that transactional law is the area most likely to get disrupted by AI. Contracts follow patterns. They use known language.
The donkey work of reviewing a standard NDA or employment agreement is exactly the kind of thing Claude and other LLMs are good at. Feed it a contract, ask it to flag the problems. It does a surprisingly decent job.
I know there are things a wily attorney picks up that AI just isn’t sophisticated enough to catch. The weird clause buried on page 12 that changes the entire deal. The missing indemnification language that only matters if things go sideways. The stuff you learn to spot after you’ve been burned by it once. AI doesn’t have scar tissue. Lawyers do.
But the routine stuff? Absolutely. Let AI handle first-pass review. Let it draft the boilerplate. Let it compare versions and catch what changed.
That’s real, useful work that used to cost clients hundreds of dollars an hour. AI doing it faster and cheaper is a good thing.
The danger is when people skip the lawyer entirely.
I can already see the lawsuits forming. Someone uses an AI tool to draft a partnership agreement. It looks professional. It reads like a real contract. They sign it.
Six months later they discover the AI missed something critical, or included language that means something different from what they thought. Now they're in trouble.
And here’s the part that keeps me up at night. If your attorney makes that mistake, you have recourse. Legal malpractice exists for a reason. There’s insurance. There’s accountability.
But if your AI-drafted contract has a fatal flaw, where do you go for relief? Who do you sue? The chatbot? Good luck with that.
We’re heading into a period where people are going to trust AI contracts the way they trust Google searches. Confidently and without much thought.
Some of those people are going to get hurt. Not because the technology is bad, but because they treated it like a lawyer when it’s really just a very fast research assistant.
Use AI for contract review. I do. But treat it like a first draft, not a final opinion. The donkey work is AI's job now. The thinking is still yours (and your attorney's).