For two years, firms trying to adopt AI faced the same problem: people weren't using the tools. Adoption rates sat at 20–30%, not the 80%+ needed for real impact. Firms would spend thousands on tools and thousands more on training, then watch staff drift back to their old ways.

By 2026, that's changed. The firms that figured out adoption are seeing 60–80% usage. Here's what actually works.

What Didn't Work (And Why It Failed)

Top-Down Mandates

"Everyone will use AI starting Monday." This failed because it generates resistance. Knowledge workers don't like being told to change how they work. They push back passively—they try the tool once, decide it's not for them, and revert to their old process.

Generic Training

"Here's how to use ChatGPT." This failed because it wasn't relevant to their work. A consultant learning ChatGPT in theory doesn't see how it helps with their actual client deliverables. Generic training feels like busy work.

Adoption Metrics Without Consequences

"We want 50% adoption by Q2." This failed because there were no stakes. If adoption was 30%, nothing happened. No performance review impact, no promotion implications, no consequences. Why change?

One-Size-Fits-All Tools

"Use this AI system for everything." This failed because different jobs need different tools. A researcher, a proposal writer, and a client relationship manager have entirely different needs. One tool can't serve all.

What Actually Works

1. Make It Part of the Job Description

The shift that changed everything: making AI use part of the actual job expectation. Not "try to use AI," but "your job includes using AI tools appropriately."

This sounds subtle. It's actually transformational. When AI use becomes a job requirement, people take it seriously. They learn. They integrate it. They stop asking "should I use this?" and start asking "how do I use this?"

This works especially well if baked into performance reviews. Not as a separate metric, but as part of job performance. "Did they use appropriate tools?" is now a question in evaluations.

2. Demonstrate Relevance to Their Specific Work

Generic training failed. Specific training works. A session called "How to use AI to write better client proposals" is infinitely more useful than "Here's how AI works."

The firms that cracked this did training by role: researchers got training on research AI, proposal writers got training on writing AI, project managers got training on project AI. Each group learned tools specific to their work.

3. Measure Personal Impact, Not Just Adoption Metrics

Instead of "What % of staff used AI?" ask "How much time did they save?" Tell people: "Your AI usage saved you 5 hours last week. That's time you can use for billable work." Make it personal. Make it visible.

When people see their own productivity improve, they adopt. When they just see usage metrics, they comply without believing.
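The personal-impact framing above can be made concrete with a small report. Here's an illustrative sketch (the log format, user names, and minutes-saved figures are all assumptions for the example; real numbers would come from your tool's usage export):

```python
from collections import defaultdict

# Hypothetical usage log: (user, estimated_minutes_saved) per AI-assisted
# task. These entries are illustrative, not real data.
usage_log = [
    ("alice", 40), ("alice", 30), ("alice", 62),
    ("bob", 20), ("bob", 28),
    ("carol", 90), ("carol", 15), ("carol", 40), ("carol", 35),
]

def hours_saved_per_user(log):
    """Sum estimated minutes saved per user and convert to hours."""
    totals = defaultdict(int)
    for user, minutes in log:
        totals[user] += minutes
    return {user: round(m / 60, 1) for user, m in totals.items()}

# Turn the aggregate into the personal message the section describes.
for user, hours in hours_saved_per_user(usage_log).items():
    print(f"{user}: your AI usage saved you {hours} hours this week")
```

The point of the design is the last loop: the metric is delivered to each person about their own week, not rolled up into a firm-wide adoption percentage.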

4. Make It Easy to Succeed

The firms with the highest adoption make the first success easy: the tool is already set up, relevant to the role, and usable without extra steps.

5. Lead by Example (Really Lead)

The biggest adoption driver in firms with 60%+ usage: senior partners visibly using AI. Not just saying "you should use it." Actually using it. Talking about how they use it. Sharing how it's helped their practice.

This can't be fake. Staff see through that. But a genuine "I use AI for X, and here's what changed" is incredibly powerful.

6. Remove the Fear

A lot of adoption resistance comes from fear: fear of looking dumb for asking how to use it, fear of the tool making mistakes, fear of seeming less skilled if AI is helping. Address these directly.

Firms with high adoption normalize asking questions, treat AI mistakes as expected and reviewable, and make clear that using the tools well is itself a skill.

7. Integrate Into Existing Tools and Workflows

Tools that live in your CRM, your proposal software, your document system—those get used. Tools that require opening a separate application or changing workflow—those don't. Integration is adoption.

What Didn't Need to Change

Interestingly, the tool quality and capability didn't need to improve much. Claude and GPT-4 are good enough. What changed was how firms approached adoption.

The breakthrough wasn't "better AI." It was "better change management."

The Timeline to 60%+ Adoption

Firms now seeing 60%+ adoption typically followed the same arc: start with one use case, train by role, measure real impact, then expand firm-wide. It's a 6–12 month process, not a 3-month rollout.

Why This Matters Now

Adoption was the limiting factor in 2024 and 2025. Firms had great tools but couldn't get people to use them consistently. By 2026, that constraint is loosening. The firms that solved adoption are accelerating. The firms that didn't are falling further behind.

And here's the kicker: once you hit 60%+ adoption, the culture shifts. AI becomes normal. New people adopt faster because they see everyone using it. Your AI strategy can move from "how do we get people to adopt?" to "what's our next AI use case?"

If You're Behind on Adoption

Don't despair. It's fixable. But you need to shift from "we have a great tool" to "we have a great tool AND we've built adoption infrastructure."

Start with one use case. Train by role. Measure real impact. Make it part of the job. Share success stories. Then expand.

You can go from 20% adoption to 60%+ adoption in 12 months if you approach it systematically. But it takes actual work. Not technology work. Change management work.

The good news: after three years, we finally know what that work is and how to do it. The path is clear.

Want to discuss AI strategy for your firm?

Book a free 30-minute assessment — no pitch, just practical insights.

Book a Call