One of the excuses I hear from managing partners: "We want to deploy AI, but we can't afford to take our team offline for a week of training."
That's a legitimate concern. Professional services economics are tight. Every hour not billed is an hour not paid. Pulling people out for a day-long training event costs real money.
The good news: you don't need to take everyone offline. In fact, the firms that get the best adoption don't do traditional training. They use a different approach entirely.
Why Traditional AI Training Fails
The typical scenario: you bring everyone into a conference room. An expert (internal or external) spends two hours explaining how to use ChatGPT or Claude. People take notes. They're excited. Then they go back to work and forget everything they learned.
Two weeks later, most of your team isn't using the tool. The ones who are using it have invented their own usage patterns that don't align with your governance policies.
The problem: training in the abstract doesn't stick. People learn by doing, not by sitting in a meeting.
The Better Approach: Training in Workflow
Instead of training people on AI, embed AI training into the actual work they're doing. This requires a different structure.
Step 1: Create "AI Champions" in Each Department
Pick 2-3 people per department who are naturally curious about new tools. Give them training first (a 90-minute focused session, not a full day). Make them the go-to resource for their peers.
This works because people are more likely to ask a peer for help than to remember something from a training session. And the champions learn more deeply by teaching.
Step 2: Provide Workflow-Specific Prompts and Templates
Don't train people on "how AI works" in general. Give them specific prompts they can use for their actual jobs.
Example for a law firm: "To summarize a contract, use this prompt: [specific prompt]. For initial legal research, use this one: [different prompt]."
People learn faster when they have ready-to-use tools. They can experiment and improve, but they start with something that works.
Step 3: Microlearning, Not Lectures
Instead of one long training, send 5-minute videos or written tips twice a week. "This week: three ways to use AI for email." "Next week: how to give good feedback to AI output."
Microlearning works because people consume it in small chunks during their regular workday, not in a dedicated training block.
Step 4: Monthly Office Hours
Set up a monthly 30-minute optional session where people can ask questions about AI usage. This isn't mandatory training—it's a resource for people who want to deepen their skills.
You'll be surprised how much you learn about how your team is actually using the tools from these sessions.
What You Actually Need to Train On
Keep training focused on the things that matter for your firm:
1. Privacy and Compliance — This is non-negotiable. Everyone needs to understand: what data can you put into AI tools? What's off-limits? What does your firm policy say?
2. Output Review — AI output should always be reviewed by a qualified human before it's used or sent to a client. This is especially important in professional services, where errors carry real liability. Train people on what to look for when reviewing AI output: accuracy, tone, completeness, appropriateness.
3. Tool-Specific Features — How to use the specific tools you've chosen. But this is better done in microlearning or office hours than in a big training event.
4. Prompt Engineering Basics — The difference between a vague prompt and a good prompt is enormous. "Summarize this contract" produces a generic overview; "Summarize this contract in five bullet points, flagging any indemnification, termination, or auto-renewal clauses" produces something a professional can actually use. Spend time teaching people how to write specific, structured prompts. This dramatically improves AI output quality.
The Implementation Timeline
Week 1: Identify and train your AI champions (90 minutes per champion).
Week 2: Champions do informal training with their teams (30 minutes per small group, in their department, focused on their workflow).
Week 3+: Roll out microlearning (5-minute tips twice a week). Offer optional monthly office hours.
Total disruption to billable work: maybe 2-3 hours per person across the month. Compare that to a traditional week-long training block, and you're saving enormous amounts of time.
Measuring Adoption
Track these metrics after training:
- What percentage of eligible staff is actively using AI tools?
- How many office hour attendees are asking advanced questions (vs. basic ones)?
- What's the quality of AI output being generated?
If adoption is high and questions are getting more sophisticated, your training is working. If adoption is stuck at 30% and people are asking the same basic questions, you need a different approach.
The Real Secret
The firms that get the best AI adoption don't solve it through training. They solve it by making AI part of how work is done, not something separate from work.
Train in context. Train in small doses. Train through peers. Then get out of the way and let your team use the tools naturally.
Want to discuss AI strategy for your firm?
Book a free 30-minute assessment — no pitch, just practical insights.