You've told your team not to use AI tools with client data. But I guarantee they are anyway.
Someone's pasting part of a client contract into ChatGPT to get a second opinion on the language. Someone else is using Claude to draft a proposal template. A junior associate is using an AI research tool you've never heard of to analyze case law.
This is shadow AI, and it's costing you far more than you realize. Not just in governance and risk, but in actual, measurable business impact.
The Three Costs of Shadow AI
1. Data Loss and Compliance Risk
Every time someone pastes client information into a consumer AI tool, that data may be used to train the AI model. It's going to servers you don't control. It's exposed to breaches you can't prevent but will still be liable for.
For a professional services firm, this is existential risk. One partner casually pasting a client's confidential tax return into ChatGPT could result in:
- Client notification requirements
- Regulatory investigation (the IRS or SEC, depending on your industry)
- Reputational damage
- Malpractice claim
- Violation of client engagement letters and confidentiality agreements
The cost of a data breach in professional services can easily exceed $500K once you factor in legal, notification, and credit monitoring costs.
2. Duplicated Tool Spending
Shadow AI creates tool proliferation. Someone finds a clever research tool and starts using it. Someone else finds a different automation platform. Before you know it, you have six different AI tools doing similar work, none of them integrated with your actual workflows.
You're paying subscriptions to tools you don't even know about. I worked with one firm that had 23 different AI tool subscriptions. Only seven had been officially approved. The other 16 were being paid for by individual departments.
That was $84K per year in duplicate spending that brought zero organizational benefit.
3. Inconsistent Quality and Liability
When AI use is unmanaged, everyone's using different tools with different quality levels. You get inconsistent output quality, inconsistent accuracy, and inconsistent risk profiles.
If something goes wrong—an AI tool gives bad advice that a client relies on, or an AI output is shared with a client without anyone reviewing it—you have no governance documentation to protect yourself. You're fully exposed.
How to Quantify Shadow AI Cost
Most firms can't quantify shadow AI because they don't know the scope. Here's how to measure it:
Survey your team. Ask (confidentially): "What AI tools are you currently using for work?" You'll get surprising answers.
Audit tool subscriptions. Check your corporate credit cards, department budgets, and software management systems. What's being paid for?
Interview department heads. Ask: "What happens when someone needs to analyze client data?" You'll learn about tools you didn't know existed.
For a 100-person firm, I typically find $50K-$150K in unmanaged AI spending.
Estimate data risk. If you found out that your team has been uploading client information to unmanaged cloud AI tools, what's your exposure? Work with your GC or insurance broker to quantify it. For most firms, it's in the six-figure range.
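The subscription audit above boils down to simple arithmetic: group charges by vendor, annualize them, and separate approved tools from everything else. Here's a minimal sketch in Python; the vendor names, charge amounts, and the approved list are all hypothetical, and in practice the input would come from your card statements or software asset management export.

```python
# Tally AI tool charges by vendor and flag spend outside the approved stack.
# All vendors, amounts, and the approved list below are made-up examples.

from collections import defaultdict

APPROVED = {"CopilotEnterprise"}  # hypothetical officially-approved tool

# (vendor, monthly charge in USD) pulled from card statements -- sample data
charges = [
    ("ChatGPT Plus", 20), ("ChatGPT Plus", 20), ("ChatGPT Plus", 20),
    ("ResearchAI", 99), ("DraftBot", 49), ("CopilotEnterprise", 390),
]

# Sum each vendor's monthly spend
monthly = defaultdict(float)
for vendor, amount in charges:
    monthly[vendor] += amount

# Annualize everything not on the approved list
shadow_annual = sum(
    12 * total for vendor, total in monthly.items() if vendor not in APPROVED
)
print(f"Unapproved AI spend: ${shadow_annual:,.0f}/year")
# prints: Unapproved AI spend: $2,496/year
```

Even this toy version makes the point: a handful of $20–$99 seats, multiplied across departments and twelve months, is how a firm quietly arrives at five or six figures of unmanaged spend.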
How to Replace Shadow AI With Governance
You can't just ban AI. Your team will keep using it secretly. Instead, replace shadow AI with a managed, approved alternative:
1. Establish an approved tool stack. "Here are the three AI tools we support. Here's how they work. Here's what you can use them for."
2. Make approved tools easy to use. If your approved tool requires API integration and your team's preferred tool is ChatGPT on the web, they'll keep using ChatGPT. Make the approved solution lower friction than the shadow solution.
3. Train on data privacy. Be explicit: "You cannot put client data into any AI tool without written approval. The cost of doing this is enormous." Most people aren't trying to create risk; they just don't understand the implications.
4. Monitor and measure. Use your endpoint tools to see what's being accessed. If you see ChatGPT being used heavily, that signals a need you're not meeting with your approved tools.
The Real Opportunity
Shadow AI is a symptom of unsolved problems. Your team is using unapproved tools because they're solving real problems that your firm isn't addressing.
Instead of just shutting down shadow AI, ask: "What problems are people solving? What would make them not need these tools?"
Then solve those problems with approved tools and governance. That's how you eliminate shadow AI and unlock real value.
Want to discuss AI strategy for your firm?
Book a free 30-minute assessment — no pitch, just practical insights.
Book a Call