If you run a healthcare clinic, dental practice, physiotherapy center, or similar organization, you're dealing with PHIPA and PIPEDA in Canada (PHIPA in Ontario, PIPEDA federally) or HIPAA in the US. Your staff wants to use ChatGPT. You're nervous about patient data.
That nervousness is justified. But there are ways to use AI responsibly in healthcare without violating patient privacy.
The Problem
ChatGPT is hosted on OpenAI's servers. Everything you type is processed there, and under the consumer terms, OpenAI may use your conversations to improve its models unless you opt out. (Business and API tiers have different terms, but most staff reach for the consumer version.)
If you paste a patient's name, age, diagnosis, and treatment plan into ChatGPT and ask for a follow-up letter, you've just disclosed that information to a third party that has no agreement with you to protect it. That's a likely PHIPA violation in Ontario (or a breach of your province's equivalent law), and potentially a HIPAA violation in the US, since the consumer version of ChatGPT doesn't come with a business associate agreement.
Your staff probably isn't thinking about any of this. They see ChatGPT as a helpful tool and reach for it naturally. That's a compliance risk.
Where AI Can Help (Safely)
Template generation: Ask ChatGPT to create a template for a patient follow-up letter. Don't include patient-specific information. Then your staff fills in the specific details manually or in your secure system.
Scheduling and logistics: "Create a schedule template for a physical therapy clinic" or "Draft language for a common patient inquiry about payment options." No patient data involved.
Staff training: "Explain the difference between anterior cruciate ligament tears and sprains" or "Create a training outline for new front desk staff." Educational content, not patient data.
Internal processes: "Summarize these clinic policies into a one-page quick reference." Again, no patient information.
The pattern: Use AI for generic templates, processes, and information. Never use it with patient-specific data.
What You Need To Put In Place
1. Clear policy: Document what AI tools staff can and cannot use. Be specific. "ChatGPT is approved for template generation and staff training. ChatGPT is not approved for any use involving patient names, diagnoses, treatment, or contact information."
2. Training: Make sure your staff understands why. The point isn't to be restrictive for its own sake; it's to protect patients and to protect your clinic from liability.
3. Alternatives: If your team needs AI help with patient information, what are the alternatives? Medical record AI systems that are HIPAA-compliant? Consultants you work with? Make sure there's a path forward, not just "don't do this."
4. Monitoring: You can't prevent all violations, but you can spot-check. "Let me see what prompts you've been using with ChatGPT." Not Big Brother surveillance, but periodic checks.
What Compliance Actually Means Here
Compliance isn't about using zero AI. It's about using it thoughtfully.
PHIPA and HIPAA exist to protect patient privacy. The core requirement is not exposing patient information to unauthorized third parties, and AI vendors count as third parties. Stay on the right side of that line and you're in good shape.
You can use AI. You just need to keep identifiable patient information out of it, or properly de-identify the information first. Under HIPAA's Safe Harbor standard, de-identification means stripping all 18 categories of identifiers: names, dates, contact details, record and account numbers, and so on.
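If you want a guardrail in addition to a policy, even a small script can catch the most obvious identifiers before text gets pasted anywhere. Here's a minimal Python sketch; the pattern names and sample text are hypothetical, and regexes alone are nowhere near sufficient for real de-identification (they won't catch names or free-text identifiers), so treat this as a last-line sanity check, not a compliance tool.

```python
import re

# Hypothetical patterns for a few obvious identifiers. Real de-identification
# must cover far more (HIPAA Safe Harbor lists 18 identifier categories).
IDENTIFIER_PATTERNS = {
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "date":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "mrn":   re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def scrub(text: str) -> str:
    """Replace anything matching a known identifier pattern with a placeholder."""
    for label, pattern in IDENTIFIER_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

def assert_clean(text: str) -> None:
    """Refuse to proceed if any known identifier pattern is still present."""
    for label, pattern in IDENTIFIER_PATTERNS.items():
        if pattern.search(text):
            raise ValueError(f"Possible {label} found; do not send this text.")

draft = "Patient called from 416-555-0123 about appt on 3/14/2025, MRN: 88231."
cleaned = scrub(draft)
assert_clean(cleaned)
print(cleaned)  # Patient called from [PHONE] about appt on [DATE], [MRN].
```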
A Practical Example
Wrong approach: "Draft a letter for a 42-year-old patient, John Smith, who came in for back pain and has been diagnosed with a herniated disc."
Right approach: "Create a template for a patient follow-up letter after diagnosis of a herniated disc. Include sections for clinical findings, treatment recommendations, and follow-up schedule."
Then your staff fills in the specific patient details in your secure system.
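To make that split concrete, here's a minimal Python sketch of the same workflow: the template text is the kind of generic output ChatGPT can safely produce, and the placeholder fill happens locally, inside your own system, so the patient's details never touch the AI vendor. The field names here are hypothetical.

```python
# A generic template ChatGPT could draft -- no patient data was involved
# in producing it. Placeholders are filled locally, in your own system.
FOLLOW_UP_TEMPLATE = """\
Dear {patient_name},

Thank you for your visit on {visit_date}. Our clinical findings were:
{clinical_findings}

Recommended treatment:
{treatment_plan}

Your follow-up appointment is scheduled for {follow_up_date}.
"""

# Patient details come from your secure records system, never from the AI.
letter = FOLLOW_UP_TEMPLATE.format(
    patient_name="John Smith",
    visit_date="March 14, 2025",
    clinical_findings="MRI consistent with a herniated disc at L4-L5.",
    treatment_plan="Six weeks of physiotherapy; follow-up imaging if needed.",
    follow_up_date="April 25, 2025",
)
print(letter)
```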
What You Should Do This Month
1. Have a conversation with your compliance officer or legal advisor. Ask them for guidance on AI tool usage in your specific context.
2. Create a simple policy. Don't overthink it. Document what is and isn't allowed.
3. Train your staff. Make sure they understand the boundaries and the why behind them.
4. Identify one or two use cases to test safely. Maybe template generation. Maybe FAQ writing. Something that clearly doesn't involve patient data.
Healthcare organizations are uniquely positioned to benefit from AI and uniquely constrained by regulation. Both things are true at once. Neither means you can't use AI. It just means you have to be thoughtful about how.