Navigate healthcare compliance when using AI and large language models
If you're in healthcare and want to use LLMs with patient data, Protected Health Information (PHI), or any data covered by HIPAA, you must understand the compliance requirements. Non-compliance can result in fines up to $1.5 million per violation category, per year.
The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that protects sensitive patient health information from being disclosed without patient consent or knowledge.
Key Components:
PHI is any health information that can be linked to a specific individual. This includes obvious identifiers and seemingly innocent data when combined.
18 HIPAA Identifiers (Examples):
Names; geographic subdivisions smaller than a state; dates (other than year) directly related to an individual; telephone numbers; fax numbers; email addresses; Social Security numbers; medical record numbers; health plan beneficiary numbers; account numbers; certificate and license numbers; vehicle identifiers; device identifiers and serial numbers; URLs; IP addresses; biometric identifiers such as fingerprints; full-face photographs; and any other unique identifying number, characteristic, or code.
A Business Associate Agreement is a legally binding contract between a HIPAA-covered entity (or another business associate) and a vendor who will handle PHI on their behalf.
⚠️ CRITICAL REQUIREMENT
If you send PHI to an LLM provider without a signed BAA, you are in violation of HIPAA. This applies even if you're just testing or using it internally. The BAA must be in place BEFORE any PHI is transmitted.
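One way teams enforce this rule in practice is a hard gate in code, so PHI can never reach a vendor endpoint unless a signed BAA is on record. A minimal sketch, assuming a compliance-maintained registry; the `VENDOR_BAAS` names and `send_to_llm` function are illustrative, not any real API:

```python
# Illustrative guard: refuse to transmit PHI to any vendor without a signed BAA.
# VENDOR_BAAS would be maintained by your compliance team; entries are hypothetical.
VENDOR_BAAS = {
    "azure-openai": True,    # BAA signed and on file
    "openai-direct": False,  # no BAA -- PHI must never be sent here
}

class BAAError(RuntimeError):
    """Raised when PHI would be sent to a vendor without a signed BAA."""

def send_to_llm(vendor: str, payload: str, contains_phi: bool) -> str:
    if contains_phi and not VENDOR_BAAS.get(vendor, False):
        raise BAAError(f"No signed BAA with '{vendor}'; refusing to send PHI.")
    # ... the actual API call would go here ...
    return f"sent to {vendor}"
```

A gate like this is a safety net, not a substitute for policy: it only works if the registry is kept accurate and every PHI-touching code path goes through it.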
A BAA Must Include:
The permitted uses and disclosures of PHI by the vendor; a requirement that the vendor implement appropriate safeguards; breach reporting obligations back to the covered entity; a requirement that any subcontractors agree to the same restrictions; and return or destruction of PHI when the contract ends.
Not all LLM providers are willing or able to sign BAAs. Here's the current landscape:
✓ Vendors That Offer BAAs:
Google Cloud (Vertex AI)
Offers BAA for Vertex AI services including PaLM 2, Gemini, and other models. Must be on a paid plan.
AWS (Amazon Bedrock)
Provides BAA for Bedrock services with models from Anthropic, Meta, Cohere, and others.
Microsoft Azure (Azure OpenAI Service)
Offers BAA for Azure OpenAI Service with GPT-4, GPT-3.5, and other OpenAI models hosted on Azure.
Anthropic (Claude for Enterprise)
Offers BAA for enterprise customers. Contact their sales team for details.
❌ Generally Do NOT Offer BAAs (or limited availability):
OpenAI API (Direct)
Standard ChatGPT and OpenAI API do not offer BAAs for most users. Enterprise customers should inquire directly. Use Azure OpenAI Service instead for HIPAA compliance.
Most Free/Consumer LLM Services
ChatGPT Free, Claude.ai, Gemini web interface, etc. are NOT HIPAA-compliant and do not offer BAAs.
Note: This information is accurate as of early 2025, but vendor offerings change. Always verify current BAA availability directly with the vendor before transmitting PHI.
All PHI must be encrypted both when being transmitted and when stored.
Requirements:
Encrypt data in transit using TLS 1.2 or later; encrypt data at rest with strong encryption (AES-256 is the common standard); manage encryption keys securely and restrict who can access them.
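For the in-transit half, most HTTPS client libraries let you pin a minimum TLS version so older protocols are refused outright. A small sketch using Python's standard-library `ssl` module; the resulting context would then be passed to your HTTP client:

```python
import ssl

# Build a client-side context that refuses anything older than TLS 1.2,
# in line with common security guidance for PHI in transit.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.check_hostname = True             # verify the server's identity
context.verify_mode = ssl.CERT_REQUIRED   # reject unverifiable certificates
```

`create_default_context()` already enables hostname checking and certificate verification; setting them explicitly here documents the intent and guards against accidental downgrades elsewhere in the codebase.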
You must be able to control who accesses PHI and maintain audit trails.
Implementation:
Assign unique user IDs; enforce role-based access so staff see only the minimum necessary PHI; log every access to PHI with user, time, and record; enable automatic session timeout.
Understand how long the LLM vendor retains your data and ensure proper deletion.
Key Questions to Ask Vendors:
How long are prompts and outputs retained? Is customer data used to train models? Can retention be disabled (zero-retention processing)? Where is the data stored geographically? How is deletion performed, and how is it verified?
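The same discipline applies on your own side of the pipeline: purge stored prompts and outputs once they are no longer needed. A generic sketch of a time-based purge over an in-memory store; your real store, schema, and retention window will differ:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # illustrative retention window

def purge_expired(records: dict, now: datetime) -> dict:
    """Keep only prompt/output records still within the retention window."""
    return {
        rid: rec for rid, rec in records.items()
        if now - rec["stored_at"] <= RETENTION
    }
```

Running a purge like this on a schedule, and logging what was deleted, gives you evidence of retention compliance rather than just a policy on paper.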
If you properly de-identify health information according to HIPAA standards, it's no longer considered PHI and doesn't require a BAA. However, de-identification must be done correctly.
Two Methods for De-identification:
1. Safe Harbor Method
Remove all 18 HIPAA identifiers and have no actual knowledge that remaining information could identify the individual.
2. Expert Determination
Have a qualified statistician or expert certify that the risk of re-identification is very small.
⚠️ Warning About Partial De-identification
Simply removing names or obvious identifiers is NOT sufficient. Data that seems anonymous can often be re-identified through combination with other data. When in doubt, treat data as PHI and get a BAA.
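To make that warning concrete: a naive scrubber like the sketch below catches only pattern-matchable identifiers and would not come close to satisfying Safe Harbor, which requires removing all 18 identifier categories. The patterns are illustrative and deliberately incomplete:

```python
import re

# Illustrative ONLY: strips a few pattern-matchable identifiers.
# This does NOT satisfy Safe Harbor on its own -- names, dates, MRNs,
# and free-text identifiers all slip past naive regexes.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{3}-\d{3}-\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def naive_scrub(text: str) -> str:
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text
```

Notice everything this misses: "Mrs. Alvarez, seen 3/14/1962, MRN 8841032" passes through untouched. That gap is exactly why partial de-identification fails and why expert review or a BAA remains necessary.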
Even pilot projects with real PHI require a signed BAA. No exceptions.
Google Cloud, AWS, and Azure all offer HIPAA-compliant LLM services with BAAs.
Maintain records of BAAs, security assessments, and compliance procedures.
Ensure everyone who works with PHI understands HIPAA requirements and approved tools.
Regularly assess the risks of using LLMs with PHI and implement appropriate safeguards.
When available, configure services not to retain data after processing.
Free ChatGPT, Claude.ai, and similar consumer services are NOT HIPAA-compliant.
Informal de-identification rarely meets HIPAA standards. Get expert guidance.
Ensure vendor agreements explicitly prohibit using your PHI for model training
Have your legal team review BAAs before signing. Templates vary by vendor.
Generate draft clinical notes, summaries, or documentation from physician dictation or EHR data.
Requires: BAA, access controls, audit logging
Extract ICD-10 and CPT codes from clinical notes to assist billing departments.
Requires: BAA, verification process for code accuracy
Draft personalized patient education materials or appointment reminders based on patient history.
Requires: BAA, review before sending to patients
Analyze de-identified patient data for population health insights or clinical research.
Requires: Proper de-identification OR BAA if using PHI
We can help you navigate HIPAA requirements, select compliant vendors, and implement secure AI workflows for healthcare.