At some point in every sales conversation, there's a pause, and then: "Is this HIPAA compliant?"
Sometimes it's the dentist. Sometimes it's the practice manager who's done their homework. Once, memorably, it was a patient who overheard us demoing and wanted to know what happened to her information.
The short answer is yes, Marea is HIPAA compliant. But the short answer doesn't actually tell you very much. Here's the fuller version.
HIPAA governs how protected health information — PHI — is stored, transmitted, and accessed. For an AI tool that interacts with patients (phone calls, intake forms) and handles clinical data (notes, referral letters), there are a few distinct obligations.
The Business Associate Agreement. Any vendor that handles PHI on behalf of a covered entity is required to sign a BAA. This is the document that defines how the vendor will protect that information, what their obligations are, and what they'll do in the event of a breach. Marea signs a BAA with every practice. If a vendor won't sign a BAA, they shouldn't be touching patient data, full stop.
Data storage and transmission. PHI has to be stored and transmitted with appropriate encryption. Our infrastructure is hosted on AWS, with encryption at rest and in transit, in regions that meet HIPAA's technical safeguard requirements.
Access controls. Who can see patient data inside Marea, and under what conditions? We keep these strict: data from one practice is never accessible to another, our internal team has limited access to production data, and every access is logged.
Audio recordings. The AI receptionist records calls. Those recordings contain PHI. They're stored encrypted, retained for a defined period, and accessible only to the practice. They're not used to train external models.
When practices ask about HIPAA, they're often asking something more specific: "Will Marea train its AI on my patient data?"
The answer is no.
Your practice's data — call recordings, notes, patient intake information — is yours. It doesn't leave your account to improve a shared model. It doesn't get aggregated with other practices' data. What happens in your practice stays in your practice.
This is worth stating directly because some AI vendors are less clear about it. If you read the terms of service carefully, some tools reserve the right to use your data for model improvement. We don't, and we say so in the BAA.
HIPAA compliance is a floor, not a ceiling.
A vendor can be fully HIPAA compliant and still have weak security practices. Compliance means you meet the legal standard — not that you've done everything possible to protect data. When evaluating any vendor, it's worth asking about their security posture beyond the compliance checklist: penetration testing, employee security training, incident response procedures.
We get these questions from larger practices and DSOs in particular, and we answer them. If you want the detailed breakdown of our security architecture, we'll send it.
If you're evaluating AI tools for your practice:
Ask every vendor to sign a BAA before you proceed. If they hesitate or push back, that's your answer.
Ask specifically whether your data is used to train shared models. Read the terms of service on this point.
Ask what happens to your data if you cancel. Do you get an export? How long is it retained? What are the deletion procedures?
These aren't paranoid questions. They're the right ones to ask anyone handling your patients' information, and any vendor worth working with will have clear answers.
We're happy to walk through any of this in more detail. It's not a topic we try to move past quickly.