AI vendors love the phrase "HIPAA-compliant" on their homepages. The version that holds up in an audit is more specific than that. Here's what every dental practice should require before signing.
HIPAA compliance is one of those topics where the marketing copy and the actual obligation drift apart. A vendor saying "we're HIPAA-compliant" is roughly as informative as a contractor saying "we're licensed" — true but incomplete. What you need to know is what they sign, what they encrypt, what they log, who can see your patient data, and what happens if they have a breach. This guide walks through the specific questions to ask before any AI tool touches a single piece of patient information at your practice.
An AI receptionist that talks to patients, books appointments, and reads back insurance benefits is — by every definition that matters — handling Protected Health Information (PHI). The patient's name, date of birth, payer, member ID, and reason for the visit are all PHI. The recording of a call where they describe a toothache is PHI. The transcript stored in the vendor's system is PHI. This means the vendor is a Business Associate under HIPAA, and you, the covered entity, are required to have specific controls in place before letting that data leave your office.
HIPAA's Security Rule organizes these controls into three buckets: administrative (policies, training, designated security officer), physical (data centers, workstations), and technical (encryption, access control, audit logging). A compliant vendor needs to satisfy all three. A vendor with a SOC 2 Type II report has done most of this work and had it independently verified — that's the closest thing to a shortcut.
The Business Associate Agreement is the contract that creates a legal obligation for your vendor to handle PHI according to HIPAA. If a vendor is unwilling to sign a BAA, that conversation should end there. Don't let "we have BAAs with our subprocessors" substitute for a BAA with you — the chain has to go all the way back to you, the covered entity.
What a strong dental BAA includes:
"We'll add a BAA to your enterprise contract." Translation: it's not part of the standard agreement and you're going to have to negotiate. For most dental practices, that's a non-starter — you don't have a procurement team. Insist on the BAA being executed alongside the standard subscription, before any data moves.
Two phrases worth knowing: encryption in transit and encryption at rest. In transit means the data is encrypted while it moves between your office, the vendor's servers, the AI model, and any third parties. The standard here is TLS 1.2 or higher; TLS 1.3 is current. At rest means the data is encrypted while it's sitting in the vendor's database. The standard is AES-256.
Both are table stakes for an AI dental tool. If a vendor can't tell you exactly what they encrypt and how, that's a sign their security posture is more aspiration than implementation. Aria, like any serious AI receptionist for healthcare, encrypts everything with AES-256 at rest and TLS 1.3 in transit, and we're happy to put that in writing.
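If your team (or your IT consultant) ever integrates directly with a vendor's API, the in-transit half of this requirement is something you can enforce on your own side. Here's a minimal sketch, using Python's standard `ssl` module, of a client TLS policy that refuses anything below TLS 1.2; it's illustrative, not any vendor's required configuration:

```python
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a client TLS context that rejects TLS 1.0 and 1.1."""
    ctx = ssl.create_default_context()            # certificate verification on by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older than TLS 1.2
    return ctx

# Use this context when opening any connection that will carry PHI,
# e.g. with http.client.HTTPSConnection(host, context=ctx).
ctx = strict_tls_context()
```

The point of the sketch: "TLS 1.2 or higher" is not a vague aspiration but a one-line, checkable setting, and a vendor should be able to state theirs just as concretely.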
Your patient data should be visible only to people at your practice and the smallest possible set of vendor personnel — and every access should be logged.
Specifically, ask:
The audit log question is the most revealing. A vendor that can't produce an access log on demand hasn't built the muscle of being audited, and you don't want to be the first customer to discover that.
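To make the ask concrete, here's a sketch of the minimum shape an access log should take: every read of a patient record captures who, what, and when, and the vendor can filter it by patient on demand. The field names and classes are illustrative, not any vendor's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AccessEvent:
    actor: str       # vendor employee or service account
    action: str      # "read", "export", "delete", ...
    record_id: str   # internal patient-record identifier
    at: datetime     # UTC timestamp of the access

class AuditLog:
    """Append-only record of every access to patient data."""
    def __init__(self):
        self._events: list[AccessEvent] = []

    def record(self, actor: str, action: str, record_id: str) -> None:
        self._events.append(
            AccessEvent(actor, action, record_id, datetime.now(timezone.utc)))

    def accesses_to(self, record_id: str) -> list[AccessEvent]:
        # The question to ask a vendor: can you produce this view on demand?
        return [e for e in self._events if e.record_id == record_id]

log = AuditLog()
log.record("support@vendor", "read", "pt-1042")
log.record("svc-transcriber", "read", "pt-2001")
```

If a vendor can answer "who touched this patient's record last month?" in minutes, they have something like this. If they can't, they don't.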
Most AI receptionists rely on third-party AI models — OpenAI, Anthropic, Google, or open-source models hosted on AWS or Azure. This isn't bad. It's the way the industry works. What matters is whether those subprocessors are also bound by BAAs, and whether your data is being used to train their models.
Two specific questions to ask:
If a vendor can't show you the BAA chain from your practice down to the AI model, the gap in that chain is where your liability lives.
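The chain itself is simple enough to sketch. Each link is a party that handles your PHI plus whether a BAA covers the handoff to it; the party names below are hypothetical examples, and the only point is that a single missing agreement anywhere in the chain is where your exposure sits:

```python
def baa_gaps(chain):
    """Return every link in the vendor chain that lacks a signed BAA."""
    return [name for name, has_baa in chain if not has_baa]

# Hypothetical chain, ordered from your practice outward.
chain = [
    ("AI receptionist vendor", True),   # BAA signed directly with your practice
    ("Cloud host",             True),   # BAA signed between vendor and host
    ("LLM provider",           False),  # no BAA -- this is the gap
]
gaps = baa_gaps(chain)  # ["LLM provider"]
```

An empty result is the answer you want from a vendor; anything else is the list of parties holding your patients' data with no legal obligation to protect it.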
Breaches happen. The question isn't whether your vendor will ever have an incident — the question is whether they have a documented response process, a notification commitment, and a track record of executing on both. Ask for their incident response runbook (or a summary). Ask how they'd notify you. Ask what they did the last time something happened.
If they say "we've never had an incident," you're either the first practice to ask, or they're not telling you the whole truth. Both are problems.
Before signing with any AI vendor that touches patient data, get yes/no answers to these:
A vendor that hits all 14 is doing the work. A vendor that punts on three or more — especially #1, #2, #5, or #11 — is one I wouldn't trust with PHI.
For Aria specifically, our security page covers each of these in detail; if you want to walk through the BAA, encryption posture, and audit log capabilities before a demo, ask us in the booking form.
We'll send the BAA, the SOC 2 summary, and our subprocessor list before the demo so you can do diligence before you commit time.
Book a Demo → Run the ROI Calculator