
What We Got Wrong About How Dentists Would Use AI

Isabella Tomassi·February 24, 2026

Building a product requires making predictions about how people will use it. Some of those predictions hold. Some don't survive contact with actual customers. Here's a clear-eyed look at where our assumptions about dental AI adoption were off — and what we learned.

We thought the pain of missed calls would be the hook. It wasn't always.

Missed calls are a real and quantifiable problem. We knew that going in, and it's central to how we talk about the receptionist product. We assumed it would be the immediate, emotional driver for most practices.

In reality, practices vary a lot in how acutely they feel this pain. A practice with a full schedule and a two-month waitlist doesn't feel urgency around missed calls — they almost can't take new patients anyway. A growing practice in a competitive market feels it intensely.

We spent too long in the early days pitching missed-call pain to practices for whom it wasn't the sharpest edge. The better question turned out to be: "What's the part of your operations that's making you feel most stretched right now?" The answer pointed to the real entry point for each practice — sometimes it was calls, sometimes intake forms, sometimes documentation falling behind.

We've gotten better at asking that question before we start pitching.

We underestimated how much the front desk staff would shape adoption.

We thought dentists would be the primary decision-makers, and we built our outreach and demo approach around them. In practice, the front desk team — receptionists, practice managers, office administrators — turns out to be more central to whether AI adoption works than we initially understood.

These are the people who use the system every day. If they feel like it's competing with them, threatening their jobs, or just adding complexity to an already complicated role, they find ways to work around it. If they understand it as a tool that helps them, they become its advocates. We've seen both.

The practices with the smoothest rollouts are the ones where the front desk team was part of the conversation from the start — where the dentist said "I want your input on this" instead of "I decided we're doing this." That sounds obvious in retrospect. It wasn't front and center in our thinking when we started.

We now specifically ask practices to include their front desk manager in the onboarding call. That one change made a measurable difference in activation rates.

We thought the first feature a practice adopted would stay their main use case.

Our mental model was that a practice would start with one product — usually the AI receptionist — and that would be the core of what they used, with other features as add-ons.

What actually happens is that practices move faster across features than we expected, and their primary use case often shifts over the first six months. The practice that comes in for the receptionist frequently ends up most excited about the notes. The one that comes in for intake forms ends up relying most on the referral letters.

We're not sure exactly why this is, but our hypothesis is that the AI receptionist creates a kind of trust in the platform — once practices see AI doing something well in one area, they're more willing to try it in another. The barrier to adopting the second feature is much lower than the barrier to adopting the first.

This matters for how we build. We've invested more in the cross-product experience — making it easy to activate a new feature after you're already using one — because the data told us that's how practices actually move.

We assumed concerns about AI accuracy would be the main objection. They weren't.

We spent a lot of time preparing answers to "what if the AI gets something wrong?" It's a real concern and a fair one, and we take it seriously.

But the more common objection in actual conversations has been simpler and more practical: "We don't have time to figure this out right now." Not skepticism about the technology. Just capacity. The practice is busy. The person who would set this up is already stretched. Adding something new feels like one more thing.

This is a real constraint, not an excuse. We've responded by making onboarding much lighter — practices can be live in under an hour now, with minimal configuration required upfront. We've also built a clearer picture of what the first two weeks look like, so practices know going in that the time commitment is bounded.

The accuracy question still comes up. But solving the "we don't have bandwidth" problem has moved more practices than solving the "we don't trust the AI" problem.


The honest version of building a product in a new category is that you're continuously correcting your model of the customer. These are a few of the corrections we've made. More are coming.

See it working in your practice.

Takes minutes to set up. Nothing to install. Your existing PMS stays exactly where it is.