Earlier this year we ran an informal research project. We sat down — on calls, at conferences, occasionally over bad convention center coffee — with 50 dentists who had used some form of AI clinical documentation. Some were Marea customers. Some used other tools. A few had tried something and abandoned it.
We asked open questions and let them talk. No survey, no multiple choice. What follows is an honest synthesis of what we heard.
One theme came up repeatedly, in different forms: dentists underestimate how much time documentation takes. Those who'd been writing notes manually for fifteen or twenty years had normalized the time cost. They knew it took a while; they didn't realize how much of their day it was until they had a week without it.
One dentist in Phoenix told us: "I thought I was spending maybe 5 minutes per patient on notes. I timed it after I switched and it was closer to 12. I had no idea."
The gap between perceived and actual time cost was consistent. Almost everyone underestimated how much time they had been spending on notes.
The most common specific concern was whether the notes would sound like them, and it's a legitimate one. AI-generated text has a recognizable quality: it's often grammatically correct, structurally sound, and somehow generic. For a clinical note that might be read by a specialist, an insurance reviewer, or pulled into a legal context, generic is a real problem.
The dentists who were happy with their AI notes said the same thing: it sounds like me. Not like a template. Not like something from a different specialty. Like the way I actually document.
The dentists who had tried something and abandoned it said the opposite. The notes were usable but they weren't theirs. They spent as much time editing as they would have writing. So they stopped.
The difference between these experiences, from what we could gather, came down to how much the system had learned their specific style — and whether the system was built to learn it, or just to produce generic correct output.
Almost everyone asked about privacy. Who has access to the recordings? How long are they stored? Are they being used to train something?
The dentists who asked and got a clear answer moved forward. The ones who asked and got a vague answer — or had to go dig through a terms of service document to find out — walked away. The privacy question isn't an objection to work around. It's a legitimate screening criterion, and practices are right to use it as one.
About a third of the people we talked to described themselves as reluctant adopters — they'd been pushed by a partner, an office manager, a colleague, or just by enough conversations about it that they felt like they had to try it.
Nearly all of them said they were glad they tried it.
The pattern was consistent: resistance driven by unfamiliarity, trial driven by peer pressure or opportunity, attitude change driven by actual experience. The thing that flipped skeptics wasn't better marketing or more compelling demo videos. It was using the product for three weeks.
One dentist in Atlanta, who had actively argued against AI documentation at a study club, started using Marea after a colleague offered to pay for the first month as a bet. He lost the bet. "I told everyone it wouldn't work," he said. "It worked."
About a quarter of the 50 had tried AI documentation and gone back to manual.
The reasons fell into two categories.
The first was a quality problem: the notes weren't good enough to reduce editing time meaningfully. If you're spending 8 minutes editing an AI draft instead of 10 minutes writing from scratch, the savings are too small to justify changing your workflow.
The second was an integration problem. Notes produced in a separate system that had to be manually transferred into their practice management software — that extra step, every patient, every day, was enough friction to kill adoption. If it doesn't connect to where the chart lives, it's a side project, not a workflow change.
Both problems are solvable. The quality problem is a vendor selection question. The integration problem is something you should verify before you start, not after.
The dentists who've had the best experience with AI notes share a few things: they gave it three weeks before judging it, they were willing to review the output rather than assume it was wrong, and they chose a system that had real dental-specific training rather than a general medical product.
The dentists who had the worst experience either tried something that wasn't built for dental, or tried it for a week and abandoned it before the system had time to learn their patterns.
Three weeks is the number we've landed on internally. That's roughly how long it takes for a dentist's documentation style to show up clearly in the output, and for the benefits to become large enough to feel obvious. Before that, it's a bet. After that, it usually pays off.
Marea takes minutes to set up. There's nothing to install, and your existing PMS stays exactly where it is.