AI Governance in Healthcare: What the AMA’s New Policy Means for MedTech Sales and Adoption

Clinicians and healthcare administrators are broadly optimistic about AI's potential in healthcare and are increasingly adopting AI solutions. Advocates point to long-term healthcare challenges that AI could help address, including driving the shift to value-based care, improving financial sustainability, and supporting workforce retention.

But at the same time, medical leaders and ethicists have emphasized the need for careful implementation plans. The American Medical Association (AMA) notes concerns about AI inadvertently worsening bias in care, increasing privacy risks, or confidently offering incorrect diagnoses or treatments. So although AI adoption in healthcare is accelerating, MedTech sales leaders must adapt to evolving governance standards.

AI Adoption in Healthcare: Providers Are Cautious with Diagnostics

Providers and facilities have so far embraced AI tools that reduce workflow burdens while taking a slower approach to diagnostic solutions. A 2024 survey found that 66% of physicians were using AI in some capacity, but only 12% were using it for assistive diagnosis, and 34% were not using AI at all.

Source: Governance for Augmented Intelligence, AMA 2025

Hesitancy to use AI for diagnostic or treatment decisions may grow following a recent study in the Journal of the American Medical Association (JAMA). The study points out that nearly all artificial intelligence-enabled medical devices (AIMDs) are cleared through the FDA's 510(k) review, which does not require human testing and provides only limited clinical evaluation. That easier path to market may be contributing to recall issues with AIMDs.

Study findings show that among 950 AIMDs, 6.3% were associated with recalls, with an average of three recalls per faulty device. While the overall recall rate is close to that of non-AI 510(k)-cleared devices, nearly 44% of AIMD recalls occurred within the first 12 months of release, roughly double the early-recall rate of non-AI 510(k) devices, and the majority of AIMD recalls were related to diagnostic or measurement errors.

To help providers navigate this rapidly developing market, the AMA recently released a toolkit to help establish a governance framework for implementing and managing AI solutions. “Now is the time to do this hard work to set ourselves up for success. It’s what is going to allow these tools to actually reach the potential that we hope that they will reach,” noted Margaret Lozovatsky, MD, chief medical information officer and vice president of digital health innovations at the AMA.

The toolkit lays out foundational pillars of responsible AI adoption, including practical steps such as assessing existing policies and creating AI-specific ones, forming working groups to share developments and lessons learned, and establishing an oversight and monitoring process for AI tools.

How AI Governance Impacts MedTech Sales Teams

Healthcare AI governance policies will directly influence how medical device companies need to position their AI-driven solutions. Providers can already be reluctant to move away from technology they are comfortable with, and the AMA guidance and the JAMA study will only increase that reluctance to adopt AIMDs without a clear benefit to doing so.

Companies will also need to build an ROI case showing that their AIMD clearly improves clinical and financial outcomes. An ROI pitch that uses longitudinal patient data to show how an AIMD improves patient outcomes, shortens the time between diagnosis and treatment, or allows a provider or facility to treat more patients in their region will go a long way toward overcoming questions about a new technology's benefit.

Governance will also change who the ultimate decision maker is for a purchase. Value analysis committees (VACs) have become prominent in purchasing, and MedTech sales teams have established playbooks for pitching to them. But hospitals and health systems that implement the AMA governance framework could add AI-focused stakeholders to the existing VAC review process, or introduce a separate AI analysis committee as a new step. The ability to succinctly present context and hard data around patient leakage, referrals, diagnoses, and treatments will be critical for convincing these committees that your AIMD improves clinical and financial outcomes.

And although individual physicians are only one of many stakeholders in the sales cycle, key opinion leaders (KOLs) remain influential in successful product launches. These voices will become even more important in encouraging the adoption of AIMDs. With most clinicians taking a wait-and-see approach to diagnostic AI tools, having a champion who can speak to better outcomes or improved efficiency with AI technology will be critical. MedTech reps who can quickly and easily leverage their champion's peer network can expand product usage and build comfort with their AIMD.

Balancing AI Adoption and Risk in Healthcare

The conversation around AI in healthcare is already shifting from hype to hard questions. Credibility and trust will be won not through bold claims, but through clinical rigor, transparency, and a clear demonstration of value. MedTech companies that can address the concerns of clinicians, administrators, and professional organizations, and build ROI cases rooted in measurable improvements, will not only navigate the current wave of skepticism but position themselves as long-term leaders in a transformative market.

See how AcuityMD can change the way you build customer relationships

Request a Demo