Artificial intelligence is positioned to transform healthcare, even if AI’s most alluring and headline-grabbing promises—like predictive diagnostics that anticipate illness before symptoms appear—remain projections for the future. While much of the spotlight is on what is to come, a quiet AI transformation is underway.
A survey conducted by the Medical Group Management Association in the summer of 2024 found that 42% of medical group leaders reported using some form of ambient listening AI.1 These systems, sometimes called “AI scribes,” capture interactions between physicians and patients through discreet microphones installed in examination rooms. The AI then generates suggested notes and billing codes for physicians to review and enter into the medical record.
With the promise of increased efficiency in the documentation process, it is easy to see why physicians are drawn to this technology. A 2024 American Medical Informatics Association (AMIA) survey revealed that nearly 75% of healthcare professionals believe the time and effort required for documentation impedes patient care.2 In another study, over 77% of respondents indicated they often work later than desired or take work home due to excessive documentation tasks.3 Less time spent documenting allows physicians to spend more time with patients and potentially helps combat physician burnout.
The benefits of this technology are obvious and potentially transformational for physicians, but AI also brings new risks to consider, not the least of which is patient consent. Recent evidence suggests that most patients are skeptical about the use of AI in healthcare. In a Pew Research Center survey published in 2023, 60% of respondents reported they would feel uncomfortable if their provider relied on AI for their medical care.4 To help alleviate such concerns, physicians should have patients execute a detailed consent form that explains how the ambient listening system works, what data is preserved, and the system’s deletion policy.
Physicians should also obtain and document a patient’s verbal consent at every visit before activating the listening system. This ensures the patient remains comfortable having their protected health information shared with the system. It is also a legal necessity in states that require consent from all parties to a recorded conversation.
Discoverability is another issue physicians should be mindful of when utilizing this technology. By now, it is well known that most, if not all, malpractice litigation includes a discovery request for all relevant digital communications and metadata stored in the electronic medical record. Such data has provided fertile ground for plaintiffs’ attorneys seeking to weave a narrative in favor of their client, and ambient listening AI could be even more problematic.
While physicians are typically only privy to the AI-generated notes, ambient listening systems may also capture and store a raw audio recording of the entire patient encounter. Whether this data is retained depends on the design and configuration of the specific system. As such, physicians and practices must work with vendors to determine whether their chosen system stores complete audio recordings. If a system retains a complete audio recording of the patient-physician interaction, it will undoubtedly be discoverable in litigation.
In a worst-case scenario, there could be an inconsistency between the note in the EMR and the audio recording. Even a seemingly minor inconsistency could undermine the accuracy and reliability of the entire medical record. It may also be used to suggest that the physician failed to review the AI-generated notes adequately. Negative optics of this nature can derail otherwise defensible cases.
Against this backdrop, there is scant justification for retaining these audio logs once the AI-assisted note has been accurately entered into the electronic medical record. Practices should therefore implement clearly articulated retention policies for all data captured by the AI system that is not added to the medical record. Beyond preventing the accumulation of unnecessary discoverable data, a well-defined retention protocol that is consistently followed should ward off allegations of spoliation. Collaboration with vendors will be needed to ensure the chosen retention protocol is actually in place.
AI is already reshaping how healthcare operates, and these are just a few risk issues that need to be considered. As this technology evolves, its integration into everyday medical practice will only deepen. Amid these rapid advancements, physicians must remain vigilant to emerging risks, even as they navigate the often-dazzling promise of innovation.
Risk Recommendations
Physicians utilizing ambient listening AI systems should consider the following risk management steps:
- Develop clear, documented patient consent protocols.
- Implement policies for retention and destruction of audio data.
- Train providers on what is being captured and how to communicate accordingly.
- Engage legal counsel in evaluating how these systems intersect with controlling discovery rules.
References
- Medical Group Management Association. Ambient Listening AI in Clinical Practice: A White Paper by NextGen Healthcare. 2024. https://www.mgma.com/getkaiasset/b02169d1-f366-4161-b4d6-551f28aad2c9/NextGen-AmbientAI-Whitepaper-2024-final.pdf
- American Medical Informatics Association. “AMIA Survey Underscores Impact of Excessive Documentation Burden.” AMIA, January 17, 2024. https://amia.org/news-publications/amia-survey-underscores-impact-excessive-documentation-burden
- Siwicki, Bill. “AMIA: Documentation Burden Impacting Patient Care.” Healthcare IT News, January 17, 2024. https://www.healthcareitnews.com/news/amia-documentation-burden-impacting-patient-care
- Pew Research Center. “60% of Americans Would Be Uncomfortable with Provider Relying on AI in Their Own Health Care.” February 22, 2023. https://www.pewresearch.org/science/2023/02/22/60-of-americans-would-be-uncomfortable-with-provider-relying-on-ai-in-their-own-health-care/
