LLMs and EHR Integrations
The clinical documentation use case has seen significant activity in recent months. The news site Healthcare IT Today lists nearly 20 ambient clinical voice vendors whose AI-powered tools document clinical visits and automatically generate summaries. These vendors range from recently launched startups to Nuance, a speech recognition company that debuted in 1992 and was acquired by Microsoft three decades later.
A major contributor to the success of these tools is their ability to integrate with electronic health record systems, the applications where clinical staff spend the bulk of their day. It comes as no surprise that major EHR vendors such as athenahealth, eClinicalWorks, Epic, NextGen Healthcare and Cerner owner Oracle Health have released (or are working on) their own ambient AI tools, often in conjunction with partners.
“Ambient listening is freeing up doctors and nurses from spending hours on tedious documentation, allowing them to dedicate more time to interact with patients as well as reduce burnout,” Ananth says.
An additional benefit of integrating LLMs into EHR systems is the ability to layer on models that read and respond to portal messages from patients. Some health systems have adopted such models to help clinicians manage a growing influx of inbound messages, many of which they otherwise end up answering outside of normal business hours because clinical schedules leave little time during the day.
In multiple studies, these LLMs have shown their effectiveness in responding to patients while reducing physician workloads, though concerns remain about accuracy and the amount of time clinical staff spend generating messages.
- Mass General Brigham found ChatGPT’s messages were appropriate for patients without any edits from physicians 58 percent of the time, though about 8 percent of recommended responses posed a risk of harm. Overall, ChatGPT responses were more educational but less directive than responses from physicians. “LLM assistance is a promising avenue to reduce clinician workload but has implications that could have downstream effect on patient outcomes,” the paper concluded.
- One Stanford study found physicians using AI models to generate responses spent 22 percent longer reading the messages, and sent messages that were 18 percent longer. Some messages needed to be edited to remove clinical advice outside the scope of the patient’s original question. That said, much of the additional language was attributed to “those extra personal touches that are highly valued by patients” in messages, such as an empathetic tone.
- Another Stanford study pointed to improvements in perceived burnout and administrative burdens despite no time savings when using LLMs to respond to messages. “It may be that switching from writing to editing may be less cognitively taxing despite taking the same amount of time,” researchers concluded, adding that “perceptions of time may be different from time captured via EHR metadata.”
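The drafting workflow these studies evaluate is straightforward to prototype. The sketch below, which assumes the OpenAI Python SDK along with an illustrative model name, system prompt, and review step (none of which are specified by the studies above), drafts a reply to a portal message for a clinician to edit before sending.

```python
# Minimal sketch: draft a patient portal reply for clinician review.
# Assumes the OpenAI Python SDK; the model name, system prompt, and
# review workflow are illustrative, not taken from the studies above.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You draft replies to patient portal messages for a clinician to review. "
    "Answer only the question asked, use an empathetic tone, and do not give "
    "clinical advice beyond the scope of the patient's message."
)

def draft_reply(patient_message: str) -> str:
    """Return a draft reply; a clinician must edit and approve it before sending."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": patient_message},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content

draft = draft_reply(
    "My blood pressure readings have been higher this week. Should I change my dose?"
)
print(draft)  # surfaced in the clinician's in-basket as an editable draft, never sent directly
```

In a real deployment, the draft would appear inside the EHR in-basket rather than being sent automatically; that editing step is exactly what the Stanford studies measured.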
AMIE and Other LLMs from Big Tech
Big technology companies are also getting into the LLM game. Amazon Web Services, Google and Microsoft all have released AI-powered documentation tools, though all three vendors have their eyes on bigger prizes.
In January, Google announced the Articulate Medical Intelligence Explorer. For now a research-only system, AMIE has been “optimized for diagnostic reasoning and conversations,” according to Google. It’s meant to help determine a patient’s possible diagnosis based on the information the patient provides to a text-based chat. This announcement came on the heels of MedLM, which HCA Healthcare has been piloting to support documentation in the emergency department.
Beyond the work of Nuance, Microsoft last October announced Azure AI Health Bot, a cloud service that comes with a symptom checker and medical database and is meant to help organizations build their own AI-powered healthcare chatbots. According to the company, insurers are using the service to help members check the status of a claim or see what services are covered under their insurance plans. Providers have deployed instances that let patients find a nearby doctor or determine the appropriate care setting for their symptoms.
AWS is similarly focused on providing the foundation for cloud-based model development through its managed service known as Amazon Bedrock. Provider organizations have built AWS-hosted LLMs for data extraction and real-time analysis to create discharge summaries and identify at-risk patients, among other use cases.
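As a rough illustration of what a Bedrock-hosted workflow of that kind might look like, the sketch below uses the boto3 Bedrock Runtime Converse API to turn clinical notes into a draft discharge summary. The model ID, region, and prompt are assumptions for illustration, not details from any of the deployments described here.

```python
# Minimal sketch: draft a discharge summary from clinical notes via Amazon Bedrock.
# Assumes boto3 with Bedrock access; the model ID, region, and prompt are illustrative.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def draft_discharge_summary(clinical_notes: str) -> str:
    """Ask a Bedrock-hosted model for a draft summary that clinicians review before signing."""
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # illustrative model choice
        messages=[
            {
                "role": "user",
                "content": [
                    {
                        "text": (
                            "Summarize the following clinical notes as a draft discharge "
                            "summary with diagnoses, medications, and follow-up "
                            "instructions:\n\n" + clinical_notes
                        )
                    }
                ],
            }
        ],
        inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

notes = "Day 1: admitted with community-acquired pneumonia..."  # de-identified example text
print(draft_discharge_summary(notes))
```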