
Aug 09 2021
Software

HIMSS21: How NLP Could Be a Game Changer for EHR Workflows

Dr. Yaa Kumah-Crystal of Vanderbilt University Medical Center shares her experience using natural language processing in healthcare.

As clinician burnout linked to workload and the usability of electronic health records continues to make headlines, solutions using artificial intelligence are under development. 

Dr. Yaa Kumah-Crystal, assistant professor of biomedical informatics and pediatric endocrinology at Vanderbilt University Medical Center, is the project lead for the Vanderbilt EHR Voice Assistant (VEVA), and she’s sharing her expertise on natural language processing and her experience integrating Nuance Communications’ ambient technology ahead of her session at HIMSS21.

“I think it’s a really interesting place that we’re emerging in, where our computers can understand the words we’re saying and can respond to us as we would like them to,” Kumah-Crystal tells HealthTech. “When we get to the point where we’re able to just dialogue and have them understand our intent, I think we can also hand off a lot of the more menial work that takes away the pleasure from medicine and focus on the connections that we have with our patients, and the more sophisticated thought work that’s involved in making diagnosis and treatment plans.”

She spoke with HealthTech about maturing EHR capabilities with the help of NLP, how VEVA has grown and how other healthcare systems can take steps toward a similar solution.

HEALTHTECH: Can you talk more about how NLP technologies linked to EHRs have evolved over the past few years? What are the capabilities now that didn't exist before?

KUMAH-CRYSTAL: NLP, and getting that right, is going to be one of the most instrumental things we’re able to do to make the EHR more usable. There’s a lot of content in the EHR, and a lot of it is clinical notes from providers who are trying to communicate the evolution of the patient’s story. Some of that is done in structured notes, but a lot of it is done as free-text notes. When we can get to the point where the EHR can intuitively go through the body of information in the patient’s record and understand, like a normal person would, the natural language, the content, what the theme is, what’s being conveyed in the note, and then summarize that back to a provider, that’s going to be a game changer. We’re taking steps in the right direction. It doesn’t fully exist yet, but I think a lot of people are thinking about how to solve this problem.

HEALTHTECH: Can you share more about your own medical experience in practice and how that led to the development of VEVA?

KUMAH-CRYSTAL: I’m a pediatric endocrinologist. The other hat I wear is in health IT, helping to think through and develop information technologies for providers and care teams. We wanted to figure out if there was a more natural way to interact with the electronic health record by making queries and having it do the coding and the summarization for you. We’ve been partnering with Nuance to leverage some of their technologies that can understand words and parse them out, very much like consumer devices such as Siri, Echo and the Google Assistant are able to do, but in the context of medical speech. Medical jargon is very particular and very precise. Unless your speech model is trained to that, you’re going to miss a lot. If I say, “What’s the patient’s basal rate,” which refers to how much long-acting background insulin they have, and I try to dictate that on my phone using Siri, it’ll come out as B-A-S-I-L, and I’m not talking about food. The difference between that and being able to understand the context of what’s being conveyed, so the system can appropriately match the intent of what a provider is trying to accomplish, is what we’re really trying to steer toward. As we develop these technologies, they hopefully will give a better experience than having to scavenge all around for the information that’s on the computer.
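
To make the “basal” versus “basil” problem concrete, here is a minimal, purely illustrative sketch of how a domain lexicon and a few intent patterns might recover the clinical meaning of a mis-transcribed utterance. The vocabulary, intent names and matching logic below are hypothetical and are not drawn from VEVA or Nuance’s products; a medically trained speech model would handle this inside the recognizer itself rather than as a post-processing step.

```python
# Purely illustrative: a toy intent matcher showing why a medical
# vocabulary matters. The lexicon, intents and logic are hypothetical,
# not VEVA's or Nuance's actual implementation.
from typing import Optional

# A consumer recognizer with no medical context might transcribe the
# homophone as "basil"; a domain lexicon maps it back to the clinical term.
MEDICAL_HOMOPHONES = {"basil": "basal"}

# Hypothetical intent patterns keyed on clinical phrases.
INTENT_PATTERNS = {
    "get_basal_rate": ["basal rate"],
    "get_a1c": ["a1c", "hemoglobin a1c"],
}


def normalize(transcript: str) -> str:
    """Swap known homophone mis-transcriptions for the medical terms."""
    words = transcript.lower().replace("?", "").split()
    return " ".join(MEDICAL_HOMOPHONES.get(w, w) for w in words)


def match_intent(transcript: str) -> Optional[str]:
    """Return the first intent whose phrase appears in the normalized text."""
    text = normalize(transcript)
    for intent, phrases in INTENT_PATTERNS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None


if __name__ == "__main__":
    # "basal rate" dictated to a general-purpose assistant often comes
    # back as "basil rate"; the lexicon recovers the intended query.
    print(match_intent("What's the patient's basil rate"))  # get_basal_rate
```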

READ MORE: What Microsoft’s Nuance acquisition means for the healthcare industry.

HEALTHTECH: How has VEVA grown over the years?

KUMAH-CRYSTAL: It started with the idea of, why can I talk to my phone and find out the weather or access my shopping list, but I can’t talk to my computer and find out my patient’s labs or add orders to my cart? The concept’s the same, the modality is a little bit different. Medical language is more complex. Right now, we’ve evolved a lot in terms of being able to help with ordering health maintenance; that’s been the biggest skill that we’ve come up with that providers are using a lot and really enjoying. In pediatric diabetes, there are a slew of labs that we have to order regularly on patients to screen for diseases that they’re very susceptible to, but figuring out which labs and the different criteria based on how long they’ve had diabetes, what their last lab was and whether it was abnormal or not — there are a lot of things you have to consider to figure out which labs you want to order, and that could easily take five to 10 minutes depending on how efficient you are and how buried that other information is in the electronic health record. By developing a simple skill, we’ve gotten great feedback that that’s a real timesaver. We’re working closely with our EHR vendor, Epic, to build these word and voice workflows as part of the regular EHR to scale it out. Epic is working on the Hey Epic! voice assistant, and we’re collaborating with them to think through how we can make these skills and workflows something people use as part of their daily routines, and what kind of things people want to be asking for on a more global level.
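
As a rough illustration of the kind of rule logic such a health maintenance skill encodes, the sketch below suggests screening labs from a handful of made-up criteria. The labs, intervals and thresholds are hypothetical examples chosen for clarity, not VEVA’s or Epic’s actual clinical rules.

```python
# Purely illustrative: the kind of rule logic a health-maintenance skill
# might encode. The labs, intervals and thresholds are made-up examples,
# not VEVA's or Epic's actual clinical criteria.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Patient:
    diabetes_onset: date
    last_a1c_date: date
    last_a1c_value: float
    last_thyroid_screen: date  # hypothetical annual screening lab


def labs_due(p: Patient, today: date) -> list:
    """Suggest screening labs from simple, hypothetical rules."""
    due = []

    # Example rule: recheck A1C sooner if the last value was elevated.
    a1c_interval = timedelta(days=90) if p.last_a1c_value >= 7.0 else timedelta(days=180)
    if today - p.last_a1c_date >= a1c_interval:
        due.append("Hemoglobin A1C")

    # Example rule: annual screening once diabetes duration exceeds a year.
    if (today - p.diabetes_onset > timedelta(days=365)
            and today - p.last_thyroid_screen >= timedelta(days=365)):
        due.append("Thyroid screen")

    return due


if __name__ == "__main__":
    patient = Patient(
        diabetes_onset=date(2018, 3, 1),
        last_a1c_date=date(2021, 4, 1),
        last_a1c_value=8.2,
        last_thyroid_screen=date(2020, 6, 1),
    )
    print(labs_due(patient, today=date(2021, 8, 9)))
    # ['Hemoglobin A1C', 'Thyroid screen']
```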

HEALTHTECH: What have you noticed about patient reactions to the use of VEVA?

KUMAH-CRYSTAL: I'm a pediatrician, so I have generally younger patients, and their parents as well, but for the most part, they're like, “Oh that’s interesting. What is that?” and the kids are generally just very fascinated by it. The technology is all still very new, and there’s this other component outside of the medical realm with people having concerns about privacy and things like that with these smart speakers, wondering what's happening to their health information. I try to be as clear as I can. The way we built this is very different than the Echo you might have in your house. That’s generally reassuring, but I think we have to do a lot of work and education to help explain to people how these workflows are managed, how their data is treated and respected. Your health data is the most personal thing about you, and you want to make sure that people are comfortable with the way you’re managing their data, because they’re trusting you with it.

“Your health data is the most personal thing about you, and you want to make sure that people are comfortable with the way you’re managing their data, because they’re trusting you with it.”

Dr. Yaa Kumah-Crystal, Assistant Professor of Biomedical Informatics and Pediatric Endocrinology, Vanderbilt University Medical Center

HEALTHTECH: What about other provider experiences?

KUMAH-CRYSTAL: We’re still gathering data for time efficiencies and things like that for VEVA, but anecdotally, a provider will say, “Oh, this easily saved me three to five minutes per patient because I’d have had to go hunting around for it before.” What we’re really looking at is whether using VEVA helps providers more efficiently order these labs, because what we found in our analysis of health maintenance behavior is that if people don’t have the time and it’s just too hard to do, they’re just not going to do it. But if we make it easy to do the right thing, we actually motivate people to continue to do the processes that are important for the delivery of care. We’re going to try to do an analysis comparing the rate of lab ordering when using this workflow versus when we didn’t have anything in place to facilitate it.

HEALTHTECH: What are the next steps for VEVA? Any future plans?

KUMAH-CRYSTAL: I’m really excited about our collaboration with Epic, because it lets us work to scale the platform as a native foundation skill. What workflows does it make the most sense to use this in? If you’re by yourself in your office, or if you’re rounding with a group of residents and want to ask for information really quickly to help fill in the gap while someone’s giving a report? Exploring those different workflows, I think, is going to be one of the more important things we can learn, and also figuring out how to ask more complex questions. Right now, it’s very single-thread, like, “What’s the sodium? What’s the A1C? Tell me her weight.” But what if I could ask whether she’s had an admission over the past three months for pneumonia? There are three different variables in there that you have to take into consideration as you’re parsing that out. Can we make that language model complex enough to handle that level of nuance? That’s the real value proposition: to be able to simplify the work that providers are doing by being able to handle all these things. But by virtue of the fact that that’s what takes time, that’s what’s going to be hard to build as well.
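
The “admission over the past three months for pneumonia” example involves pulling several pieces of meaning out of one utterance. The toy parser below, with hypothetical patterns and no claim to match VEVA’s or Epic’s implementation, shows the three slots a more capable language model would need to extract: the event type, the condition and the time window.

```python
# Purely illustrative: extracting three "slots" from a compound question.
# A production language model would do this far more robustly; the
# patterns here are hypothetical.
import re
from datetime import date, timedelta


def parse_admission_query(utterance: str, today: date) -> dict:
    """Extract event type, condition and time window from a question like
    'Has she had an admission over the past three months for pneumonia?'"""
    text = utterance.lower()

    # Slot 1: the event being asked about (toy check).
    event = "admission" if "admission" in text else None

    # Slot 2: the time window, e.g. "past three months" (toy number-word handling).
    number_words = {"one": 1, "two": 2, "three": 3, "six": 6, "twelve": 12}
    window = None
    m = re.search(r"past (\w+) months?", text)
    if m and m.group(1) in number_words:
        months = number_words[m.group(1)]
        window = (today - timedelta(days=30 * months), today)

    # Slot 3: the condition, taken naively as the word after "for".
    cm = re.search(r"for (\w+)", text)
    condition = cm.group(1) if cm else None

    return {"event": event, "condition": condition, "window": window}


if __name__ == "__main__":
    print(parse_admission_query(
        "Has she had an admission over the past three months for pneumonia?",
        today=date(2021, 8, 9),
    ))
    # {'event': 'admission', 'condition': 'pneumonia',
    #  'window': (datetime.date(2021, 5, 11), datetime.date(2021, 8, 9))}
```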

HEALTHTECH: What advice do you have for other health systems looking for a solution similar to VEVA?

KUMAH-CRYSTAL: I would say connect with us. Also, start thinking through what your use cases are and what your providers’ biggest pain points are. Talk to your EHR vendor, because a lot of them are starting to do work to look into this space. Figure out what the lowest-hanging fruit is — what is one skill that you can work on that can really save people a lot of time? Get your champions and superusers involved in the design early in the process. Don’t be afraid to fail fast, because that’s how you learn, that’s how you get better, by going through figuring out what doesn’t work so you can find the gems that do.

Keep this page bookmarked for our ongoing virtual coverage of HIMSS21. Follow us on Twitter at @HealthTechMag and join the conversation using the hashtags #HIMSS21 and #CDWHIMSS.

Photography by William DeShazer