Mar 24 2020

Artificial Intelligence in Healthcare: Why Transparency Matters

Predictive capabilities hold great potential in medicine. Just as important: making those functions visible and relatable to patients.

The growing presence of artificial intelligence in care delivery continues to show potential to reduce physician burnout and personalize treatment.

Progress is coming in many forms, including automated scan analyses, symptom-checking chatbots and extra help with medical coding and billing. 

“At the end of the day, if we’re looking at how we can impact care and we want to do so in a way that’s going to be helpful,” said Ed Shaffer, healthcare and life sciences informatics director at Dell EMC, “then I think we’re in a state where we can’t continue to do things the way we’ve done them.” 

The shift is underway: 85 percent of Americans commonly use AI in some form, a GE survey finds, and the use of AI could create $150 billion in annual savings for the U.S. healthcare economy by 2026, according to a recent Accenture report.

Still, advancing the movement requires developers to determine safe and appropriate use cases for AI — as well as staff who must learn to integrate the augmented offerings into their workflows and justify those functions to the public.

That was the key takeaway from a recent panel on AI’s evolving capacity to enhance clinical decision-making, featuring Shaffer and other healthcare thought leaders and held virtually due to the cancellation of the HIMSS20 conference.

AI can yield lifesaving insights and efficiencies. But patients must be fully informed, and all parties need to acknowledge that care can’t be fully outsourced.

“Fairness, accountability and transparency need to go into the design and your relationship with patients and their caregivers,” Corinne Stroum, a director at KenSci, an AI-powered risk prediction platform for healthcare, said during the panel discussion.

How AI Enhances Medical Diagnosis and Treatment

Speaking last month at CDW•G’s AI Showcase at Rutgers University, panelists shared similar sentiments about the challenges and opportunities ahead.

Applications discussed include virtual assistants that guide patients with medication reminders; algorithms and machine learning that detect cancer and heart disease more accurately; and robot-assisted therapy for recovering stroke patients.

“We’re getting to a point where machines can start telling us things we don’t know,” Jeremey Wise, a CDW solution architect for AI and deep learning, said during the Rutgers panel.

Dell EMC’s Shaffer, speaking at the virtual HIMSS20 discussion, praised the growing use of natural language processing — a branch of AI that allows computers to understand spoken or typed remarks — for saving time and reducing errors by seamlessly transmitting data to the electronic health record (EHR).

He also sees AI-driven capabilities aiding triage: “If you’ve got a workload of 25 patients and you have three that are late-stage cancer patients, you’d really like to get to those caseloads first,” Shaffer said. “Absolutely, we can get scale and do more with less and spend more time at the bedside ... being the analyst instead of the detective.” 

At Halifax Health in Florida, AI- and voice-powered documentation tools (including Nuance’s Dragon Medical Advisor) help clinicians make decisions in real time based on submitted data, said Tom Stafford, Halifax Health’s former CIO. The tools could prompt a provider who suspects pneumonia to also explore the possibility of sepsis, for example.

Put into practice, “the AI engine is saying, ‘Hey, I think you weren’t specific enough,’” Stafford, now CTO for CDW Healthcare, told HealthTech in a recent interview. “A physician can look at that — and they can agree or disagree — but they can make that change right then and there.”

Halifax Health improved its physician engagement rate by almost 40 percent with the help of the Dragon Medical Advisor.

Targeting Problems and Perceptions to Advance AI in Healthcare

Developers and providers must continue to address roadblocks. These include varying perceptions of artificial intelligence — a debate of “creepy versus cool” among patients of different ages and demographics, Shaffer said.

Another challenge is leveraging vast pools of data to guide the evolution of AI programs that make clinicians’ jobs easier and don’t introduce uncertainties, said Dr. Rasu Shrestha, chief strategy officer and executive vice president at Atrium Health in Charlotte, N.C.

“The signal-to-noise ratio isn’t what it needs to be,” Shrestha said during the virtual HIMSS20 panel. “We have to get to the right balance so we’re able to prioritize and not play the role of detective, dealing with all of that data hitting us.” 

There’s also the concern that AI and software tools that enable robotic process automation will eliminate the jobs of healthcare workers. 

Such automation, panelists agreed, may affect some administrative functions but won’t replace the core work of delivering care.

“Most of us would not go to a physician or specialist and have the surgery performed automatically,” Shaffer said. “We would still have some sort of human intervention as part of our care management.”

Stafford considers AI a key complement. “Our brains are not as strong as AI engines. We have other things we’re focused on. When that AI engine is looking at the chart, it’s using a set of rules and learning to give guidelines to a physician.”

Keep this page bookmarked for our ongoing virtual coverage of HIMSS20. Follow us on Twitter at @HealthTechMag and join the conversation using the hashtags #VirtualHIMSS20 and #CDWHIMSS.
