How Mayo Clinic’s Data Liquidity Strategy Paid Off During COVID-19

Efforts to harmonize and share data accelerated telehealth collaboration in response to COVID-19.

Mayo Clinic began exploring use cases for artificial intelligence in 2017. An internal work group identified more than 200 activities using AI or machine learning (ML) technology for patient care, research and innovation. That kickstarted a process of building IT infrastructure to organize the healthcare system’s 150 years’ worth of data, and it culminated in signing a 10-year partnership agreement with Google Cloud in September 2019.

Six months later, as the COVID-19 pandemic began to spread across the U.S., this effort to increase access to data across the organization was paying dividends, according to James Buntrock, Mayo Clinic’s vice chair of IT.

“We couldn’t have predicted what would have transpired. We definitely needed data liquidity,” Buntrock said during a webinar sponsored by the Healthcare Information and Management Systems Society (HIMSS) Learning Center. “We had to react very quickly and go after data that gives us better insight into bed management, into personal protective equipment, into staffing. We applied data to different types of contact tracing for employee health.”

Mayo Clinic’s success in putting its data to work for a variety of objectives highlights the importance of developing a holistic data strategy, preparing data for analytics and identifying future use cases, all while keeping clinical workflow and security top of mind.

Set a Solid Foundation for Healthcare Data Initiatives

The first step involved building a solid foundational data layer so disparate data sets could be combined in a meaningful way. This had a two-part goal, said Aashima Gupta, Google Cloud’s director of global healthcare solutions, who participated in the webinar: to enable developers and data scientists to deploy ML projects quickly and, through the use of custom application programming interfaces, to create “empowering digital experiences” at the point of care for providers and patients.

There are three keys to getting the foundational data layer right, said Ilia Tulchinsky, the engineering director for Google Cloud Healthcare and Life Sciences, during the webinar:

  • Integrating data from multiple systems of record, both in large batch updates and incremental real-time updates
  • Harmonizing data to common standards and schema such as Fast Healthcare Interoperability Resources, or FHIR (a minimal sketch follows this list)
  • Modeling data for use in AI and ML applications
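
The webinar didn’t include code, but to make the harmonization step concrete, here is a minimal sketch, assuming a generic FHIR R4 server, of mapping a raw heart-rate reading from a device feed onto a standard FHIR Observation. The endpoint, patient ID and helper function are illustrative placeholders, not Mayo Clinic’s or Google Cloud’s actual pipeline.

```python
import json

import requests  # any HTTP or FHIR client library would work here

# Placeholder FHIR R4 endpoint -- not a real Mayo Clinic or Google Cloud URL.
FHIR_BASE = "https://fhir.example.org/r4"


def heart_rate_to_fhir(patient_id: str, bpm: int, taken_at: str) -> dict:
    """Map one raw device reading onto a standard FHIR R4 Observation."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            # LOINC 8867-4 is the standard code for heart rate.
            "coding": [{
                "system": "http://loinc.org",
                "code": "8867-4",
                "display": "Heart rate",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": taken_at,  # ISO 8601 timestamp from the device
        "valueQuantity": {
            "value": bpm,
            "unit": "beats/minute",
            "system": "http://unitsofmeasure.org",
            "code": "/min",  # UCUM code for "per minute"
        },
    }


# Harmonize one reading and POST it to the store's Observation endpoint.
observation = heart_rate_to_fhir("example-patient-id", 72, "2020-04-01T08:30:00Z")
resp = requests.post(
    f"{FHIR_BASE}/Observation",
    json=observation,
    headers={"Content-Type": "application/fhir+json"},
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```

Once every source system writes readings in this shared shape, downstream AI and ML models can consume one schema instead of one format per device or vendor.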

“We can have the best model in the world, and it can fail to be useful if we don’t get the integration right,” Tulchinsky said. “Bringing the right data at the right time in the right way to a busy healthcare practitioner is paramount to the overall success of AI and ML endeavors.”

Supporting Telehealth and Advancing Collaboration

This model and methodology were put to the test as COVID-19 began to spread. Cloud infrastructure supported the spectrum of telehealth use cases, said Dr. John Halamka, president of Mayo Clinic Platform — not just video visits, but also tools for clinical documentation during visits and telemetry data capture from remote monitoring devices placed in homes or hotel rooms.

The cloud platform also supports larger collaborative efforts. In late March 2020, Mayo Clinic was one of nearly 1,000 organizations to form the COVID-19 Healthcare Coalition, a cross-industry response to the pandemic.

The coalition’s work comprises 15 different work streams, Halamka said, from the delivery of ventilators and personal protective equipment to treatment protocol efficacy and vaccine development. Each work stream has different data requirements. All told, coalition members are posting more than 700 data sets per day and using cloud-hosted services to “extract wisdom” from the data without directly accessing the underlying data sets or compromising their privacy.

“We could not respond to the situational awareness of supply and demand match or understanding cure efficacy and safety without using technology, cloud hosting and analytics,” Halamka said. “This is a new approach for so many organizations that are used to working on on-premises software in silos. In effect, COVID-19 has forced us to collaborate much faster and advance to many more cloud functions than we probably would have without the pandemic.”

Mayo Clinic Brings Innovations to the Forefront Faster

Looking ahead, Mayo Clinic has two key priorities for its data liquidity and analytics efforts, Buntrock said. One is to create a longitudinal patient record, which will ingest different data types and store them using the FHIR standard. The goal is to “reflect an evolving and complete record” that’s more comprehensive than the traditional electronic health record, he said.
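
Buntrock stopped at the concept level, but one way to picture a longitudinal record in FHIR terms is a date-ordered pull of every Observation for a patient across sources. The sketch below uses standard FHIR R4 search and paging against a placeholder endpoint; it is an assumption-laden illustration, not Mayo Clinic’s implementation.

```python
import requests  # any HTTP client would do

# Placeholder FHIR R4 endpoint for illustration only.
FHIR_BASE = "https://fhir.example.org/r4"


def longitudinal_observations(patient_id: str) -> list:
    """Fetch every Observation for one patient, oldest first, by walking
    the paged FHIR search results via each bundle's "next" link."""
    url = f"{FHIR_BASE}/Observation"
    params = {"subject": f"Patient/{patient_id}", "_sort": "date"}
    results = []
    while url:
        bundle = requests.get(url, params=params).json()
        results.extend(entry["resource"] for entry in bundle.get("entry", []))
        # Paged search results advance through the bundle's "next" link,
        # which already carries the query string, so params are cleared.
        next_links = [link["url"] for link in bundle.get("link", [])
                      if link["relation"] == "next"]
        url, params = (next_links[0], None) if next_links else (None, None)
    return results
```

Because every source writes to the same FHIR schema, a single query like this can span vitals from the EHR, telemetry from home devices and lab results without per-source translation.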

The other priority is to develop what Buntrock called an “AI factory” that will let both internal and third-party innovation efforts use the infrastructure, provisioning and computing resources already in place. With technology and data access taken care of, teams can focus on collaboration among stakeholders, such as deploying algorithms to the clinical practice to aid in breast cancer risk prediction.

“We’re looking at the whole package — from development all the way to application — and understanding how this fits into our workflow,” Buntrock said. “Having the data is one thing. It’s table stakes. But doing something with it is very useful.”