Imagine using computer programs to analyze and interpret the results of X-rays, MRIs and other patient medical images.
Thanks to artificial intelligence technologies like deep learning, some healthcare organizations have already started down that path.
Take the Massachusetts General Hospital and Brigham and Women’s Hospital Center for Clinical Data Science (CCDS) in Boston. Using an Nvidia integrated system designed specifically for AI applications, researchers there trained a deep neural network to sift through a library of radiology records, 10 million strong. Eventually, the system will be able to quantify biological tissue more precisely than a human, says Dr. Mark Michalski, executive director of CCDS.
If implemented properly, deep learning and related technology could prove a boon to the profession rather than a threat, automating simpler, more routine tasks and allowing radiologists to focus their time and energy on more complex, ambiguous and high-priority aspects of the job.
Analysts with IDC estimate that worldwide spending on AI technologies will substantially increase over the next three years, reaching $46 billion by 2020. The field has so much momentum that the American College of Radiology announced in May the formation of the ACR Data Science Institute, which will work with the federal government and the radiology industry on appropriate development and deployment of AI tools to help radiologists improve care.
“We think it’s important to provide guidance on best use,” says Dr. William Thorwarth, CEO of the ACR. “All of this, in our minds, holds the potential to facilitate the radiologist’s optimal contribution to team-based healthcare.”
Michalski and Keith Dreyer, who serves as the chief data science officer for the radiology departments at Massachusetts General and Brigham and Women's, are hopeful that the technology will someday be able to measure brain changes that, for example, mark the onset of Parkinson's disease but are too small for an unaided human to notice reliably.
Quantification is just the start, Dreyer says. As the project progresses, he and Michalski expect the system to support an increasingly broad range of applications.
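At its simplest, the quantification Michalski and Dreyer describe comes down to measuring labeled tissue in a scan. The sketch below is illustrative only, assuming a deep segmentation network has already produced a binary mask (the network itself is out of scope); the function name and voxel spacing are hypothetical:

```python
import numpy as np

def tissue_volume_mm3(mask: np.ndarray, voxel_spacing_mm: tuple) -> float:
    """Volume of segmented tissue from a binary mask and voxel spacing.

    `mask` is a 3-D array of 0/1 labels, e.g. the output of a deep
    segmentation model. Volume is voxel count times per-voxel volume.
    """
    voxel_volume = float(np.prod(voxel_spacing_mm))  # mm^3 per voxel
    return int(mask.sum()) * voxel_volume

# Toy example: 8 labeled voxels at 1 x 1 x 2 mm spacing -> 16 mm^3.
mask = np.zeros((4, 4, 4), dtype=np.uint8)
mask[1:3, 1:3, 1:3] = 1  # a 2 x 2 x 2 block of labeled voxels
print(tissue_volume_mm3(mask, (1.0, 1.0, 2.0)))  # 16.0
```

Tracking a measurement like this across serial scans is what would let software flag subtle changes over time.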
Similar work is underway at the University of California, Los Angeles Health, where radiologists leveraged deep learning to build a system capable of answering standard patient care questions from attending physicians and other clinicians via chat.
Dubbed the “virtual interventional radiologist,” the system is intended to “automate the really low-level, simple stuff that takes up 50 percent of a radiologist’s time,” says programmer Dr. Kevin Seals, a resident physician in radiology. The system automatically transfers more complex queries to a human radiologist but becomes progressively smarter as it continues to encounter new scenarios.
Seals and his team developed the application, which is similar to online customer service chats that are fueled by AI technology.
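The routing behavior Seals describes can be sketched as a confidence threshold: queries the system can answer confidently get a canned response, and anything uncertain is escalated to a human. Everything here is a hypothetical illustration, not UCLA's actual implementation; the threshold, question keys, and placeholder responses are invented for the example:

```python
# Hypothetical triage sketch; names, threshold, and responses are invented.
CONFIDENCE_THRESHOLD = 0.85

# Placeholder canned responses paired with a mock confidence score.
KNOWN_ANSWERS = {
    "mri prep": ("Canned answer about MRI preparation.", 0.95),
    "contrast timing": ("Canned answer about contrast timing.", 0.90),
}

ESCALATION = "Escalated to an on-call radiologist."

def route_query(query: str) -> str:
    """Answer routine queries directly; escalate anything below threshold."""
    answer, confidence = KNOWN_ANSWERS.get(query.lower(), (None, 0.0))
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer
    return ESCALATION

print(route_query("MRI prep"))            # canned answer
print(route_query("complex tumor case"))  # escalated to a human
```

In a real system the lookup table would be replaced by a trained model, and logging the escalated queries is what would let the bot grow “progressively smarter” over time.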