
Jul 15 2019

3 Ways Google Is Taking Healthcare Tech to New Heights

The tech giant is harnessing patient data and AI to develop a host of tools designed to transform the delivery of care.

Recent business moves by Google and its parent company, Alphabet, show that the tech giant is investing heavily in artificial intelligence and data in hopes of reinventing the $3 trillion U.S. healthcare industry.

Staffing and corporate acquisitions are chief among them. In November, Google hired David Feinberg, the former CEO of Geisinger Health, to lead the company’s Google Health initiative. Days later, it absorbed DeepMind Health, part of a British artificial intelligence company that produced an AI-powered assistant for nurses and doctors.

Both strategies are key to developing a wide range of complex and intuitive tools. 

“AI holds the potential for some of the biggest advances we are going to see,” Google CEO Sundar Pichai said last year at a town hall event in San Francisco. “You know whenever I see the news of a young person dying of cancer, you realize AI is going to play a role in solving that in the future, so I think we owe it to make progress.”

Here’s a look at some of the transformative technologies in development:

MORE FROM HEALTHTECH: Check out what the FDA has to say about AI-based medical devices.

1. AI Helps Doctors Complete Routine Tasks

Google, in conjunction with Stanford Medicine, is beefing up an early-stage research project called Medical Digital Assist as it explores ways to use artificial intelligence to improve visits to the doctor’s office.

The primary aim: leveraging speech and voice recognition technologies to help physicians with note-taking and paperwork.

This tool, which can listen in on conversations between a doctor and patient, not only transcribes dialogue but also takes relevant notes automatically to help care teams coordinate more effectively, Android Headlines reports.

By picking up on key words in these conversations, the system is able to interpret medical terminology and, in turn, decipher which parts of the conversation are of particular importance.

Still, the tool is far from foolproof. The technology has an error rate of about 20 percent, according to a recent study; Google and Stanford continue to train the AI system to improve clinical outcomes.
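To make the idea concrete, here is a minimal sketch in Python of the kind of keyword spotting described above: scan a visit transcript for known medical terms and surface the lines worth keeping in a note. The vocabulary, transcript and function names are invented for illustration and are not part of Google’s or Stanford’s system.

```python
# Toy sketch of keyword-based relevance, not Google's Medical Digital Assist:
# flag the lines of a visit transcript that mention medical terminology.
import re

# Hypothetical vocabulary; a real assistant would draw on a full clinical ontology.
MEDICAL_TERMS = {"hypertension", "metformin", "dizziness", "blood pressure", "dosage"}

def flag_relevant_lines(transcript: str) -> list[str]:
    """Return transcript lines that contain at least one known medical term."""
    relevant = []
    for line in transcript.splitlines():
        lowered = line.lower()
        if any(re.search(rf"\b{re.escape(term)}\b", lowered) for term in MEDICAL_TERMS):
            relevant.append(line.strip())
    return relevant

visit = """Doctor: How has the dizziness been since we adjusted the metformin dosage?
Patient: Better, though my blood pressure readings are still a little high.
Patient: Also, my daughter's wedding is next month, so it's been busy."""

for note in flag_relevant_lines(visit):
    print("NOTE:", note)
```

A production assistant would pair this kind of filtering with full speech recognition and a far larger clinical vocabulary.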

2. Google Uses AI to Help Cure Blindness

Three years ago, Google announced the development of an image library that helps the company and other organizations train AI models to detect diseases such as diabetic retinopathy, one of the world’s fastest-growing causes of blindness.

In this case, Google used collected images of the retinal fundus, or the interior lining of the eye, to train a computer vision system that could read those scans.
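As a rough sketch of how such a model is typically trained (not Google’s actual code or data), the snippet below fine-tunes a standard pretrained image classifier on folders of labeled fundus photos. The directory name, image size and five-grade severity scale are assumptions for the example.

```python
# Illustrative transfer-learning setup for grading fundus photos; paths and
# labels are hypothetical, and this is not Google's published model.
import tensorflow as tf

IMG_SIZE = (299, 299)
NUM_CLASSES = 5  # assumed diabetic retinopathy severity grades 0-4

# Assumes labeled fundus photos sit in one folder per grade under ./fundus_train
train_ds = tf.keras.utils.image_dataset_from_directory(
    "fundus_train", image_size=IMG_SIZE, batch_size=32)

# Reuse an ImageNet-pretrained backbone and train only a new classification head
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(scale=1.0 / 127.5, offset=-1.0),  # map pixels to [-1, 1]
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```

The heavy lifting in this approach is the labeled image library itself, which is the asset Google announced.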

Dr. R. Kim, chief medical officer and chief of retina services at Aravind Eye Hospital in India, recently witnessed this technology during Google’s first real-world clinical use of the algorithm.

The algorithm, Kim said, offers “more time to work closely with patients on treatment and management of their disease” while also increasing the number of screenings a facility can perform. And its performance is on par with the service provided by U.S. board-certified ophthalmologists, Google AI asserts.

“[W]e hope our study will be just one of many compelling examples to come demonstrating the ability of machine learning to help solve important problems in medical imaging in healthcare more broadly,” say Dr. Lily Peng, product manager for Google AI, and Varun Gulshan, research engineer for Google AI, in a blog post.

MORE FROM HEALTHTECH: See how AI automation fits into health data security.

3. Machine Learning Helps Predict Patients’ Care Needs

Google has also begun work on an electronic health record model that uses machine learning to forecast a host of patient outcomes. Among them: the potential length of a patient’s hospital stay, odds of readmission and the likelihood of death.

In a recent experiment, the Google team demonstrated how its deep learning models can generate clinically relevant predictions for individual patients using deidentified EHR data.

“Having precise answers to those questions helps doctors and nurses make care better, safer, and faster,” say Dr. Alvin Rajkomar, research scientist for Google AI, and Eyal Oren, product manager for Google AI, in a blog post. “If a patient’s health is deteriorating, doctors could be sent proactively to act before things get worse.”

The benefits are twofold: Not only does an intuitive EHR have the capacity to relieve doctors of some administrative work, but it can also identify the patients who need the most attention, in some cases before an adverse event even occurs.
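For a sense of what that prediction task looks like in code, here is a deliberately tiny illustration of the framing: fit a classifier on a few structured, deidentified features and score a new patient’s readmission risk. The feature columns and values are invented, and the simple logistic regression stands in for the much richer deep learning models Google applies to full EHR records.

```python
# Toy illustration of EHR-based risk prediction, not Google's model:
# estimate 30-day readmission risk from a handful of made-up features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical columns: age, prior admissions, length of stay (days), active diagnoses
X_train = np.array([
    [72, 3, 9, 6],
    [35, 0, 2, 1],
    [64, 1, 5, 4],
    [51, 0, 3, 2],
    [80, 4, 12, 7],
    [29, 0, 1, 1],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = readmitted within 30 days

model = LogisticRegression().fit(X_train, y_train)

new_patient = np.array([[68, 2, 7, 5]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"Estimated 30-day readmission risk: {risk:.0%}")
```

The same framing extends to the other outcomes the Google team studies, such as length of stay and in-hospital mortality, by swapping in the appropriate label.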

JIRAROJ PRADITCHAROENKUL/Getty Images