Using smartphones to screen for disease
The healthcare landscape is constantly evolving as innovations in artificial intelligence (AI) and hardware technologies are introduced.
Google Health recently unveiled how AI can be used both in and out of the hospital setting, with gadgets you already own, to generate quick results.
“Accessing the right healthcare can be challenging depending on where people live and whether local caregivers have specialized equipment or training for tasks like disease screening.
“To help, we have expanded our research and applications to focus on improving the care clinicians provide and allow care to happen outside hospitals and doctor’s offices,” says Dr Greg Corrado.
The senior research scientist and head of Google’s Health AI division was speaking at The Check Up, the company’s annual showcase of its health innovations, which was recently held virtually.
One research area includes exploring how a smartphone’s built-in microphone can act as a stethoscope, recording heartbeats and murmurs when placed over the chest.
He says: “Listening to someone’s heart and lungs with a stethoscope, known as auscultation, is a critical part of a physical exam.
“It can help clinicians detect heart valve disorders, such as aortic stenosis, which is important to detect early.
“Screening for aortic stenosis typically requires specialized equipment, like a stethoscope or an ultrasound, and an in-person assessment.
“Our work can empower people to use the smartphone as an additional tool for accessible health evaluation.”
As with previous research using smartphone cameras to measure respiratory and heart rates, these innovations could be deployed through telehealth, sparing patients the time and travel of an in-person consultation.
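The article does not detail how camera-based measurement works. As an illustration only, a common approach to measuring pulse from video (photoplethysmography) is to track the average brightness of each frame over time and find the dominant frequency in the resulting signal. The sketch below assumes a NumPy array of per-frame brightness values and a known frame rate; it is not Google's method.

```python
import numpy as np

def estimate_heart_rate(brightness, fps):
    """Estimate beats per minute from a 1-D per-frame brightness signal.

    `brightness` is the mean pixel intensity of each video frame (e.g. a
    fingertip held over the camera lens); `fps` is the frame rate.
    """
    signal = brightness - np.mean(brightness)          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequency of each bin, Hz
    # Restrict the search to a plausible human heart-rate band (40-200 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 200 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]   # dominant frequency
    return peak_hz * 60                                # convert Hz to bpm

# Synthetic check: a 72 bpm pulse sampled at 30 fps for 10 seconds.
t = np.arange(0, 10, 1 / 30)
pulse = 128 + 5 * np.sin(2 * np.pi * (72 / 60) * t)
print(round(estimate_heart_rate(pulse, fps=30)))  # 72
```

Real video signals are far noisier, so production systems add filtering and motion compensation, but the core idea of recovering a periodic signal from camera frames is the same.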
On the eye front, the company is following up on its Automated Retinal Disease Assessment (ARDA) project, which currently screens 350 patients daily for diabetic retinopathy, a complication of diabetes. It is now looking into detecting diabetes-related diseases from photos of the exterior of the eye, using existing tabletop cameras in clinics.
In a recent joint study with Thailand’s national screening programme, the data showed that ARDA is accurate and can be deployed safely across multiple regions to support more accessible eye screenings.
“In addition to diabetic eye disease, we’ve previously also shown how photos of eyes’ interiors (or the fundus) can reveal cardiovascular risk factors, such as high blood sugar and cholesterol levels, with assistance from deep learning (a machine learning technique that teaches computers to do what comes naturally to humans).
“We’re working with partners to investigate if photos from smartphone cameras can help detect diabetes and non-diabetes diseases from external eye photos as well.
“We envision a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own homes,” Dr Corrado shares.
Google is also working on improving maternal health by applying AI to help healthcare professionals conduct ultrasounds and perform assessments.
Ultrasound is safe for use in prenatal care and effective in identifying issues early in pregnancy.
“However, more than half of all birthing parents in low-to-middle-income countries don’t receive ultrasounds, in part due to a shortage of expertise in reading ultrasounds.
“We believe that our expertise in machine learning can help solve this and allow for healthier pregnancies and better outcomes for parents and babies,” he says.