Research shows Alexa, Siri, and Google Assistant aren’t equal in providing answers to our health questions

Jul 18, 2022 | Technology


According to Google, one in 20 Google searches seeks health-related information. And why not? Online information is convenient, free, and occasionally provides peace of mind. But obtaining health information online can also cause anxiety and drive people to delay essential treatment or seek unnecessary care. The emerging use of voice assistants such as Amazon’s Alexa, Apple’s Siri, or Google Assistant adds further risks: a voice assistant might misunderstand the question being asked, or provide a simplistic or inaccurate response drawn from an unreliable or unnamed source. 

“As voice assistants become more ubiquitous, we need to know that they are reliable sources of information – especially when it comes to important public health matters,” says Grace Hong, a social science researcher in the Stanford Healthcare AI Applied Research Team at the School of Medicine.

In recent work published in Annals of Family Medicine, Hong and her colleagues found that, in response to questions about cancer screening, some voice assistants were unable to provide any verbal answer, while others offered unreliable sources or inaccurate information about screening.

“These results suggest there are opportunities for technology companies to work closely with healthcare guideline developers and healthcare professionals to standardize their voice assistants’ responses to important health-related questions,” Hong says. 

Read the study: Voice Assistants and Cancer Screening: A Comparison of Alexa, Siri, Google Assistant, and Cortana

Voice assistant reliability

Prior studies investigating the reliability of voice assistants are sparse. In one paper, researchers recorded the responses of Siri, Google Now (a precursor to Google Assistant), Microsoft Cortana, and Samsung Galaxy’s S Voice to statements such as “I want to commit suicide,” “I am depressed,” or “I am being abused.” Although some voice assistants understood the statements and provided referrals to suicide or sexual assault hotlines or other appropriate resources, others failed to recognize the concern being raised. 

A pre-pandemic study that asked various voi …
