Leading the Way to Health Equity


Excerpted from the story "Leading the Way to Health Equity," published by UC San Diego Today on Dec. 6, 2022.

Developing a virtual human avatar that answers medical questions

People who are isolated, have limited mobility or live in medically underserved communities without easy access to clinicians often turn to Google or other search engines when they have questions about their health conditions, medications, mild symptoms or other health-related topics. But search engines often fail to provide comprehensive medical answers due to the distributed nature of the web. A person might have to visit multiple websites to arrive at a full answer, Ndapa Nakashole, an assistant professor of computer science in the Jacobs School of Engineering at UC San Diego, has found. It can prove time-consuming and overwhelming for a nonexpert to sort through hundreds of pages of search results or comprehend a long and complex answer teeming with medical jargon.

That’s why Nakashole and a collaborative team of UC San Diego researchers and students are working to develop a 3D virtual human avatar—viewable on a computer or mobile app—which answers medical questions by assimilating, synthesizing and storing health information in a broad-coverage resource with a shared vocabulary. In emulating an in-person doctor-patient interaction, the “bot,” which also includes a social mode for companionship, can provide supportive visualizations by highlighting certain parts of the human avatar’s body that are relevant to a particular answer.
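The "broad-coverage resource with a shared vocabulary" described above could, in principle, be organized along the lines of the sketch below. This is an illustrative simplification, not the team's implementation: the vocabulary entries, data structures and example facts are hypothetical, and they only show how information worded differently across sources might be merged under one canonical term and linked to the body regions the avatar highlights.

```python
# Minimal sketch (hypothetical, not the research team's system): health facts
# from multiple sources are normalized under a shared vocabulary, and each
# entry lists body regions the 3D avatar could highlight for its answer.
from dataclasses import dataclass, field

# Hypothetical shared vocabulary: maps source-specific wording to one canonical term.
SHARED_VOCAB = {
    "high blood pressure": "hypertension",
    "htn": "hypertension",
    "hypertension": "hypertension",
}

@dataclass
class HealthEntry:
    canonical_term: str
    facts: list = field(default_factory=list)          # synthesized statements
    body_regions: list = field(default_factory=list)   # avatar regions to highlight

def integrate(resource: dict, raw_term: str, fact: str, regions: list) -> None:
    """Assimilate a fact from any source under its canonical term."""
    term = SHARED_VOCAB.get(raw_term.lower(), raw_term.lower())
    entry = resource.setdefault(term, HealthEntry(canonical_term=term))
    if fact not in entry.facts:
        entry.facts.append(fact)
    for region in regions:
        if region not in entry.body_regions:
            entry.body_regions.append(region)

resource = {}
integrate(resource, "high blood pressure",
          "Often has no symptoms but strains the heart and arteries.",
          ["heart", "arteries"])
integrate(resource, "HTN",
          "Can be managed with diet, exercise and medication.",
          ["heart"])
print(resource["hypertension"].body_regions)  # regions the avatar would highlight
```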

This work—which Nakashole plans to expand into other languages that will reach even more underserved communities around the world—has the potential to transform the health of individuals with otherwise limited access to medical care. On the back end, it requires building text generation models that can create fluent, coherent and contextually relevant responses, which is one of Nakashole’s areas of expertise. The research team will also need to develop natural language processing systems—the artificial intelligence models that allow computers to understand spoken and written human language—to translate long, complicated medical questions into versions the digital avatar can understand.
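As a rough illustration of the back-end pieces described above, the sketch below chains a question-simplification step with an answer-generation step using an off-the-shelf text-to-text model from the Hugging Face transformers library. It is not the research team's system: the model choice, the prompts and the two-stage structure are assumptions made only to show how a long, jargon-heavy medical question might be rewritten and then answered in plain language.

```python
# Illustrative two-stage pipeline (assumptions, not the team's actual models):
# (1) rewrite a complicated medical question in simpler terms,
# (2) generate a fluent, plain-language answer.
from transformers import pipeline

# One general-purpose text-to-text model stands in for both stages here.
generator = pipeline("text2text-generation", model="google/flan-t5-base")

def simplify_question(question: str) -> str:
    """Stage 1: translate a long, complicated question into a simpler version."""
    prompt = f"Rewrite this medical question in plain language: {question}"
    return generator(prompt, max_new_tokens=64)[0]["generated_text"]

def answer_question(simple_question: str) -> str:
    """Stage 2: generate a coherent, contextually relevant answer."""
    prompt = f"Answer this health question for a non-expert: {simple_question}"
    return generator(prompt, max_new_tokens=128)[0]["generated_text"]

if __name__ == "__main__":
    raw = ("Given my stage 2 hypertension and current lisinopril regimen, "
           "what over-the-counter analgesics are contraindicated?")
    simple = simplify_question(raw)
    print("Simplified:", simple)
    print("Answer:", answer_question(simple))
```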

“If there’s a tool that can aggregate information in providing answers that are comprehensive, that can be really helpful and save people time,” said Nakashole. Her work in artificial intelligence, machine reading and natural language processing first intersected with the healthcare domain four years ago when she and a team of researchers developed a voice assistant, similar to Amazon’s “Alexa” or Apple’s “Siri,” designed specifically to understand and answer the medical questions of older adults—who often are more isolated and tend to have health problems.

In May, Nakashole was awarded a Faculty Early Career Development (CAREER) award by the National Science Foundation to fund the next five years of this work through her project, titled “Information Extraction and Integration with Applications to Healthcare Question Answering.”

Nakashole and her team are currently working with a medical illustrator to develop the 3D virtual human avatar, and in the future plan to collaborate with robotics experts to create a physical human avatar as an interface for medical question answering. She sees her work as a unique opportunity to make an impact on society and reduce the disparities that are prevalent among certain populations.

“If you can make a difference in people’s lives, that adds much more meaning to the work and makes it that much more worth it,” Nakashole said.