iDAvatars, Working With Watson, Seeks to Answer Patients’ Questions

Watson, meet Sophie.

In September, IBM (NYSE: [[ticker:IBM]]) announced that 100 of its Watson development partners had brought a product or service to market. Among them was iDAvatars, a Mequon, WI-based startup that creates avatars like Sophie, a virtual medical assistant who becomes more familiar with a user with each interaction.

Norrie Daroga, who founded iDAvatars in 2013 and remains CEO, says nearly all of the 100 IBM partners are focused on “content.” By that, he means they first send Watson information to leverage the system’s ability to make sense of huge datasets. And second, they receive Watson-provided insights, typically in text or chart form, which are used by the source app or service.

By contrast, iDAvatars—originally known as Geppetto Avatars—is focused on user experience, Daroga says.

“Watson has a tremendous amount of content, provided by a lot of people,” he says. “Our product sits on top of that. When content is retrieved, our technology allows users to interact and ask questions. That’s of considerable interest to organizations that have this content capability.”

Sophie can ask and answer questions, record responses, and pass along information to other systems. For example, patients who receive care from the U.S. Department of Veterans Affairs (VA), one of iDAvatars’ customers, can chat with her online about diabetes management or symptoms of post-traumatic stress disorder. In August, the company was awarded an $800,000 subcontract to design “intelligent digital avatars” for the VA.

Daroga says the VA is an especially good candidate for online patient engagement software, because many veterans live dozens or even hundreds of miles from the health system’s nearest location.

VA patients will be able to interact with two different animated characters, says Daroga. The first is a “general knowledge avatar” that can provide information about scheduling or hospital and clinic locations. The second is Sophie, clad in a white lab coat with a stethoscope draped from her neck. She’s designed to field questions about specific diseases and conditions, at times drawing on Watson to find answers.

Daroga says the VA gave iDAvatars about 2,000 answers to inquiries about health problems, which meant Sophie had to recognize up to 30,000 differently worded questions and map each one to the right answer. “There might be 14 ways to ask a question, but they should all receive the same answer,” he says.
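The many-phrasings-to-one-answer mapping Daroga describes can be sketched as a simple lookup structure. This is a hypothetical illustration only: the class, function names, and the normalization step are assumptions for the sake of the example, not iDAvatars’ actual implementation (which, per the article, also draws on Watson’s natural language processing).

```python
# Minimal sketch: map many question phrasings to one canonical answer,
# in the spirit of "~30,000 phrasings -> ~2,000 answers" described above.
# All names here are illustrative assumptions, not iDAvatars' real design.

def normalize(question: str) -> str:
    """Lowercase and strip punctuation so trivial variants match."""
    kept = (ch for ch in question.lower() if ch.isalnum() or ch.isspace())
    return "".join(kept).strip()

class AnswerIndex:
    def __init__(self):
        # normalized phrasing -> answer id
        self._index = {}

    def add(self, answer_id: str, phrasings: list):
        """Register every known phrasing that should trigger this answer."""
        for phrasing in phrasings:
            self._index[normalize(phrasing)] = answer_id

    def lookup(self, question: str):
        """Return the answer id for a question, or None if unrecognized."""
        return self._index.get(normalize(question))

index = AnswerIndex()
index.add("diabetes-diet", [
    "What should I eat if I have diabetes?",
    "what should i eat, if I have diabetes",
])

print(index.lookup("WHAT should I eat if I have diabetes?"))  # diabetes-diet
```

In practice, exact-match lookup like this would cover only trivial variants; handling genuinely different phrasings of the same question is where an NLP system such as Watson comes in.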

The avatars can track not only what users say, but also how they say it. Daroga says iDAvatars connects to Watson using application programming interfaces, one of which can parse the tone of a verbal exchange. In addition, Sophie uses the camera on a user’s computer, phone, or tablet to assess his or her mood.

Also under the hood of the software are artificial intelligence, data analytics, and natural language processing capabilities, Daroga says. “It’s at least six technologies that you have to have experts in.”

The company built its own voice-to-text module from scratch, despite IBM’s

Author: Jeff Buchanan

Jeff formerly led Xconomy’s Seattle coverage. Before that, he spent three years as editor of Xconomy Wisconsin, primarily covering software and biotech companies based in the Badger State. A graduate of Vanderbilt, he worked in health IT prior to being bitten by the journalism bug.