We had an amazing lineup of speakers at our first-ever forum on the future of robotics at SRI International in Menlo Park back in May. But I was especially excited to have the opportunity to do an on-stage interview with Yoky Matsuoka, whose pioneering studies of “neurobotics” have brought us closer to a future where amputees will be able to use brain signals to control agile, realistic prosthetic limbs.
As a professor of computer science and engineering at the University of Washington—and at Carnegie Mellon University before that—Matsuoka gained fame for her work on “anatomically correct” robot hands. To an almost fanatical extent, Matsuoka’s robot hands mimicked the joints, tendons, and other details of human hands. Only a limb capable of reproducing the full range of human motions, Matsuoka reasoned, could properly interpret the complex neural signals coming from the brain.
Matsuoka’s attack on the problem was innately interdisciplinary, mixing computer science, biomedical engineering, neuroscience, and, of course, robotics. That attracted the attention of the MacArthur Foundation, which awarded her a “genius grant” fellowship in 2007, and won her spots as one of Popular Science magazine’s “Brilliant Ten,” one of Barbie’s “Top Women to Watch in 2010,” and one of Seattle Magazine’s “Power 25.”
But Matsuoka stayed busy outside the lab too, founding YokyWorks, a non-profit foundation that works to get girls interested in science and engineering by putting them to work on building custom assistive devices for people with disabilities. In our interview, which I recorded, Matsuoka (who is one of our Xconomists) said she went into robotics because she wanted to help people. She said that the goal of YokyWorks is to show middle-school-age girls that they, too, can help people by becoming engineers.
These days Matsuoka is vice president of technology at Nest, the Palo Alto startup building iPhone-like thermostats aimed at changing the way people interact with the environmental systems in their homes. That may seem like an odd career shift, but Matsuoka said in our interview that she sees the Nest thermostat as another kind of robot—one that’s so beautifully designed that it could finally pave the way for many other kinds of ‘bots to enter people’s homes.
Here’s an edited writeup of our conversation.
Xconomy: When you were younger you played tennis quite intensively. In fact you made it to the qualifying rounds at Wimbledon. But I understand you suffered injuries on occasion.
Yoky Matsuoka: On occasion is an understatement.
X: So, do you feel being an athlete prepared you to think about the mechanics of human motion? What’s the thread between that experience, and the work that you did at Carnegie Mellon and the University of Washington on simulating and building anatomically accurate human prosthetic limbs?
YM: When I was playing tennis in college, my selfish motivation was to build a tennis buddy that could play tennis with me. I didn’t really think about how I could help other people with injuries. But that certainly shaped the form of my education in terms of what I ended up doing for my PhD. I went into robotics because I wanted to build a tennis buddy for myself. But then I ended up going off the deep end. We had to understand the neuroscience, or else I could not keep going and build myself a tennis buddy.
By the time I got there and really started thinking about how to build those systems, those crazy five or ten hours a day on the court, hallucinating about which muscles were being activated, did come in quite handy. I ended up really trying to come up with a computational model of how the human brain learns different motions. A great analogy would be, when you learn how to play tennis, if the ball bounces in the same spot maybe you can learn how to hit that ball well. But what if the next ball that comes at you bounces very differently? Somehow we come up with a way to improvise.
Robots are not as good at that. That’s what intrigued me, and the computational mechanisms that develop in the brain [to handle that] are what I ended up really focusing on.
So, did it help? Yes. But where the injuries came in was in really understanding the applications of neuroscience. I didn’t realize how many people have neurological injuries that prevent them from being able to move. And I thought, “Wait a second, we are sitting here knowing so many robotic technologies. We can actually utilize this to help them.” That was the turning point.
X: You’ve talked about the scene in The Empire Strikes Back where Luke has a new bionic hand, and there’s that brief moment where you can see the levers in the hand clicking back and forth like tendons. To make something like that work you obviously have to have communication with the brain. So I’m curious—you’re someone who thinks deeply about both halves of the problem, meaning how do you build an anatomically correct robot hand, but also how do you drive it from actual neural impulses. Which of those problems is harder? Or is it the kind of thing where you have to solve them both at once?
YM: We’ve solved them both already, right? [Laughter] If you are thinking about the Star Wars scene, where the robot hand is doing this and it’s super fast and dexterous, that’s a long way from now, in both an anatomically correct mechanical system and the brain signals. But we’re making critical headway on both of those things.
Comparing is kind of hard. What are we limited by in terms of the mechanical systems? We are limited by