Come See AllSee, UW’s New Low-Power Gesture Control System

device—regardless of whether it has batteries—with gesture controls. “You don’t have to worry about power if you have this kind of technology for gesture recognition,” Talla says.

Kellogg and Talla came together after working on earlier projects with Gollakota, who heads the UW Networks and Wireless Lab. Computer science professor Ed Lazowska describes Gollakota as “an idea factory.” He and his students have won acclaim for recent research projects including WiSee, which measures Doppler shifts in wireless network signals as a medium for gesture controls, and Ambient Backscatter, which harvests small amounts of power from radio signals for communication devices and is being applied in AllSee. The latest project is “TongueSee: Non-Invasive Tongue Machine Interface.” It is meant to help people with conditions such as tetraplegia and amyotrophic lateral sclerosis (Lou Gehrig’s Disease) control devices using the tongue, Gollakota says.
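WiSee's use of Doppler shifts rests on a standard radio relationship: a hand moving toward or away from a transceiver shifts the frequency of the reflected signal by an amount proportional to its speed. As a rough back-of-envelope illustration only (the numbers and function name below are assumptions for this sketch, not taken from the WiSee work):

```python
# Round-trip Doppler shift for a moving reflector, f_d = 2 * v * f / c.
# Illustrative sketch; values are example inputs, not WiSee's parameters.

C = 3.0e8  # speed of light, m/s

def doppler_shift(v_mps, carrier_hz):
    """Round-trip Doppler shift in Hz for a reflector moving at v_mps."""
    return 2.0 * v_mps * carrier_hz / C

# A hand gesture at roughly 0.5 m/s against a 5 GHz Wi-Fi carrier
# produces a shift on the order of tens of hertz:
print(round(doppler_shift(0.5, 5.0e9), 1))  # ~16.7 Hz
```

Shifts this small relative to a multi-gigahertz carrier are why detecting them in commodity wireless signals was notable research rather than routine engineering.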

Mobile phones are obvious candidates for the AllSee technology, but the team envisions broader applications.

An AllSee prototype.

“Gestures are the most natural way of interacting with objects,” Talla says.

As more objects are equipped with sensors, smarts, and communications capabilities—the foretold technology upheaval broadly referred to as the Internet of Things—the difficulty of controlling them could become a barrier, Talla says. Take Bluetooth-equipped activity tracking products like Fitbit or Jawbone Up, which are essentially small, power-constrained wearable sensors. “Right now, the only way to interact with them is use your phone and send a message,” Talla says. But imagine waving at the start of your run to instruct your fitness monitor to start measuring. “That’s a best-case fit for AllSee devices,” he says.

In a video demonstrating AllSee (the same one in which Kellogg shows off the music controls), Talla summons a small robot that’s behind him with a simple “come here” wave. That raises a question: As we get to a more fully realized Internet of Things, where lots of objects are controlled this way, can the technology distinguish an individual’s unique intent—that Talla was summoning that particular robot, and not trying to turn up the heat on a smart thermostat or signaling a blaring smoke alarm to turn itself off?

“Our technology looks at specific signal changes, so it’s actually very directional,” Talla says. In other words, the robot registers gestures aimed directly at it more strongly than motion off to the side. But he acknowledges that improvements will be needed, such as expanding the set of gestures the system recognizes and sharpening specificity with triangulation.
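The idea of recognizing gestures from "specific signal changes" can be sketched in miniature: a hand moving toward an antenna tends to raise the amplitude of the reflected signal, and a hand moving away tends to lower it. The toy classifier below is an illustration of that principle only; the function, thresholds, and gesture names are assumptions for this sketch, not the published AllSee design, which uses ultra-low-power analog hardware and a richer set of amplitude-change patterns.

```python
# Toy gesture classifier over a sampled RF amplitude envelope.
# Illustrative only; not the AllSee algorithm.

def classify_gesture(envelope, threshold=0.1):
    """Label an amplitude trace 'approach', 'retreat', or 'none'
    based on the net change from first to last sample."""
    if len(envelope) < 2:
        return "none"
    delta = envelope[-1] - envelope[0]
    if delta > threshold:
        return "approach"   # amplitude rose: hand moving toward antenna
    if delta < -threshold:
        return "retreat"    # amplitude fell: hand moving away
    return "none"           # change too small to count as a gesture

print(classify_gesture([0.2, 0.3, 0.5, 0.7]))  # rising trace -> "approach"
print(classify_gesture([0.7, 0.5, 0.3, 0.2]))  # falling trace -> "retreat"
```

Even in this cartoon form, the key property the article describes is visible: the classifier reacts only to signal changes above a threshold, so motion that barely perturbs the signal at a given receiver is ignored.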

Security is another issue. You don’t want someone to be able to change your music just by waving at your pocket.

Author: Benjamin Romano

Benjamin is the former Editor of Xconomy Seattle. He has covered the intersections of business, technology and the environment in the Pacific Northwest and beyond for more than a decade. At The Seattle Times he was the lead beat reporter covering Microsoft during Bill Gates’ transition from business to philanthropy. He also covered Seattle venture capital and biotech. Most recently, Benjamin followed the technology, finance and policies driving renewable energy development in the Western US for Recharge, a global trade publication. He has a bachelor’s degree from the University of Oregon School of Journalism and Communication.