We got a look at one of Microsoft Research’s latest advanced-interface projects a few weeks back, during festivities for MSR’s 20th anniversary. The company prohibited photos of one demo in particular, a projected-touchscreen system called OmniTouch, because the team was saving it for an upcoming conference.
Well, that conference is now underway, and here are the details about OmniTouch. It builds on an earlier MSR project you may have seen, “Skinput,” which used sensors and projectors to turn forearms and palms into the equivalent of computer or smartphone touchscreens.
OmniTouch was developed by Microsoft’s Andy Wilson and Hrvoje Benko, and Carnegie Mellon student Chris Harrison, who’s also an MSR fellow.
OmniTouch kicks things up a notch from Skinput. It pairs a shoulder-mounted projector with a Kinect-like depth camera, and it can recognize multiple nearby surfaces simultaneously. So you could project a main “screen” onto a wall or desktop, then hold up your palm to flip to a navigation menu and pick a new video, e-book, or webpage.
The hardware is a little bulky in this prototype version, of course (Microsoft officials demurred when I asked at the MSR anniversary event whether there were plans to incorporate a laser blaster, Predator-style).
Here’s one of the videos from Harrison’s page, which also features a second, more technical video demo, the research paper, and more goodies. The Microsoft page also has details about a project called PocketTouch, which would let users interact with devices by, for instance, tapping them while they’re still in a pocket.