Photographing Spaces, Not Scenes, with Microsoft’s Photosynth

move gradually around the object, taking a picture every 15 degrees or so. (Conversely, to make a 360-degree panorama, you should take at least 24 separate photos as you spin in place.) If you’re interested in making your own synths, the Photosynth website has an entertaining video with more pointers.
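To put numbers on that advice, here's a back-of-envelope Python sketch — my own illustration, not anything the Photosynth site provides — for figuring out how many photos a full orbit or panorama requires at a given angular step.

```python
import math

def shots_for_full_circle(step_degrees: float) -> int:
    """Photos needed to cover 360 degrees at the given angular step."""
    return math.ceil(360 / step_degrees)

# One photo every 15 degrees works out to 24 shots -- the same count
# whether you orbit an object or spin in place for a panorama.
print(shots_for_full_circle(15))  # 24
print(shots_for_full_circle(10))  # 36, for denser coverage
```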

Apparently I absorbed the video’s lessons well, because when I finished uploading the 300 photos I took of the Christian Science complex last Saturday, Photosynth declared them to be “99% Synthy.” My apartment photos were “95% Synthy” and my Copley Square photos were “92% Synthy.” Before you create your own synths, however, be aware that the program is already proving incredibly popular, and that as a result, Live Labs’ servers have been largely overwhelmed by uploads. Each of my synths took about 8 hours to upload and process—and each failed at least once along the way, requiring me to start over. The Photosynth team says it’s busy optimizing the system to cope with the unexpected onslaught.

When you’re exploring your finished synths in the Photosynth viewer, there’s a hidden shortcut—the “P” key on your keyboard—that reveals something truly novel and jarring: the “point cloud” that makes up the 3-D scaffolding behind the individual images in a synth. The points represent tiny patches of color or texture that, in Photosynth’s judgment, are shared across multiple images. (This is actually how the 3-D effect is produced: by comparing images in which the same details are depicted from multiple angles, the software is able to infer the existence of 3-D structures and render them in space. It’s not so different from the way our brains create stereo, 3-D views from the binocular images captured by our eyes. Technology Review’s March/April issue includes a good explanation of the whole process behind Photosynth, which was conceived by University of Washington graduate student Noah Snavely and built into a workable Web-based system by Snavely and Microsoft programmer Blaise Agüera y Arcas.)
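If you're curious what that comparison step looks like in practice, here's a minimal two-view sketch using the OpenCV library. To be clear, this is a generic illustration of feature matching and triangulation, not Photosynth's actual pipeline; the image filenames and the guessed camera intrinsics (K) are assumptions made for the demo.

```python
import cv2
import numpy as np

# Two photos of the same scene from different angles (placeholder paths).
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

# 1. Detect distinctive patches of texture in each image.
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# 2. Match patches that look like the same detail seen from two angles.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 3. Infer the relative camera geometry from those correspondences.
#    K is a guessed pinhole intrinsic matrix; a real pipeline estimates it.
f, cx, cy = 1000.0, img1.shape[1] / 2, img1.shape[0] / 2
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]])
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 4. Triangulate each matched patch into a 3-D point -- one "star"
#    in the point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T  # N x 3 array of 3-D points
```

Photosynth performs this kind of reconstruction across hundreds of photos at once, refining all the camera positions and points together, which is where the real engineering challenge lies.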

[Image: Coffee table point cloud]

The more times an object appears in a synth, the denser that object’s point cloud will be. For example, on the coffee table in my apartment, I have two handmade ceramic bowls sitting on a Southwestern-style mat. I took a bunch of pictures of the table while I was preparing my synth, and in the finished point cloud, the mat and the bowls appear like a bright little galaxy of orange and blue stars.

The limitation of Photosynth is that once you’ve spent some time viewing your synths or others’ and exploring all the pretty point clouds, it’s not clear what to do next. (As the Technology Review article’s headline asked, “It is dazzling, but what is it for?”) It would certainly be a useful tool for learning your way around places that you intend to visit—why not tour one of the 200 synths already available for Paris, for example, before your next trip to the City of Light? In fact, Live Labs leader Gary Flake told TR that Microsoft may attempt to integrate Photosynth into the company’s virtual-globe program, Microsoft Virtual Earth, as a kind of shortcut to creating a 3-D metaverse.

[Image: The actual coffee table]

But as a tool for individual photographers, Photosynth is still in its earliest stages. For now, you can’t edit or add to existing synths. You can’t assist the program by manually placing the orphaned photos it couldn’t recognize into the jigsaw. You can’t export the 3-D data, as architects who use CAD drawings or virtual-world builders might wish to do.
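For a sense of what that missing export might look like, here's a hypothetical sketch: a small function that writes a point cloud to the ASCII PLY format, which many CAD and 3-D modeling tools can import. Nothing in Photosynth produces this today; the function name and arguments are my own invention.

```python
import numpy as np

def write_ply(path: str, points: np.ndarray, colors: np.ndarray = None) -> None:
    """Write points (N x 3 floats), with optional RGB colors
    (N x 3, values 0-255), to an ASCII PLY file."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        if colors is not None:
            f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for i, p in enumerate(points):
            line = "%f %f %f" % tuple(p)
            if colors is not None:
                line += " %d %d %d" % tuple(colors[i])
            f.write(line + "\n")

# e.g. write_ply("coffee_table.ply", cloud)  # 'cloud' from the sketch above
```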

I’m sure that there are a few artist-geeks out there dreaming up surprising and informative ways to use their allotted 300 photos (uploading more images than that tends to stymie the program). But I think the real significance of Photosynth is that it’s helping to touch off the next major shift in photography: from 2-D to 3-D. In short, it’s now possible to document not just scenes but spaces. And no matter what kind of image-processing software is out there on the Web, most photographers will need more time to get their heads around that idea.

For a full list of my columns, check out the World Wide Wade Archive. You can also subscribe to the column via RSS or e-mail.

Pages 3, 4, and 5 of this article contain embedded versions of the synths of my apartment, Copley Square, and the Christian Science complex. Please note that you can only view these synths on a Windows computer. If you have not installed the Photosynth browser plugin, you will be prompted to do so; please follow the onscreen instructions. Mac users—sorry!

Author: Wade Roush

Between 2007 and 2014, I was a staff editor for Xconomy in Boston and San Francisco. Since 2008 I've been writing a weekly opinion/review column called VOX: The Voice of Xperience. (From 2008 to 2013 the column was known as World Wide Wade.) I've been writing about science and technology professionally since 1994. Before joining Xconomy in 2007, I was a staff member at MIT’s Technology Review from 2001 to 2006, serving as senior editor, San Francisco bureau chief, and executive editor of TechnologyReview.com. Before that, I was the Boston bureau reporter for Science, managing editor of supercomputing publications at NASA Ames Research Center, and Web editor at e-book pioneer NuvoMedia. I have a B.A. in the history of science from Harvard College and a Ph.D. in the history and social study of science and technology from MIT. I've published articles in Science, Technology Review, IEEE Spectrum, Encyclopaedia Britannica, Technology and Culture, Alaska Airlines Magazine, and World Business, and I've been a guest of NPR, CNN, CNBC, NECN, WGBH, and the PBS NewsHour. I'm a frequent conference participant and enjoy opportunities to moderate panel discussions and on-stage chats.

My personal site: waderoush.com

My social media coordinates:
Twitter: @wroush
Facebook: facebook.com/wade.roush
LinkedIn: linkedin.com/in/waderoush
Google+: google.com/+WadeRoush
YouTube: youtube.com/wroush1967
Flickr: flickr.com/photos/wroush/
Pinterest: pinterest.com/waderoush/