cloud computing. Microsoft was the first to bring a 3D photogrammetry tool to consumers back in 2008, in the form of a demo system called Photosynth. Using a Windows computer or the Photosynth app on an iPhone, users can snap multiple images of a scene, then stitch them together into explorable, wraparound panoramas.
But Photosynth, which was developed into a Web service by Microsoft’s now-defunct Live Labs, was conceived more as a 3D photo browser than as a tool for generating high-resolution 3D models. The 3D “point cloud” that Photosynth creates by matching common features in multiple images “isn’t particularly useful,” Mathews asserts, because the resolution is too low and up to 20 percent of the matches are bogus.
Autodesk got into photogrammetry in 2008 by acquiring a French company called Realviz, which made high-end 3D modeling tools used mainly for special-effects-laden movies like Superman Returns and Harry Potter and the Goblet of Fire. “We wanted to democratize this down, so we put it into the Labs,” says Mathews.
But what is Autodesk Labs, exactly? Created in 2006, it’s a porous collection of 20 to 50 experts who rotate in from other parts of the 6,800-employee corporation to create software prototypes and previews that can be shared and tested on the Web—from Project Vasari for modeling new buildings in real cityscapes to Project Falcon for simulated wind tunnel testing of vehicle designs. To adapt the Realviz technology, which was rechristened Photofly, the labs assembled a group of Autodesk engineers with backgrounds in big-data analytics, fault-tolerant cloud computing environments, mathematics, and user interfaces. They did some rapid prototyping of their own and released Version 1.0 of Photofly at the TEDGlobal conference in Oxford, England, in July 2010.
“The first thing we did was make it more user-friendly,” says Mathews. “The second thing was that we put all the heavy-duty computation up on the cloud. Photosynth and Realviz would tie up your computer for hours. In the cloud we can split the problem up into the different parts and send them to data centers with different specialties.” Mathews showed me how quickly the process works by sitting me down in a chair and hopping in circles around me with a digital camera, shooting about 50 images and then uploading them to Autodesk’s cloud. The finished mesh in the image above was ready well before we finished our hour-long interview.
The math problem of turning a bunch of photographs into a 3D model is a pretty gnarly one, so moving the computation into the cloud made sense. The challenge starts with comparing the millions of pixels in a pair of images to identify visual features that might correspond—“a freckle here, a corner of a collar there,” as Mathews puts it. Once you’ve got enough common points, you can do a bit of trigonometry to guess where the camera must have been for each shot. The difficulty is that this process must be repeated for dozens or sometimes hundreds of photos, with up to 4,000 points being tracked across the images. “You are building this probabilistic matrix, and the math is very complicated,” says Mathews. But the more images that contain a single feature, the more confidence the software can build up in its guesses.
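The core of that trigonometry is easy to see in miniature. Here's a toy sketch (my own illustration in Python, not Autodesk's code): given two camera positions and the viewing rays toward one matched feature, the feature's 3D position falls where the rays nearly intersect — the midpoint of the shortest segment between them. Photofly solves this jointly for thousands of points and dozens of cameras; this does it for one point and two known cameras.

```python
import math

def triangulate(p1, d1, p2, d2):
    """Locate a matched feature in 3D from two viewing rays.

    p1, p2 are camera centers; d1, d2 are unit direction vectors toward
    the feature. Returns the midpoint of the shortest segment between
    the two rays (they rarely intersect exactly because of pixel noise).
    Names and setup are illustrative, not Photofly's internals.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # near zero when the rays are parallel
    t1 = (b * e - c * d) / denom     # distance along ray 1 to closest point
    t2 = (a * e - b * d) / denom     # distance along ray 2 to closest point
    q1 = tuple(p1[i] + t1 * d1[i] for i in range(3))
    q2 = tuple(p2[i] + t2 * d2[i] for i in range(3))
    return tuple((x + y) / 2.0 for x, y in zip(q1, q2))

# Two cameras 4 units apart, both sighting a point at (2, 0, 5):
n = math.sqrt(29.0)
pt = triangulate((0.0, 0.0, 0.0), (2 / n, 0.0, 5 / n),
                 (4.0, 0.0, 0.0), (-2 / n, 0.0, 5 / n))
# pt ≈ (2.0, 0.0, 5.0)
```

The "probabilistic matrix" Mathews mentions is what you get when every one of those points, seen from many cameras at once, is solved simultaneously — each extra sighting of a feature tightens the estimate.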
The end product is a 3D mesh made of thousands of tiny triangles, just like the models in video games or CGI movies. The resolution of the mesh varies with the object represented and the distance from which the original photos were taken, but for a human head, each triangle might cover a patch of flesh only a few millimeters across.
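The mesh itself is a simple structure: a list of 3D vertices plus triangles that reference them by index — the same layout game engines and common mesh files use. A minimal sketch (again my illustration, not Photofly's representation), with a surface-area calculation as a stand-in for the kind of per-triangle bookkeeping the software does:

```python
import math

# An indexed triangle mesh: shared vertices, triangles as index triples.
# Two triangles here cover a flat unit square -- purely illustrative.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
faces = [(0, 1, 2), (0, 2, 3)]

def triangle_area(a, b, c):
    """Area of one triangle: half the magnitude of the cross product
    of two edge vectors."""
    u = tuple(b[i] - a[i] for i in range(3))
    v = tuple(c[i] - a[i] for i in range(3))
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

surface = sum(triangle_area(*(vertices[i] for i in f)) for f in faces)
# surface == 1.0 for the unit square
```

Sharing vertices between neighboring triangles is what keeps a head-sized scan, with its thousands of millimeter-scale patches, down to a manageable file.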
But that’s not the last step. Once the mesh is complete, Photofly goes back to the original images and extracts colors to create a “texture map” that can be laid over the mesh. The program even compensates for differences in lighting and exposure, blending textures so that the model isn’t simply the right shape, but also has a photo-realistic surface.
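The key move in texturing is running the camera math in reverse: instead of lifting pixels into 3D, each mesh vertex is projected back into a source photo to find which pixel holds its color. A toy pinhole-camera sketch (my illustration — Photofly also corrects for lens distortion and blends colors across overlapping photos, which this omits):

```python
def project_to_pixel(point, focal, width, height):
    """Project a 3D point into pixel coordinates for a pinhole camera
    at the origin looking down +z. The returned (u, v) is where a
    texture lookup for that vertex would sample the source photo.
    Toy model; real pipelines add distortion and exposure correction.
    """
    x, y, z = point
    u = focal * x / z + width / 2.0
    v = focal * y / z + height / 2.0
    return (u, v)

# A point straight ahead lands at the image center of a 640x480 photo:
project_to_pixel((0.0, 0.0, 5.0), 100.0, 640, 480)   # -> (320.0, 240.0)
# One unit to the right shifts the sample 20 pixels:
project_to_pixel((1.0, 0.0, 5.0), 100.0, 640, 480)   # -> (340.0, 240.0)
```

Doing this for every vertex, in every photo that sees it, is what lets the software pick well-exposed pixels and blend away the seams between shots.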
The finished model can be exported in various formats and used to make movies and fly-throughs. (Just search YouTube for “Photofly” to see about 800 examples.) You can also modify the mesh using programs like Autodesk’s 123D Sculpt. If I were unhappy with the Jimmy Durante nose and the Dumbo ears in the model of my own head, for example, I could do a bit of virtual plastic surgery.
When Autodesk put a Labs team on the Photofly project, it didn’t necessarily have photo-realistic 3D busts in mind. “When we first started doing this, the main use case we had in mind was sustainable architecture,” says Mathews. When you’re renovating an existing building to make it