Superfish Aims to Dominate Visual Search, One Product at a Time

We’re awash in images. Human beings take one billion photos every day, and websites of all stripes have billions more. But most of those images vanish into the black hole of the Internet, untagged and unsearchable.

Sure, people can tag each photo with names, locations and other data, but very few take the time to do it. As a result, trying to find a dog that looks like Spot, or all of the pictures of you posted on the Internet, or where you can buy a chair you found on a blog, can be an exercise in frustration. It would be so much easier to start with an actual image—of yourself, of a friend, a suspected criminal, a product—and then quickly search vast numbers of images to find those that are identical or very similar.

Adi Pinhas has been working for years to make that possible. “With visual search, we can all be like kids, pointing to something and asking, what is this? Are there any more like that?” he says. “As humans, this is more natural for us.” For the last eight years, he and Superfish, the Palo Alto, CA-based company he founded in 2006, have been toiling to create the combinations of algorithms needed to compare and match pictures.

In 2011, the company launched its first product, Window Shopper, a desktop and mobile app that allows buyers to click on a photo of, say, a pair of shoes on a blog, and then find similar shoes on sites like Amazon or eBay. “You don’t need keywords or to explain why you like this shoe,” Pinhas says. “You click on an image and it happens. It’s easy for the machine to see what you mean.”

Next came PetMatch, which lets people upload a photo of the kind of dog they like, then searches adoptable pets on the Internet to help users find a similar pup in need of a home.

Within the next six months, Pinhas expects Superfish to release at least two more apps that use visual search to solve a problem. Depending on user uptake, the company may eventually release a general visual search engine that can find pictures the way Google finds words, phrases, and images. “We will say to users, ‘Show me what you’re looking for, and we will find it,’” says Pinhas.

It’s a highly complex task, and though there are some products out there, no one has perfected image search—not even Google. In 2011, Google made it possible to drop an image into the search bar on Google Images. But that’s most helpful in finding images that already have related content on the web, like landmarks and paintings. It doesn’t work so well for puppies or photos from your wedding. Two years later, Google added similar capabilities to help Google+ users find their own photos without tagging them. But the company noted in a post announcing the capability that there’s still a long way to go to perfect image search. “Have we gotten computers to see the world as well as people do?” the note asked. “The answer is not yet, there’s still a lot of work to do, but we’re closer.”

So can Superfish do better than Google? The two companies have taken different approaches. For the most part, Google has used image search capabilities to enhance its existing products, Google+ and Search, while Superfish has used its algorithms to create dedicated products.

Aside from the tech giant, there aren’t many companies working on the problem. Pinhas says he has seen plenty of startups try and fail, while Superfish has made serious strides with the technology and become profitable. Superfish’s general approach: the company’s algorithm splits the query image into thousands of small sections, each with distinct features, like a particular pattern, texture, edge, or dot. The algorithm then searches for other images that have similar features in similar sections, spaced roughly the same distance apart.
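Superfish hasn’t published the details of its algorithms, but the general idea Pinhas describes, breaking an image into small distinctive patches and looking for other images with similar patches in a similar arrangement, resembles standard local-feature matching. The sketch below is a rough, generic illustration of that idea using OpenCV’s off-the-shelf ORB features in Python; the file names are hypothetical, and this is not Superfish’s own code.

```python
import cv2


def local_feature_similarity(query_path, candidate_path, ratio=0.75):
    """Rough similarity score between two images based on local features."""
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    candidate = cv2.imread(candidate_path, cv2.IMREAD_GRAYSCALE)
    if query is None or candidate is None:
        raise FileNotFoundError("could not read one of the images")

    # Detect small distinctive patches (corners, edges, textures) and
    # describe each one with a compact binary fingerprint.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_q, desc_q = orb.detectAndCompute(query, None)
    kp_c, desc_c = orb.detectAndCompute(candidate, None)
    if desc_q is None or desc_c is None:
        return 0.0

    # For each query patch, find its two closest counterparts in the
    # candidate image and keep only matches clearly better than the runner-up.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(desc_q, desc_c, k=2)
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]

    # Fraction of query patches with a confident counterpart.
    return len(good) / max(len(kp_q), 1)


if __name__ == "__main__":
    # Hypothetical files: rank candidate images against a query photo of a shoe.
    for path in ["shoe_a.jpg", "shoe_b.jpg", "chair.jpg"]:
        print(path, local_feature_similarity("shoe_query.jpg", path))
```

A production system would go further, for instance by checking that matched patches sit in a consistent spatial arrangement and by indexing the fingerprints so millions of candidate images can be searched quickly, but the core comparison works along these lines.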

When Pinhas first started the company, creating the right algorithms was the biggest challenge. “There wasn’t anything that we could copy, or something slow we had to make fast,” Pinhas says. “We just had to invent them.”

The company had a team of a dozen or so PhDs working for four years to find the right combination of algorithms.

Author: Elise Craig

Elise Craig covers technology, innovation and startup culture in the Bay Area. She has worked as a news producer on the breaking news desk of the Washington Post and as an assistant research editor at Wired magazine. She is also an avid freelance writer and editor and has written for Wired, BusinessWeek, Fortune.com, MarketWatch, Outside.com, and others. Craig earned her bachelor’s degree in English from Georgetown University in 2006, and a master’s degree in journalism from the University of California at Berkeley in 2010.