CrowdOptic Taps Smartphones to Track the Crowd’s Attention

The idea behind CrowdOptic, a service that allows event organizers to figure out what a crowd is looking at by tapping their smartphones, came to CEO and co-founder Jon Fisher last year during a routine meeting. As he sat down to chat with an investor in one of his earlier companies, he caught sight of two “tombstones”—the glass sculptures that memorialize deals between companies. He noticed that his chair and the two tombstones formed the three points of a triangle, prompting him to think about triangulation, focus, and sightlines. “That’s really where the inspiration came from for triangulating between smartphone users to isolate action,” Fisher says. “And that really formed the basis of the core technology, the notion that we can record smartphone attributes in real time and compare them with other users in real time.”

With this geometrical insight in mind, Fisher (pictured above) and co-founders Jeff Broderick, Doug Van Blaricom, and Alex Malinovsky set out to build a service that would give event organizers a way to monitor a crowd’s focus at live events. It’s a kind of outdoor equivalent of the eye-tracking studies Web designers run to see what parts of a Web page are attracting attention.

CrowdOptic's software assesses where users' smartphones are pointed to determine where audience members' sightlines are converging.

It works like this: event organizers partner with San Francisco-based CrowdOptic on an event like a tennis match. Attendees download an app, either CrowdOptic's own or one built specifically for the event. As they take photos and videos, tweet, and share information, CrowdOptic draws on data from the compass and accelerometer in each smartphone to determine where the person holding the phone is looking: at one player or another, at the scoreboard, or even at particular ads in the stadium.
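CrowdOptic hasn't published its algorithms, but the geometry Fisher describes is straightforward to sketch. Below is a minimal illustration, assuming the app reports a compass azimuth and a pitch angle derived from the accelerometer; the function names and coordinate conventions are hypothetical, not CrowdOptic's.

```python
import math

def sightline_vector(azimuth_deg, pitch_deg):
    """Convert a compass azimuth (degrees clockwise from north) and a
    pitch angle (degrees above the horizon, derivable from the
    accelerometer's gravity reading) into a unit pointing vector in an
    east-north-up frame."""
    az, pitch = math.radians(azimuth_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(az),   # east
            math.cos(pitch) * math.cos(az),   # north
            math.sin(pitch))                  # up

# A phone held level and pointed due east:
print(sightline_vector(90.0, 0.0))  # ~(1.0, 0.0, 0.0)
```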

“We’re simply tunneling in through an existing process,” Fisher says. “We’re not envisioning everyone in the stands having phones in the air. People are naturally taking pictures and video.” Taken together, the data can show event organizers where the crowd’s collective attention is focused. In an event like a tennis match, there might be two main clusters of focus, one on each player, whereas a baseball game could have far more. (There’s a cool animated version of the graphic above at CrowdOptic’s website.)
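Triangulating between users then reduces to finding where those sightlines cross. Here is one way that could look, again a sketch with hypothetical names and a flat two-dimensional stadium model: intersect each pair of sightline rays (using just the east and north components from the previous sketch) and bin the crossing points into coarse grid cells, so that dense cells mark clusters of attention like the two around the tennis players.

```python
from itertools import combinations

def ray_intersection(p1, d1, p2, d2):
    """Crossing point of two 2-D rays p + t*d (t >= 0), or None if the
    rays are parallel or would cross behind either phone."""
    cross = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(cross) < 1e-9:
        return None
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / cross
    t2 = (dx * d1[1] - dy * d1[0]) / cross
    if t1 < 0 or t2 < 0:
        return None
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

def focal_points(phones, bin_size=5.0):
    """phones: list of ((x, y), (dx, dy)) pairs, a ground position plus
    the east/north components of that phone's sightline, in metres.
    Bins every pairwise crossing into coarse grid cells; dense cells
    are the crowd's focal points."""
    hits = {}
    for (p1, d1), (p2, d2) in combinations(phones, 2):
        pt = ray_intersection(p1, d1, p2, d2)
        if pt is not None:
            cell = (round(pt[0] / bin_size), round(pt[1] / bin_size))
            hits[cell] = hits.get(cell, 0) + 1
    return hits
```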

With this information, organizers can rethink their ad sales strategy. In online advertising, sales offices can use eye-tracking studies and other analytics to inform their pricing structure and justify it to potential customers. At live events, it's hard to say which ads are viewed most. It may be a safe bet that most people in a given stadium will see ads on the Jumbotron, but how many fans will see an ad in right field or above a stadium box? “If one asset is being viewed five times more, you can charge five times more. You might think [the] banner behind home plate is the best asset, but what’s the third best asset? We’ll be able to prove that data,” Fisher says. “It’s a whole new advertising form.”
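The pricing argument itself is simple arithmetic. A toy sketch, with invented view counts and a made-up per-view rate, ranks assets by measured attention and prices each in direct proportion, per Fisher's "viewed five times more, charge five times more" logic:

```python
def price_assets(view_counts, rate_per_view):
    """Rank ad assets by measured views and price each proportionally
    to the attention it actually received."""
    ranked = sorted(view_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, views, views * rate_per_view) for name, views in ranked]

# Invented numbers, for illustration only:
for name, views, price in price_assets(
        {"jumbotron": 80_000, "home_plate_banner": 50_000,
         "right_field_sign": 12_000}, rate_per_view=0.002):
    print(f"{name}: {views} views -> ${price:,.2f}")
```

The ranked output is what answers Fisher's "what's the third best asset?" question with data rather than intuition.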

Organizers can also hyper-target advertising or discounts directly to users’ smartphones. If a fan at a racetrack is focusing his attention on a given car, the driver’s sponsor might want him to see ads for its products, as a kind of augmented-reality overlay. Organizers can also build in added benefits for fans, like serving up-to-the-minute stats when someone points his phone at a given player. “If you’re watching a baseball game, you can get all info about a batter on the Jumbotron. But if he’s in right field, you can get the whole Jumbotron experience right on your phone,” says Fisher. Organizers can likewise use the information to retool their TV broadcasts, giving viewers at home an experience closer to what they’d get in the stadium.
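One plausible way to implement that targeting, sketched below under the same flat-field assumptions as before: compute the compass bearing from a phone to a known asset and push content when the phone's own azimuth falls within a small tolerance of it. The positions, tolerance, and trigger here are all hypothetical.

```python
import math

def bearing_to(p_from, p_to):
    """Compass bearing in degrees from one ground position to another
    (flat-field approximation, north = +y)."""
    dx, dy = p_to[0] - p_from[0], p_to[1] - p_from[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def is_focused_on(phone_pos, phone_azimuth, target_pos, tol_deg=8.0):
    """True when the phone's sightline points within tol_deg of the
    target, i.e. the fan is plausibly watching it."""
    diff = abs(phone_azimuth - bearing_to(phone_pos, target_pos)) % 360
    return min(diff, 360 - diff) <= tol_deg

# Hypothetical trigger: a fan in the stands tracking car #42.
if is_focused_on((0.0, 0.0), 45.0, (70.0, 70.0)):
    print("push sponsor overlay + live telemetry for car #42")
```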

The technology has security applications as well. CrowdOptic can’t use its apps to pinpoint where a given person is sitting in a stadium or who that person is; it can determine only the sightline of a given phone. But the system can detect changes in the crowd’s collective focus, so a sudden, unexpected convergence of sightlines could tip off security staff to a developing incident.
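One way such detection could work, building on the hypothetical focal_points sketch above: compare per-cell attention counts across two time windows and flag any cell whose count spikes suddenly. Thresholds and structure are assumptions for illustration.

```python
def focus_shift_alerts(prev_hits, curr_hits, jump_factor=5, min_count=10):
    """Compare per-cell sightline-crossing counts (as produced by the
    focal_points sketch above) across two time windows and flag cells
    whose attention spikes suddenly."""
    return [(cell, count) for cell, count in curr_hits.items()
            if count >= min_count
            and count >= jump_factor * max(prev_hits.get(cell, 0), 1)]
```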

Author: Elise Craig

Elise Craig covers technology, innovation, and startup culture in the Bay Area. She has worked as a news producer on the breaking news desk of the Washington Post and as an assistant research editor at Wired magazine. She is also a prolific freelance writer and editor and has written for Wired, BusinessWeek, Fortune.com, MarketWatch, Outside.com, and others. Craig earned her bachelor’s degree in English from Georgetown University in 2006 and a master’s degree in journalism from the University of California, Berkeley, in 2010.