the elements designed to attract the most attention—the “Product Tour” or “Order Now” buttons, for example—are performing as hoped. (More on that in a moment.)
Krausz and Gershenson are both 2009 graduates of Carnegie Mellon University. After school, Krausz went to work for Newton, MA-based TripAdvisor, helping to build the company’s recently introduced airfare search tool, while Gershenson went on to earn a master’s degree in computer science. But the strange magnetism of the startup world pulled them back together.
“I do a lot of things with Internet usability and Web 2.0 stuff, and Joe has amazing skills with these involved algorithms,” says Krausz. “Being good friends, we said ‘There has to be a way to combine these skills to make a successful company.’”
But there was no grand plan. The pair didn’t decide to leave their old posts and launch GazeHawk until after they’d been accepted to Y Combinator, where they’d applied “almost on a whim,” Krausz says. (Getting in “made the decision process on whether to start a company much easier, since it gives you the biggest boost for success I can think of short of being friends with Ron Conway or someone like that.”)
The central insight that Krausz and Gershenson are trying to bring to the Web usability market is that eye tracking data doesn’t have to be perfect to be informative. Older technologies like infrared-based bright pupil systems are expensive in part because they measure pupil angles down to single-pixel resolution, and in part because they work in real time, building a heatmap as the user takes in a page. “The focus has really been on trying to get that real-time, perfect result,” says Gershenson. “But the important thing in this business is not to know exactly which pixel [testers] were looking at in every split second; it’s to answer the question ‘Are my customers looking at this or not?’ We think it’s possible, with current technology, to build a system that answers that question well.”
The way Gershenson explains it, GazeHawk’s system starts by recording webcam video as a tester’s eyes follow a moving red dot on the screen. It’s a way of sampling what the tester’s eye looks like to the camera when he or she is looking at different parts of the screen. During an actual study, determining where testers are looking is a matter of comparing the new pupil angles to the known positions and calculating the difference.
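To make that idea concrete, here is a minimal sketch of calibration-based gaze estimation in the spirit Gershenson describes, not GazeHawk’s actual code: the feature vectors, screen coordinates, and nearest-neighbor matching below are stand-in assumptions for whatever the company extracts from its webcam frames.

```python
import numpy as np


class GazeEstimator:
    """Toy calibration-based gaze estimator (illustrative only).

    Each calibration sample pairs a feature vector describing the eye's
    appearance in a webcam frame with the known on-screen position of the
    moving dot the tester was following at that moment.
    """

    def __init__(self):
        self.features = []   # eye-appearance features recorded during calibration
        self.positions = []  # matching (x, y) screen coordinates of the dot

    def add_calibration_sample(self, eye_features, dot_position):
        self.features.append(np.asarray(eye_features, dtype=float))
        self.positions.append(np.asarray(dot_position, dtype=float))

    def estimate(self, eye_features, k=1):
        """Estimate gaze by comparing a new frame to the calibration set.

        Finds the k calibration frames whose eye appearance is closest to
        the new frame and averages their known screen positions.
        """
        feats = np.stack(self.features)
        dists = np.linalg.norm(feats - np.asarray(eye_features, dtype=float), axis=1)
        nearest = np.argsort(dists)[:k]
        return np.stack(self.positions)[nearest].mean(axis=0)


# Usage: calibrate on a few known dot positions, then estimate gaze for a new frame.
est = GazeEstimator()
est.add_calibration_sample([0.10, 0.20], (100, 100))   # dot at top-left
est.add_calibration_sample([0.80, 0.25], (1180, 100))  # dot at top-right
est.add_calibration_sample([0.45, 0.70], (640, 600))   # dot near center-bottom
print(est.estimate([0.42, 0.65]))  # -> roughly (640, 600)
```

The point of the sketch is that a coarse match against calibration frames can be enough to answer “are customers looking at this element or not,” which is exactly the lower bar Gershenson argues the usability market actually needs.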
That’s a bit of an oversimplification, of course. In practice, there’s a lot more going on: Gershenson developed machine learning algorithms, for example, that help eliminate the effect of