Aaron Marcus, Bard of User-Centered Design, Battles “High-Order Crap”

Aaron Marcus, President and Principal Designer/Analyst, Aaron Marcus and Associates

create order simply by reducing every function to a pictograph. “You had all these engineers creating sign systems with no background at all in visual communications,” Marcus says. “I encountered one of these [Computervision] guys who said with great excitement, ‘We are up to 15,000 icons now!’”

There’s nothing wrong with order, Marcus emphasizes, or with the impulse to impose it through visual metaphors. The trick is to make sure there’s a system behind these signs. For a more modern example of how the process can go wrong, “You have only to open a Mac or Windows operating system, with their jumble of icons,” Marcus says. “It’s the result of market forces that have allowed every individual provider of a product to design their own icon, so that we stumble when we try to find them. It’s as if every letter of the alphabet was sponsored by a different company.”

What’s puzzling, Marcus says, is that many designers of digital interactions today don’t seem to have absorbed lessons learned long ago by their colleagues in the larger world of information design and graphic design.

“If you look at the international standards for mass transportation design, there are some very good systems that have been created to help guide people through a complex airport or subway system,” he says. “Having learned a little bit, you are able to understand the rest, because it’s a rational system approach.”

Humankind Inventing Its Future Self

Marcus, a native of Omaha, NE, began to develop his own design approach in the 1960s. As a physics undergraduate at Princeton, he wanted to study quantum mechanics and gravitational theory, but kept getting steered into more practical areas like laser research. After college he decided to break with science and apply to art schools; he got into the graphic design department at Yale’s School of Art and Architecture.

There, he says, “My brain snapped. I didn’t know what anyone was talking about when they said things like ‘That works!’ or ‘Let the progression of color be more systematic.’”

It took Marcus about six months to learn the language. “I began to understand that it was as if I were still in a physics lab, doing experiments and relating those to first principles and systems of thought and noting down paradigms,” he says. He also learned Fortran and, as a summer researcher at AT&T’s Bell Laboratories in 1967, began experimenting with ASCII art and other forms of computer graphics.

That summer brought Marcus his first experiences with a big computer—a GE 635 mainframe. “I would walk into the room with the raised floor, the eternal quiet hum of the CPU fans, and realize I could see nothing around me but the computer,” he says. “I knew this was humankind inventing its own future self.”

Armed with a new respect for computers, Marcus returned to Princeton, where he taught visual communication, information design, and computer graphics in the School of Architecture and Urban Planning from 1968 to 1977. He also took up consulting work. Between 1969 and 1971, he worked on a prototype page-layout system that would allow AT&T to display the Yellow Pages on its Picturephone. “I was already doing user-centered design, because I was studying myself as a graphic designer, figuring out how I worked and what people like me would need,” he says.

In 1979 Marcus moved west to become a lecturer in the College of Environmental Design at the University of California, Berkeley, and then a staff scientist at Lawrence Berkeley Laboratory. He founded Aaron Marcus and Associates in 1982 and soon won a three-year DARPA grant to come up with ways to improve the usability of the C programming language, first developed at Bell Labs. Simply by reorganizing the code and its presentation on paper and on screen, he was able to help C programmers increase their comprehension by 20 percent, as measured by an independent human-factors testing group.

Still, before Apple, Microsoft, and other companies ushered in an era of truly personal computing, Marcus’s early work was reaching only the specialized few who had access to expensive machines. “As a visual designer, I had to spend $100,000 just to draw a line,” he says. It’s no joke: the special 300-dot-per-inch black-and-white vertical display and associated hardware and software that Marcus needed for the DARPA project cost $76,000, and the matching, refrigerator-sized laser printer cost $24,000. The Mac and the LaserWriter made the equipment obsolete within a couple of years. As Marcus puts it, “When Apple came along in 1985 and we tried to get rid of the equipment, I couldn’t even give it away.”

Slowing Down to Think

It was a key moment: As computers started to show up on every desk, companies realized they needed to find better ways to communicate through software. Marcus had already been pondering the problem for a while, and was ready to help explain the interaction-design process to potential clients in the business and consumer worlds. To understand what Marcus means by user-centered design, it’s worth stepping through his approach.

“The standard levels with which technology deals are data, information, knowledge, and wisdom,” he says. “Data are organizations of significant perceptions—say, all the temperatures in the United States. Information is the organization of significant data.” Those same temperatures displayed on a map, for example.

“Knowledge is the organization of significant information together with action plans. I can have all sorts of weather reports, but do I need an umbrella or not?” he says. “Wisdom, the highest form, is significant patterns of knowledge, combined with either internalized knowledge or real-world experience.” There’s more rain than usual in Bangladesh, say, and less than usual in Nevada—maybe the global climate is changing.
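Marcus's ladder can be made concrete in a few lines of code. This is a minimal, illustrative sketch only, not anything from Marcus's own work: the city readings, temperature bands, and rain-chance threshold are all invented for the example.

```python
# Illustrative sketch of the data -> information -> knowledge ladder
# Marcus describes. All readings and thresholds here are made up.

# Data: significant perceptions -- raw temperature readings by city.
readings = {"Omaha": 71, "Reno": 95, "Boston": 58, "Miami": 88}

# Information: the same data organized significantly -- grouped into
# bands the way a weather map might color-code them.
def to_bands(data):
    bands = {}
    for city, temp in data.items():
        label = "hot" if temp >= 85 else "mild" if temp >= 60 else "cool"
        bands.setdefault(label, []).append(city)
    return bands

# Knowledge: information joined to an action plan --
# "do I need an umbrella or not?"
def need_umbrella(rain_chance):
    return rain_chance >= 0.5

bands = to_bands(readings)
```

Wisdom, the top rung, resists this kind of mechanization; in Marcus's telling it requires patterns of knowledge combined with lived experience, which is exactly what a lookup table lacks.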

When it comes time to build something real, Marcus says, designers have to start by figuring out which stakeholders will care about the data, the information, the knowledge, or the wisdom. They could be

Author: Wade Roush

Between 2007 and 2014, I was a staff editor for Xconomy in Boston and San Francisco. Since 2008 I've been writing a weekly opinion/review column called VOX: The Voice of Xperience. (From 2008 to 2013 the column was known as World Wide Wade.) I've been writing about science and technology professionally since 1994. Before joining Xconomy in 2007, I was a staff member at MIT's Technology Review from 2001 to 2006, serving as senior editor, San Francisco bureau chief, and executive editor of TechnologyReview.com. Before that, I was the Boston bureau reporter for Science, managing editor of supercomputing publications at NASA Ames Research Center, and Web editor at e-book pioneer NuvoMedia. I have a B.A. in the history of science from Harvard College and a PhD in the history and social study of science and technology from MIT. I've published articles in Science, Technology Review, IEEE Spectrum, Encyclopaedia Britannica, Technology and Culture, Alaska Airlines Magazine, and World Business, and I've been a guest of NPR, CNN, CNBC, NECN, WGBH, and the PBS NewsHour. I'm a frequent conference participant and enjoy opportunities to moderate panel discussions and on-stage chats. My personal site: waderoush.com. My social media coordinates: Twitter @wroush; Facebook facebook.com/wade.roush; LinkedIn linkedin.com/in/waderoush; Google+ google.com/+WadeRoush; YouTube youtube.com/wroush1967; Flickr flickr.com/photos/wroush/; Pinterest pinterest.com/waderoush/.