“The party isn’t exactly over, but the police have arrived, and the music has been turned way down.”
That’s how Peter Kogge, an ex-IBM computer scientist who teaches at Notre Dame, described the state of supercomputing in a 2011 article in IEEE Spectrum. The giant machines that researchers use to simulate things like climate change, protein folding, and nuclear tests aren’t going to keep getting faster at the same rate they have in the past, wrote Kogge, who led a study group on the question for the U.S. Defense Advanced Research Projects Agency, or DARPA.
The basic reason: pushing microprocessors to work a lot faster than they already do will require inordinate amounts of power, and generate an unmanageable amount of waste heat. You could build a computer that runs 100 times faster than Cray’s 1-petaflop Blue Waters machine at the National Center for Supercomputing Applications, but “you’d need a good-sized nuclear power plant next door,” not to mention a huge, dedicated cooling system, Kogge observed.
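A rough back-of-the-envelope calculation makes Kogge's point concrete. The figures below are assumed round numbers, not figures from his study, but if a petaflop-class machine of that era drew something on the order of 10 megawatts, a naive 100x scale-up lands at about a gigawatt, roughly the electrical output of a large nuclear reactor. Here is the arithmetic as a quick Python sketch:

    # Back-of-the-envelope sketch of the power problem, using assumed round
    # numbers (not figures from Kogge's study).
    petaflop_power_watts = 10e6   # assume ~10 MW for a petaflop-class system of that era
    speedup = 100                 # the hypothetical machine 100 times faster
    naive_power_watts = petaflop_power_watts * speedup
    print(f"Naive 100x scale-up: {naive_power_watts / 1e9:.0f} GW")  # ~1 GW, reactor territory

Real designs would not scale power perfectly linearly, but the order of magnitude is the point.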
How do roadblocks in supercomputing relate to the kind of computing that we average Joes do every day—sifting through e-mail, posting photos on Facebook, maybe playing a few video games?
Well, it used to be that advances at the biggest scales of computing heralded near-term benefits for consumers. Consider that the ASCI Red supercomputer at Sandia National Laboratories, built in 1996, had a peak speed of 1.8 teraflops (trillions of floating-point operations per second). ASCI Red required 800,000 watts of power and took up 150 square meters of floor space. Just 10 years later, thanks to steady advances in transistor miniaturization, Sony’s PlayStation 3 could hit the same speed (1.8 teraflops) using less than 200 watts of power, in a box small enough to fit under your TV.
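Taking those published figures at face value, the efficiency gain is easy to quantify: flops per watt improved by a factor of roughly 4,000 in a single decade. A quick Python sketch of that arithmetic:

    # Flops-per-watt comparison implied by the figures above (taken at face value).
    asci_red_flops, asci_red_watts = 1.8e12, 800_000   # 1.8 teraflops, 800 kW (1996)
    ps3_flops, ps3_watts = 1.8e12, 200                 # same speed, under 200 W (2006)

    asci_eff = asci_red_flops / asci_red_watts         # about 2.3 megaflops per watt
    ps3_eff = ps3_flops / ps3_watts                    # about 9 gigaflops per watt
    print(f"Improvement in flops per watt: {ps3_eff / asci_eff:,.0f}x")  # roughly 4,000x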
But that, unfortunately, is where the express train stopped. The clock rates of commercial microprocessors peaked at about 3 gigahertz back in 2006, and have barely budged since.
Obviously, there have been other kinds of advances since 2006. Engineers have figured out how to put more processing cores on each chip, while tweaking them to run at lower power. The dual-core A5X system-on-a-chip, designed by Apple and manufactured by Samsung in Austin, TX, is the epitome of this kind of clever engineering, giving the iPad the power to run mind-blowing games and graphics while still providing all-day battery life, all at roughly 1 gigahertz; its close cousins do the same for the iPhone and the iPod Touch.
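The logic behind that trade-off is straightforward. Dynamic power in a chip scales roughly with capacitance times voltage squared times clock frequency, and lowering the clock also lets you lower the voltage, so two slower cores can match one fast core's aggregate throughput for noticeably less power, provided the workload parallelizes. The voltage and frequency values below are illustrative assumptions, not Apple's specifications:

    # Illustrative sketch of why "more cores at a lower clock" saves power.
    # First-order model: dynamic power ~ C * V^2 * f. Values are assumptions.
    def dynamic_power(cap, volts, freq_ghz):
        return cap * volts ** 2 * freq_ghz

    one_fast_core = dynamic_power(cap=1.0, volts=1.2, freq_ghz=2.0)        # one core at 2 GHz
    two_slow_cores = 2 * dynamic_power(cap=1.0, volts=0.9, freq_ghz=1.0)   # two cores at 1 GHz
    # Same 2 GHz worth of cycles in aggregate, if the work splits cleanly across cores.
    print(f"Power ratio (two slow / one fast): {two_slow_cores / one_fast_core:.2f}")  # ~0.56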
But the uncomfortable truth weighing on the minds of innovators is that Moore’s Law has expired, or will very soon.
Moore’s Law was never a real physical law, of course, but merely a prediction, first ventured by Intel co-founder Gordon Moore back in 1965 and refined a decade later. It says that the number of transistors that chipmakers can squeeze into a microprocessor will double roughly every 18 to 24 months, without adding to the device’s size or cost.
The prediction held true for about 40 years, but now manufacturers are falling behind. Between 2009 and 2012, Intel improved the performance of its CPUs by only 10 or 20 percent per year, way behind the 60-percent-per-year gains needed to keep pace with the classic Moore’s Law timetable.
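That 60 percent figure falls directly out of the doubling period: doubling every 18 months compounds to about 59 percent per year, while a 24-month doubling works out to about 41 percent. A quick Python check of the compounding math:

    # Annual growth rate implied by a given doubling period, in months.
    def annual_growth(doubling_months):
        return 2 ** (12 / doubling_months) - 1

    print(f"18-month doubling: {annual_growth(18):.0%} per year")  # ~59%
    print(f"24-month doubling: {annual_growth(24):.0%} per year")  # ~41%
    # Compare with the 10 to 20 percent per year cited above for Intel, 2009-2012.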
Though no one in Silicon Valley likes to talk about it, the “easy” years for the semiconductor industry are over. The insulating layers inside transistor gates are now only a few atoms thick, meaning they can’t be shrunk much further before electrons start tunneling right through them (it’s a quantum mechanics thing). In other words, Intel, AMD, and their competitors won’t be able to make tomorrow’s chips faster, smaller, and denser without fancy tricks, such as 3D circuit designs, which will likely make future generations of computing devices sharply more expensive to manufacture. So even if they can find ways to stick to the letter of Moore’s Law, they’ll be violating its spirit, which was always really about economics. (In Moore’s own words in a 2006 retrospective, “integrated circuits were going to be the path to significantly cheaper products.”)
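For a ballpark sanity check on that “few atoms” figure (assumed round numbers, not any manufacturer’s process specification): a gate insulator on the order of a nanometer thick, divided by an atomic layer spacing of roughly a quarter of a nanometer, is only about five atomic layers.

    # Ballpark sanity check; assumed round numbers, not a foundry spec.
    gate_oxide_nm = 1.2       # assume a gate insulator a bit over a nanometer thick
    atomic_layer_nm = 0.25    # assume roughly a quarter-nanometer per atomic layer
    print(f"Atomic layers in the gate insulator: {gate_oxide_nm / atomic_layer_nm:.0f}")  # ~5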
Let’s say I’m right, and the single most powerful technology trend of the last half-century—the one driving all sorts of other exponential advances, in fields from telecommunications to robotics to genomics—has reached its endpoint. What would that really mean, from a consumer’s point of view?
Not very much; not enough to cause panic in the streets, at any rate. It’s not as if we’ll suffer a sudden dropoff in GDP or productivity or life expectancy. While the effects of the “Moorepocalypse,” as some have called it, will be noticeable, they won’t be catastrophic. That’s because there are important frontiers where computer scientists can make progress without having to wait for transistors to get even smaller—and where a few breakthroughs could be extremely meaningful to consumers.
I’ll detail a few of them in a moment. But first, let’s acknowledge that some pain is on the way. A slowdown in chip advances will have real repercussions in the market for desktops, laptops, tablets, and smartphones, where we’ll probably have to wait a lot longer between big upgrades.
In the struggling desktop market, this pattern has actually been evident for some time. For many people, PCs reached the “good enough” stage in the mid-2000s, as PCWorld columnist Brad Chacos has noted. There are lots of consumers who own personal computers mainly so that they can surf the Web, play Solitaire, e-mail photos to their friends, or open an occasional spreadsheet—and for them, hardware makers haven’t offered a lot of compelling reasons to chuck the old tower PC from Dell, HP, or Gateway. (My parents got along fine with a 2000-vintage Windows XP machine from Gateway until this summer, when my brother and I finally talked them into getting a MacBook Pro.)
On the mobile side, “there has not been a ‘must have’ new device for quite some time,” as consultant and columnist Mark Lowenstein argued just this week. That probably helps explain why smartphones aren’t selling as well as they used to in developed countries. “Fact is, any mid-tier or better smartphone in the market today is pretty fabulous. It does just about anything you would want it or need it to do,” Lowenstein correctly observes. Yet another fact: phones can’t be made much thinner, lighter, faster, or brighter without sacrificing battery life. So it’s hard to see what types of hardware innovation will send average cellular subscribers running back to the Verizon, AT&T, or Apple stores.
But while a slower hardware replacement cycle may cut into profits for PC and handset makers, it’s hardly the end of the world. Let’s say microprocessor speeds do level off exactly where they are today; there is still plenty of room left for other types of improvements in the computing experience for consumers. One might even argue that a pause on the hardware side would allow software engineers to