in encyclopedic detail how the standard economic statistics on prices, labor output, and other inputs to GDP actually understate the scale of the improvements. Standard measures of GDP couldn’t capture the intangible benefits of things like better lighting inside homes, the time savings from water and sewer connections (which freed women from spending much of their day carrying water in and out of the house), the arrival of “personal travel” as a pastime once automobiles were ubiquitous, and the extra years to enjoy all of the above that came from rising life expectancies.
Gordon’s favorite measure of the impact of innovation is called “total factor productivity.” It gauges efficiency improvements relative to the number of hours people work and the number of machines they use. By his calculations, total factor productivity increased in the U.S. at an unprecedented 1.89 percent per year between 1920 and 1970. It declined to one-third that level, 0.57 percent per year, from 1970 to 1994. It bounced back up slightly, to 1.03 percent per year, during a brief window from 1994 to 2004, probably as the result of the dot-com wave of digitization and networking. And since 2004, it’s hovered at a comparatively anemic 0.40 percent per year.
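To get a feel for what those rates imply, it helps to compound them over their periods. The short sketch below does nothing more than compound Gordon's own figures as quoted above; the only assumption I've added is a 2014 end point for the final period, since the text says only "since 2004."

```python
# Compounding Gordon's total-factor-productivity growth rates, as quoted above.
# Only the end year of the final period (2014) is assumed; the rates and the
# other period boundaries come straight from the figures in the text.

periods = [
    ("1920-1970", 0.0189, 50),
    ("1970-1994", 0.0057, 24),
    ("1994-2004", 0.0103, 10),
    ("2004-2014", 0.0040, 10),  # end year assumed for illustration
]

for label, rate, years in periods:
    multiplier = (1 + rate) ** years
    print(f"{label}: {rate:.2%}/yr over {years} yrs -> x{multiplier:.2f} cumulative")

# The 1920-1970 stretch alone multiplies total factor productivity by roughly 2.5;
# the three post-1970 periods combined manage only about 1.3x.
```

By this back-of-envelope math, the cumulative gain from 1920 to 1970 dwarfs everything that has come since, which is exactly Gordon's point.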
But Gordon does not think that the U.S. economy broke in some fundamental way after 1970. After all, efficiency improvements of 0.40 percent per year look pretty good when you realize, as Gordon notes, that there was virtually no growth in the West between Roman times and the beginning of the First Industrial Revolution around 1750. Rather, his argument is that the big mid-20th century gains came from transitions that, by definition, could happen only once: rural populations became urban, cars and tractors replaced horses, homes and factories were electrified, antibiotics conquered most infections, infant mortality was vastly reduced, telephones and radio ended isolation, the nation was blanketed with superhighways, and transcontinental and intercontinental jet travel became a reality.
To Gordon’s eye, these fruits of the Second Industrial Revolution (the one that began with electricity and the internal combustion engine) simply outweigh the benefits of what he calls the Third Industrial Revolution (the one focused on information and communications technology, from television to the iPhone). And the reason productivity gains are slower today is that the signature innovations of our era play out in a more limited sphere.
As MIT economist Robert Solow famously quipped in 1987, “We can see the computer age everywhere but in the productivity statistics.” The explanation for Solow’s paradox, Gordon argues, is that “computers are not everywhere. We don’t eat computers or wear them or drive to work in them or let them cut our hair. We live in dwelling units that have appliances much like those of the 1950s, and we drive in motor vehicles that perform the same functions as in the 1950s, albeit with more convenience and safety.” (Emphasis added.)
Gordon musters enough detail to make his book devastatingly convincing. I say “devastatingly” because one implication of his work is that we may be doomed to permanently slow growth. If Gordon is right that the gains from the Second Industrial Revolution were a one-time-only event, then future generations probably won’t have a standard of living notably higher than ours.
That would be discouraging. But there’s one more important point to note about the book. The big gains in total factor productivity weren’t evenly distributed across Gordon’s “special century.” In fact, they show a huge spike, to nearly 3.5 percent per year, in the decade 1940-1950.
Why that decade in particular? All the evidence points to two interrelated causes. The first was America’s involvement in World War II, from 1941 to 1945, when “the entire economy converted to a maximum production regime in which every machine and structure was used twenty-four hours per day if enough workers could be found” and when “all the indexes of output, hours of work, and productivity soared.”
The second was the post-war consumer boom starting in 1946, when “the floodgates of demand were let loose, and after swift reconversion, manufacturers strained to meet the demand for refrigerators, stoves, washing machines, dryers, and dishwashers, not to mention automobiles and television sets.” Manufacturers were able to meet this demand because they had purchased staggering amounts of modern factory equipment during the war, mostly on the federal government’s dime, and because “the lessons learned from the war translated into permanent efficiency gains after the war.”
Thus the war not only brought forth technologies such as rocketry, computers, radar, jet engines, and atomic energy that would otherwise have taken decades to arrive; it fostered a culture of improvisation and continuous improvement and reset the whole economy to a higher level. In fact, without the “special decade” of 1940-1950, Gordon’s “special century” would look a lot less special. “The most obvious reason why productivity remained high after World War II, despite the end of the military emergency, is that technological change does not regress,” Gordon writes. “People do not forget.”
Which brings me back to Bill McKibben. His New Republic piece can be summarized much more briefly. It argues that the effort to slow climate change is a world war—literally, not metaphorically—and that we are losing it by failing to mobilize on the required scale.
“Winning” this war would not mean averting significant warming. It’s already too late for that. But if we can bring atmospheric carbon dioxide below 350 parts per million by 2100, the planet will probably stop heating up, at least according to predictions from McKibben’s favorite engineer, Mark Z. Jacobson, director of Stanford’s Atmosphere and Energy Program.
To do that, the U.S. would need to get 80 percent of its energy from non-carbon-emitting sources by 2030 and 100 percent by 2050. Reaching that goal, according to Jacobson’s research, would mean building enough solar panels and wind turbines to produce about 6,500 gigawatts of electricity. And to do that, we’d need to build about 300 solar panel factories as big as the $750 million “gigafactory” that Elon Musk’s SolarCity is building right now outside Buffalo, NY—plus a few hundred more factories to build wind turbines.
That means building 45 new gigafactories every year, starting immediately. Other industrialized countries would need to undertake similar efforts, in proportion to their own energy needs. Sounds daunting, but there is a precedent for this scale of effort in the U.S.—you guessed it, World War II.
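That figure is easy to sanity-check. The sketch below assumes that "a few hundred more" wind-turbine factories means roughly 250 of them, and that the build-out window runs from 2017 to the 2030 deadline; both are my assumptions, not numbers from McKibben's piece.

```python
# Rough sanity check of the factory arithmetic described above.
# The 300 solar-panel gigafactories and the 2030 deadline come from the text;
# the wind-turbine factory count and the 2017 start year are assumptions.

solar_factories = 300    # stated: ~300 gigafactory-scale solar panel plants
wind_factories = 250     # assumed reading of "a few hundred more" for wind turbines
start_year, deadline = 2017, 2030   # assumed build-out window

total_factories = solar_factories + wind_factories
years_available = deadline - start_year
per_year = total_factories / years_available

print(f"{total_factories} factories over {years_available} years "
      f"-> roughly {per_year:.0f} per year")
# With these assumptions the answer comes out to roughly 42 factories a year,
# consistent with the article's figure of about 45 new gigafactories annually.
```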
Building dozens of new factories per year is the sort of emergency program that only businesses could accomplish and only the federal government