The white heat of technological progress can be blinding. It has taken less than a lifetime to go from the birth of computing to the first self-driving cars, not to mention the stream of game-changing breakthroughs in science and medicine in between. Yet these high-profile successes mask a problem. Since 2005, annual U.S. total factor productivity growth (which measures the efficiency with which labor and capital are used) has averaged around 0.5 percent, down from an average of around 1.75 percent from 1996 to 2004. That has hurt economic growth, which remains sluggish nearly a decade after the end of the Great Recession. 

The slowdown has sparked a debate among economists over the sources of the problem. Are statisticians mismeasuring—and thus underestimating—output? Is the United States mired in “secular stagnation”—a prolonged period of low economic growth caused by too much saving and too little investment? Or are recent innovations simply not as productive for society as those of the past?

Not long ago, I was among the economists who took a relatively optimistic view of declining productivity growth. In 2016, I publicly caricatured the ways economists have often interpreted swings in U.S. productivity growth over the past half century. The declining rate, I argued, did not necessarily reflect a long-run trend of slow productivity growth. I attributed it instead to a temporary effect of the global financial crisis and anticipated a turnaround in the near future.

Since then, my perspective has changed, swayed by work I carried out with three fellow economists and published in a paper last year. We found that research productivity, or idea productivity—one of the fundamental components of the U.S. economic engine—has been falling for decades. That means that ideas—scientific discoveries or technical advances—are getting harder to find and that innovation is costing more than ever before.

DOING LESS WITH MORE

The creation of ideas is central to a growing economy. Adding more workers or machines boosts GDP, but the only way to grow per capita incomes is to get more out of the same quantities of labor and capital. That means finding more efficient ways to use them. In many models of economic growth, the long-run growth rate is the product of two things: the effective number of researchers and the research productivity of those people. The more people there are innovating and the more ideas they produce, the faster the economy grows. But in practice, our empirical analysis found an underlying imbalance between the two factors. Research and development efforts have been rising steeply for decades while research productivity—the number of ideas being produced per researcher—has fallen rapidly.
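In symbols, that decomposition amounts to a simple accounting identity (a sketch in the standard notation of idea-based growth models, which the article itself does not spell out):

\[
\underbrace{\frac{\dot{A}_t}{A_t}}_{\text{productivity growth}}
\;=\;
\underbrace{\frac{\dot{A}_t/A_t}{S_t}}_{\text{research productivity}}
\times
\underbrace{S_t}_{\text{effective researchers}},
\]

where \(A_t\) stands for the stock of ideas (total factor productivity) and \(S_t\) for the effective number of researchers. If \(S_t\) keeps rising while the first term on the right keeps falling, measured growth can hold steady even as each researcher contributes fewer new ideas.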

The analysis revealed that more than 20 times as many Americans are engaged in R & D today as were in 1930, yet their average productivity has dropped by a factor of 41. The only way the United States has been able to maintain even its current lackluster GDP growth rate has been to throw more and more scientists and engineers at research problems. The U.S. economy has had to double its research efforts every 13 years just to sustain the same overall rate of economic growth.
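The 13-year figure is a matter of simple compounding (a back-of-the-envelope illustration, assuming a roughly constant rate of decline, rather than a calculation quoted from the paper): doubling research effort every 13 years means expanding it by about ln 2 / 13 ≈ 5.3 percent a year, which is the pace needed to offset a research-productivity decline of roughly the same size and keep overall growth unchanged.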

The fact that research productivity is declining at the aggregate level does not mean that the same must be true for individual products and industries. To find out whether the national data mask more optimistic stories, we looked at how hard it is to find new ideas in three areas: technology, medical research, and agriculture. For another measure, we analyzed research efforts at publicly traded firms. Everywhere we looked, we found clear evidence that rising investments in R & D were masking declines in underlying productivity.

A Waymo self-driving car undergoing testing in Los Altos, California, November 2017 (Dllu / Wikipedia).

Take technology. Since around 1971, the density of transistors on an integrated circuit has doubled roughly every two years, a phenomenon known as Moore’s law, after Gordon Moore, the co-founder of the computer chip giant Intel. The resulting advances have enabled the creation of ever more powerful computers, which have affected every aspect of society. But maintaining that regular doubling today requires more than 18 times as many researchers as it did in the early 1970s.

Other industries exhibited similar falloffs. To measure productivity in agriculture, we compared the crop yields of corn, soybeans, wheat, and cotton with the amount of money spent on improving those yields by developing new methods of crossbreeding, bioengineering, crop protection, and maintenance. The average yield per acre for all four crops roughly doubled between 1960 and 2015. But the amount of research necessary to achieve those gains rose by factors of between three and 25, depending on the crop and the specific research measure. On average, we found that idea productivity in agriculture fell by about four to six percent each year.

A similar pattern shows up in medical research. We examined decreases in the mortality rates of cancer patients and compared them with annual numbers of medical research publications and clinical trials. Looking back to the 1970s, we found that idea productivity for cancer research rose until the mid-1980s and then began to fall, suggesting that at least in some research areas, such as breast cancer, finding new ideas becomes easier for a time but gets harder in the end.

Overall, between 1975 and 2006, the ratio of extra years of life saved among cancer patients to the number of clinical trials fell by an average of five percent per year. More and more trials, that is, were necessary to save the same number of extra life years. Narrowing down to breast cancer alone, the average annual decline in research productivity was an even starker ten percent. And for heart disease, another leading cause of death in the United States, the figure was seven percent.

To get a broader sense of research productivity throughout the economy, we examined the track records of U.S. publicly traded companies. Our analysis of data from 15,128 firms between 1980 and 2015 showed that even as spending on R & D rose, the vast majority of firms experienced rapid declines in research productivity, as measured by how much growth in sales, market capitalization, employment, and revenue per worker each dollar of research spending generated. On average, idea productivity fell at a rate of about nine percent per year. As a result, the average firm today needs 15 times as many researchers as it did 30 years ago to produce the same rate of growth.
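Those two figures are consistent as a matter of compounding (a rough check, assuming a steady rate of decline): productivity falling at about nine percent a year shrinks by a factor of roughly e^(0.09 × 30) ≈ 15 over three decades, which is why the same growth now takes about 15 times as many researchers.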

THE FUTURE AIN’T WHAT IT USED TO BE

That research productivity has declined steadily over the past several decades is clear; why this has happened is a harder question. To understand what’s going on, it’s worth looking back to 1760, the start of the Industrial Revolution in Great Britain. Before then, productivity growth in Great Britain was abysmally low: less than 0.1 percent per year. In 1700, most of the British population still worked on farms and were not much more productive than their ancestors had been under Roman rule some 1,500 years earlier. But from the late 1700s until about 1950, productivity growth accelerated. This was the era when, in the words of Isaac Newton, inventors saw further by “standing on the shoulders of giants.” Each new invention, such as the steam engine, electric lighting, and penicillin, made future inventors more productive. The acceleration was driven by the growing number of firms that created industrial R & D labs, beginning with Thomas Edison’s lab in 1876, and by universities’ increasing focus on research in science and engineering. Around 1950, however, the tide began to turn. Annual productivity growth in the United States peaked at around three percent, and a third phase—the “apple tree model”—began to set in. Humanity had made many of the quickest discoveries—picked the lowest-hanging apples—and unearthing new scientific truths started getting harder.

What about the future? Recent history, rather than science fiction, tends to be the best predictor of the near future. The next ten to 20 years will likely look similar to the recent past: annual productivity growth will hover around one percent. Compared to the mid-twentieth century, that is slow, but compared to the long sweep of history it is a blistering rate of technological progress. Looking further ahead, technology growth is harder to predict. The world has undergone at least two major shifts in productivity growth since 1700—and the information age could eventually bring another one.

NICHOLAS BLOOM is William D. Eberle Professor of Economics at Stanford University and a Senior Fellow at the Stanford Institute for Economic Policy Research.