The recent revision of U.S. economic growth for the January through March period had the economy shrinking by 1 percent at an annual rate after adjusting for inflation. Factoring in expected population growth of roughly 1 percent, that figure implies the standard of living fell at about a 2 percent annual rate over the period. One mitigating part of this bad news is that perhaps as much as 1 percentage point of the decline is attributable to the unusually harsh winter weather. Still, it is bad news of the sort that should have us thinking deeply about our growth problem and its implications.
Since the end of World War II, the U.S. has enjoyed a remarkable period of economic growth in which measured GDP and standards of living have risen at faster than a 2 percent annual rate alongside a growing population. Indeed, for much of the nation this has been the norm since shortly after the Civil War. It has led us to expect that each generation will be much better off than the last. There is some troubling evidence that this might be changing.
One way to think about this is to count the quarters in which GDP grew at a blistering pace, say at more than a 4 percent annual rate. The 1950s and 1960s each saw 20 quarters of growth above 4 percent. The 1970s and 1980s had 12 and 13 such quarters, respectively, while the 1990s did even better, with 15 quarters of rapid growth. The 2000s saw only five rapid-growth quarters, and this decade has had two thus far, putting it on pace for a replay of the last one. Something appears to be happening.
This rapid-growth metric has its shortcomings, not least that the economy has become less volatile, which by itself reduces the number of very fast quarters. Still, fewer rapid-growth quarters implies less of the buoyant consumer and business confidence that really boosts growth. It might also mean that higher average growth rates have become more difficult to achieve because of structural changes in how the economy absorbs new technology.