Last week the Bureau of Economic Analysis (BEA) released revised data on the U.S. economy dating back to 1929. Among the revised series was personal savings, measured as a percentage of disposable income. The new data show that the savings rate was slightly higher over the past forty-three years than first thought; the average annual revision was 1.3 percentage points.
Though the savings rate was revised up for most years, the revisions didn't change the downward trend of the past four decades. The forty-three-year peak in the savings rate came in 1971, when Americans saved 13.3 percent of their disposable income. The trough came during the housing bubble frenzy of 2005, when the rate dropped to 2.6 percent.
The second graph looks at the average savings rate by decade, starting in 1929. I combined the 1930s and 1940s because of the unusual economic circumstances surrounding the Great Depression and the Second World War. During the 1930s people didn't have much money to save; in 1932 and 1933 the savings rate was negative. During the war, the federal government imposed rationing on most consumer goods, and the average savings rate between 1941 and 1945 was 23.6 percent.
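The decade averages behind the second graph amount to a simple grouping calculation. Here is a minimal sketch of it in Python; the 1971 and 2005 figures come from the text above, while the other annual rates are placeholders for illustration, not actual BEA data.

```python
# Sketch of the decade-average computation described above.
# Only the 1971 (13.3%) and 2005 (2.6%) figures appear in the text;
# the rest are hypothetical placeholders, NOT actual BEA data.
annual_rates = {
    1971: 13.3, 1972: 12.0, 1975: 11.5,   # illustrative 1970s values
    2001: 4.8,  2005: 2.6,  2009: 6.0,    # illustrative 2000s values
}

def decade_averages(rates):
    """Group annual savings rates by decade and average each group."""
    buckets = {}
    for year, rate in rates.items():
        decade = (year // 10) * 10        # e.g. 1971 -> 1970
        buckets.setdefault(decade, []).append(rate)
    return {decade: sum(vals) / len(vals) for decade, vals in buckets.items()}

averages = decade_averages(annual_rates)
```

Combining the 1930s and 1940s, as the graph does, would just mean merging those two buckets before averaging.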
The average savings rate peaked in the 1970s at 11.8 percent. The rate then began a steady decline over the next three decades, falling to an average of 4.6 percent since the beginning of the century.
Why the shift? One possibility is generational. The 1950s through the 1970s were the peak earning years of the Greatest Generation, those who lived through the Great Depression and a world war. Saving for a rainy day was instilled in them by their life experiences. As well, in the post-war years, employer-provided pensions became the norm. With few current retirees relative to workers, pension fund balances rose steadily year after year.
The Baby Boom generation began to enter the workforce in the 1970s, just as the previous generation of workers began to retire. Thus savings in pensions and personal accounts by current workers was offset by retirees dipping into their retirement accounts to fund their consumption. Yet the new saving was not adequate to replace those withdrawals, because of another generational factor: Baby Boomers, who never faced the hardships of their parents and grandparents, were not as inclined to feel the need to save.
A second possibility for the decline in the savings rate is the rise during the 1980s of defined-contribution retirement plans in the form of 401(k)s and IRAs. These plans allowed people to put aside tax-deductible savings for retirement. BEA does not include gains in the value of stocks and bonds (including those in retirement funds) in its estimates of personal income until they are realized, for example when a person retires and dips into the account.
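This measurement rule matters for the argument that follows. A small numeric sketch, using entirely hypothetical household figures, shows how an unrealized market gain leaves the measured savings rate untouched even as the household's retirement balance grows:

```python
# Hypothetical household figures (illustrative only, not BEA data)
# showing why unrealized gains don't show up in the measured savings rate.
disposable_income = 100_000   # annual disposable income
cash_saving = 5_000           # actual deposits, e.g. 401(k) contributions
unrealized_gain = 10_000      # paper gain on stocks inside the account

# BEA counts only the cash saving against income:
measured_rate = 100 * cash_saving / disposable_income   # 5.0 percent

# But the household sees its balance grow by the full amount,
# which may make it feel it can afford to save less:
felt_accumulation = cash_saving + unrealized_gain       # 15,000
```

On these assumed numbers the measured rate is 5 percent, while the household perceives three times that much accumulation, which is the mechanism the next paragraph describes.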
During the 1980s and 1990s, the rise in the stock market greatly inflated the Boomers' retirement accounts. The robust gains in these accounts meant workers could save less and still achieve their retirement goals. This theory is contradicted, however, by the mediocre performance of the stock market since the peak of the dot-com bubble in the late 1990s. The markets have only just regained the values they achieved prior to the recession, and one would expect a rise in savings to compensate for the lack of returns on investments.
The low savings rate is a problem because Baby Boomers are reaching retirement age and are not being replaced by an equally large cohort of young workers. It would be less of a problem if the federal government were making up for the dearth of savings by running surpluses, but that is not the case. It would be worthwhile for individuals and politicians alike to take a page from the Greatest Generation and save more.