US Economic Recessions Since WWII—And How They Ended

A recession is commonly defined as a contraction in economic output lasting two quarters or more, as measured by gross domestic product (GDP). Starting with an eight-month slump in 1945, the U.S. economy has weathered 12 different recessions since World War II.
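To make the two-quarter rule concrete, here is a minimal sketch, using made-up GDP figures and assuming the pandas library, of how such back-to-back contractions could be flagged from quarterly data:

```python
import pandas as pd

# Hypothetical quarterly real GDP levels (billions of dollars); not actual data
gdp = pd.Series(
    [19000.0, 18800.0, 18600.0, 18700.0, 18900.0],
    index=pd.period_range("2019Q4", periods=5, freq="Q"),
)

growth = gdp.pct_change()        # quarter-over-quarter growth rate
contracting = growth < 0         # quarters with negative growth
# Two-quarter rule: a quarter is "in recession" if it and the prior quarter both contracted
in_recession = contracting & contracting.shift(1, fill_value=False)
print(in_recession)
```

(The NBER, whose dating is cited throughout this piece, uses a broader set of indicators than GDP alone, as discussed below.)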

On average, America’s post-war recessions have lasted only 10 months, while periods of expansion have lasted 57 months. Some economists predict that the COVID-19 pandemic will put an end to the longest period of economic expansion on record, which ran 128 months—more than a decade—from mid-2009 to early 2020.

February to October 1945: End of WWII

World War II was an economic boon for the U.S. economy as the government infused tens of billions of dollars into manufacturing and other industries to meet wartime needs. But with the surrender of both Germany and Japan in 1945, military contracts were slashed and soldiers started coming home, competing with civilians for jobs.

As government spending dried up, the economy dipped into a serious recession with GDP contracting by a whopping 11 percent. But the manufacturing sector adapted to peacetime conditions faster than expected and the economy righted itself in a tidy eight months. At its worst, the unemployment rate was only 1.9 percent.

November 1948 to October 1949: Post-War Consumer Spending Slows

When wartime rations and restrictions were lifted after WWII, American consumers rushed to catch up on years of pent-up purchases. From 1945 to 1949, American households bought 20 million refrigerators, 21.4 million cars, and 5.5 million stoves.

When the consumer spending boom began to level off in 1948, it triggered a “mild” 11-month recession in which GDP shrank by only 2 percent. Unemployment was up considerably, though, with all former GIs back in the job market. At its peak, unemployment reached 7.9 percent in October 1949.

July 1953 to May 1954: Post-Korean War Recession

This relatively short and mild recession followed the script of the post-WWII recession as heavy government military spending dried up after the end of the Korean War. During a 10-month contraction, GDP lost 2.2 percent and unemployment peaked around 6 percent.

The post-Korean War recession was exacerbated by the Federal Reserve’s monetary policy. As would happen in many future recessions, the Fed raised interest rates to combat high inflation caused by an influx of dollars into the wartime economy. The higher interest rates had the intended effect of slowing inflation, but also lowered confidence in the economy and undercut consumer demand.

In fact, one of the main reasons the recession was so short was that the Fed decided to lower interest rates back down in 1953.

August 1957 to April 1958: Asian Flu Pandemic

In 1957, an Asian Flu pandemic spread from Hong Kong across India and into Europe and the United States, sickening untold numbers and ultimately killing more than a million people worldwide. The illness also triggered a global recession that cut U.S. exports by more than $4 billion.

Again, the economic problems were compounded by the Fed raising interest rates to slow inflation, which had been on the rise throughout the 1950s. Consumer spending flagged and the U.S. economy sank into an eight-month recession during which GDP shrank by 3.3 percent and unemployment rose to 6.2 percent.

Dwight D. Eisenhower is credited with ending the short recession by boosting government spending on highway construction and other public infrastructure projects approved by the 1956 Federal Aid Highway Act.

April 1960 to February 1961: The Recession that Cost Nixon an Election

Just two years later, Richard M. Nixon was vice president when the nation sank into yet another recession. Nixon blamed the economic slump for his loss to John F. Kennedy in the 1960 presidential election.

There were two major causes of this 10-month recession, during which GDP declined 2.4 percent and unemployment reached nearly 7 percent. The first was what economists call a “rolling adjustment” in several major industries, most notably automobiles. Consumers started buying more compact foreign cars, and U.S. carmakers had to slash inventory and adjust to changing tastes, which meant a temporary reduction in profits.

The second cause was the Fed again, which raised interest rates fast on the heels of the previous recession in an ongoing effort to rein in inflation.

Not only did Nixon get the blame for starting the recession, but JFK took credit for ending it with a round of stimulus spending in 1961 and an expansion of Social Security and unemployment benefits.

December 1969 to November 1970: Putting the Brakes on 1960s Inflation

This extremely mild recession was another course correction engineered by the Fed under the Nixon administration. After the previous recession, the U.S. economy went on a decade-long expansion that saw inflation rise to over 5 percent in 1969.

In response, the Fed once again raised interest rates, which had the intended consequence of cooling the hot 1960s economy while only reducing GDP by 0.8 percent over an 11-month recession. Unemployment rose to 5.5 percent over the same period. When the Fed lowered rates again in 1970, the economy cranked back into growth mode.

November 1973 to March 1975: The Oil Embargo

This recession marked the longest economic slump since the Great Depression and was caused by a perfect storm of bad economic news.

First, there was the Oil Embargo of 1973, imposed by the Organization of the Petroleum Exporting Countries (OPEC). With the oil supply restricted, gas prices soared and Americans cut spending elsewhere.

At the same time, Nixon tried to reduce inflation by instituting price and wage freezes in major U.S. industries. Unfortunately, companies were forced to lay off workers in order to afford the new wage levels, which still weren’t high enough for consumers to afford the newly fixed prices.

The result was “stagflation,” a stagnant economy with high inflation and low consumer demand, and a recession that spanned five consecutive negative-growth quarters. In all, the 16-month recession saw a 3.4 percent reduction in GDP and a near doubling of the unemployment rate to 8.8 percent.

The Fed had no choice but to lower interest rates to end the recession, but that set the stage for the truly runaway inflation of the late 1970s.

January to July 1980: Second Energy Crisis and Inflation Recession

Oil prices skyrocketed again in 1979, driven by disruptions to the oil supply during the Iranian Revolution and increased global oil demand. This led to high prices and long lines at the gas pump in the United States.

Meanwhile, inflation had grown to a staggering 13.5 percent and the Fed had no choice but to raise interest rates, which put the brakes on the booming late 1970s economy. The result was the shortest post-WWII recession up to that point—just six months start to finish—in which GDP declined only 1.1 percent but unemployment ratcheted up to 7.8 percent.

July 1981 to November 1982: Double Dip Recession

This far more painful recession came close on the heels of the short 1980 recession, introducing Americans to the phrase “double-dip recession.”

For the third time in a decade, one of the recessionary triggers was an oil crisis. The Iranian Revolution was over, but the new regime under Ayatollah Khomeini continued to export oil inconsistently and at lower levels, keeping gas prices high.

At the same time, the Fed’s timid interest rate hikes in 1980 weren’t enough to slow inflation, so Fed chief Paul Volcker pushed interest rates to new heights—21.5 percent in 1982. The sky-high rate pulled inflation down, but took its toll on the economy, which shrank by 3.6 percent during the 16-month recession and saw unemployment peak at over 10 percent.

This long and deep recession finally ended following a combination of tax cuts and defense spending under Ronald Reagan.

July 1990 to March 1991: S&L Crisis and Gulf War Recession

A host of factors led to the economic slowdown of the early 1990s. One was the failure of thousands of Savings & Loan institutions in the late 1980s, which hit the mortgage lending market particularly hard. Fewer mortgages meant record-low levels of new construction, which had far-reaching effects across the economy.

While that may have been enough to send the economy into recession, Saddam Hussein of Iraq invaded neighboring Kuwait, a major oil producer. The ensuing Gulf War caused oil prices to more than double. Adding to the economic woes was the October 1989 “mini-crash” of the stock market.

The result was an eight-month recession that saw GDP decline by 1.5 percent and unemployment peak at 6.8 percent. Even when the recession officially ended in 1991, it was followed by several quarters of very slow growth.

March to November 2001: The Dot-Com Crash and 9/11

Irrational exuberance is blamed for the stock market bubble that formed around internet startups in the late 1990s and 2000. Investors pumped money into unproven businesses, artificially inflating their values to unsustainable levels. When the dot-com bubble finally burst in 2001, the tech-heavy Nasdaq lost 75 percent of its value and hordes of investors went belly up.

While the tech sector took a devastating hit, the rest of the economy stumbled along until the September 11th terrorist attacks knocked it down for good. The early 2000s were also marked by high-profile corporate accounting scandals at Enron and poor stock market returns. The S&P 500 lost 43 percent of its value from 2000 to 2002.

For all the damage the dot-com crash did to a generation of investors, the 2001 recession itself was relatively fast and shallow, with GDP down only 0.3 percent overall and unemployment peaking at 5.5 percent.

The economy was able to pull out of the 2001 recession on the strength of the housing sector, which experienced growth even during the recession thanks to low interest rates.

December 2007 to June 2009: The Great Recession

The longest and most calamitous economic downturn since the Great Depression, the Great Recession was part of a global financial meltdown triggered by the collapse of the U.S. housing bubble.

The Great Recession was the result of a financial house of cards built on the subprime mortgage market. Large financial institutions invested heavily in mortgage-backed securities. When homeowners defaulted on those high-risk mortgages, not only did they lose their homes, but huge investment banks like Bear Stearns and Lehman Brothers teetered on the verge of collapse.

The dual housing and banking crises sent shockwaves through the stock market, and major indices like the S&P 500 and Dow Jones Industrial Average lost half of their value, gutting the retirement accounts of millions of Americans.

During the agonizing 18-month recession, unemployment reached as high as 10 percent and GDP shrank by a whopping 4.3 percent. The economy only turned around after massive government stimulus spending (more than $1.5 trillion) to prop up the failing banks and inject capital into the shell-shocked economy.


V. A Brief History of Employment Trends in Recessions and Recoveries

Have the employment trends for men and women in the Great Recession and the subsequent recovery been different from those in past episodes? The answer is, during the recession, no, but during the recovery, yes.

Since 1969, six episodes of recession and recovery have occurred. The recessions preceding the Great Recession lasted from December 1969 to November 1970, November 1973 to March 1975, January 1980 to November 1982 (two episodes counted as one in this report), July 1990 to March 1991, and March 2001 to November 2001. (The dates shown in the accompanying figure are slightly different because employment changes are computed from one month before the start of the recession to the last month of the recession.)

In all recessions since 1969, including the Great Recession, employment outcomes for men were worse than for women. However, the recovery from the Great Recession is the first since 1970 in which, two years after the recession ended, women have lost jobs while men have gained jobs. It is also the first recovery in which the unemployment rate for men has fallen even as the unemployment rate for women has risen.

Men have experienced job losses in all recessions since 1969. The most severe job losses for men occurred during the recessions from January 1980 to November 1982, when their employment fell 5.9%, and during the Great Recession, which caused a 7.6% decline in men’s employment.

Women did not always lose jobs in recessions—their employment increased during the 1969-70, 1973-75 and 1980-82 recessions. That is a consequence of the rapid movement of women from the home to the office in those decades. The labor force participation rate for women increased from 43.3% in 1970 to 51.5% in 1980, and then again to 57.5% by 1990. The rise in the number of women in the labor force was sufficient to mask the effects of the recessions on their employment level. As the labor force participation rate stabilized for women in the 1990s, they, too, experienced drops in employment during recessions.

Employment trends during the first two years of economic recoveries reveal rapid gains in jobs for both genders during the 1970s and the 1980s. In part, that is because many jobs lost in those recessions were a consequence of temporary layoffs, restored quickly as the economic climate improved. In more recent recessions, especially in the Great Recession, the role of temporary layoffs diminished and permanent reductions in workforces were the more significant force behind rising unemployment. Thus, recoveries from the 1991 recession and the recessions in this decade are associated with sluggish gains in employment.

Regardless of the rates at which jobs returned after recessions, job growth for women outpaced job growth for men in the recoveries in 1970-72, 1975-77, 1982-84 and 1991-93. Both men and women continued to lose jobs in the two years following the 2001 recession, but women lost them at a slower rate. It is only in the recovery from the Great Recession that women have lost jobs while men have gained them.

Trends in the unemployment rate reaffirm the unique nature of the recovery from the Great Recession. Historically, the unemployment rate for men has tended to increase more than for women during recessions.

During economic recoveries, the unemployment rate for men and women has not necessarily fallen, but it has mostly moved in the same direction. In the 1982-84 recovery, the unemployment rate for men decreased 4.0 percentage points and the rate for women fell 2.5 percentage points. In the 2001-03 recovery, the unemployment rate rose 0.5 percentage points for men and 0.3 percentage points for women. The current recovery is the first in which the unemployment rates for men and women have gone in opposite directions—falling for men (1.1 percentage points) but rising for women (0.2 percentage points).


The sharp decline in economic activity in February marked the end of the longest expansion in the U.S. since at least 1854, according to the National Bureau of Economic Research. Here are expansions compared with G.D.P. since the end of WWII.

Notes: Data are quarterly changes in gross domestic product, seasonally adjusted at annual rates, and the duration of business cycle expansions in months.

Sources: Bureau of Economic Analysis; National Bureau of Economic Research

WASHINGTON — The United States economy officially entered a recession in February 2020, the committee that calls downturns announced on Monday, bringing the longest expansion on record to an end as the coronavirus pandemic caused economic activity to slow sharply.

The economy hit its peak in February and has since fallen into a downturn, the National Bureau of Economic Research’s Business Cycle Dating Committee said. A recession begins when the economy reaches a peak of activity and ends when it reaches its trough.

This downturn is the first since 2009, when the last recession ended, and marks the end of the longest expansion — 128 months — in records dating back to 1854. Most economists expect this recession to be both particularly deep and exceptionally short, perhaps just a few months, as states reopen and economic activity resumes.

The National Bureau of Economic Research, a nonprofit group that tracks economic cycles in the United States, noted the unusual circumstances surrounding the slump in its announcement.

“The committee recognizes that the pandemic and the public health response have resulted in a downturn with different characteristics and dynamics than prior recessions,” the group said. “Nonetheless, it concluded that the unprecedented magnitude of the decline in employment and production, and its broad reach across the entire economy, warrants the designation of this episode as a recession, even if it turns out to be briefer than earlier contractions.”

Many economists believe the United States may already have exited the recession — or at least be on its way out.

Robert Gordon, a Northwestern University economist and a member of the dating committee, said that he would bet a recovery started in April or May, meaning that the recession would most likely last for only a couple of months. Even so, he said, labeling it a downturn was not a hard choice “because of the extraordinary depth.”

“There’s no way you can observe that happening and not call it a recession,” he said, while acknowledging that it was a very unusual one. “Nothing like it has ever happened.”

The National Bureau of Economic Research formally dates business cycles based on a range of economic markers, most importantly gross domestic product and employment.

Economic activity in the United States began to contract sharply at the very end of February and into early March as the coronavirus spread across major metropolitan areas, like New York City, Chicago and Atlanta. Shops closed, travelers canceled flights and diners began avoiding restaurants, even before some states issued formal stay-at-home orders.

Real-time economic gauges, like a series on Chase credit card spending produced by J.P. Morgan, show that spending pulled back sharply in early March and has gradually rebounded since late April. Even so, spending remains well below pre-crisis levels.

The unemployment rate, a crucial gauge of economic health and an important input to business cycle dating, began to rise in March before jumping to 14.7 percent in April. It eased slightly to 13.3 percent in May, data released last week showed, but that is higher than the peak jobless rate in the Great Recession.

“We’ve already seen signs that the economy is past the trough and is in the recovery phase,” said Matthew Luzzetti, the chief U.S. economist at Deutsche Bank Securities. But there are differences between the overall level of output and the period-to-period change because the former is likely to remain depressed for some time, even as the latter bounces back.

Economists in a Bloomberg survey expect output to contract by 9.7 percent in the second quarter compared with the same period last year, followed by a 6.8 percent contraction in the third quarter relative to the third quarter of 2019.

Looking at a commonly used annualized rate, which states the numbers so that they are easily comparable from period to period, output is expected to contract at a 34 percent rate in the second quarter before bouncing back at a 15 percent pace in the third.
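For readers unfamiliar with the convention, the annualized rate compounds a single quarter-over-quarter change over four quarters (a different comparison from the year-over-year figures in the survey above). A quick illustrative calculation, using a hypothetical quarterly decline of about 9.9 percent:

```python
def annualized_rate(quarterly_change: float) -> float:
    """Compound one quarter's growth rate over four quarters (the annualized-rate convention)."""
    return (1 + quarterly_change) ** 4 - 1

# A hypothetical quarter-over-quarter drop of 9.9 percent compounds to roughly -34 percent annualized
print(f"{annualized_rate(-0.099):.1%}")  # about -34.1%
```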

“It’s going to take longer to recover the level of activity, even though the growth rate is strong,” Mr. Luzzetti said.

The global economy as a whole will experience its deepest recession since World War II this year, according to a World Bank forecast released on Monday. Global output will shrink by 5.2 percent, the institution said, warning that while growth is likely to rebound in 2021, a more protracted pandemic that leads to a breakdown in financial markets and global trade could darken the outlook.


The Great Recession of 2007 significantly disrupted the U.S. economy and its banking sector. It is often compared to the Great Depression as an example of its severity, and with two years of negative year-over-year gross domestic product (GDP) growth (2008 and 2009), around 500 bank failures over a six-year period (2008–13), and an unemployment rate reaching an annual average of 9.6 percent (2010), comparisons to the Great Depression seem appropriate (see the chart).

As a result of the recent crisis, both governmental and supervisory agencies have enacted new laws and regulations in an attempt to help mitigate risks of another crisis-like event. Additionally, banks—especially large banks with $50 billion or more in total assets—must now provide additional details on their various holdings and operations. Although the relationship among the economy, regulations, and bank data may seem novel and uneven, a long history actually exists among them.

The article explores the historical relationship between the economy, banking regulations, and bank data in order to highlight how one relates to the others. Additionally, as regulators increase their focus on data-driven supervision, this article also discusses the role of banking data in the supervisory process.

The timeline (below) shows key banking, economic, and regulatory events from 1782 to today. The light blue timeline represents events related to bank data and includes key bank reporting milestones along with banking data providers and examples of bank rating coverage by nationally recognized statistical ratings organizations (NRSROs). The light red timeline focuses on the U.S. economy and highlights periods of recession or depression, along with banking crises. The light brown timeline highlights key banking regulations, examinations, and industry events.

A full examination of each event is beyond the scope of this article. However, visualizing the various events on this timeline shows that they tend to fall into roughly three, sometimes overlapping, periods that, for the purposes of this article, we'll call the Crises Era, the Regulatory Era, and the Bank Data Era.

The Crises Era (1782–1930)

The history of the U.S. banking system arguably began with the very first chartered bank of the United States, the Bank of North America. Founded by the national government toward the end of the American Revolution, the Bank of North America opened in 1782 in Philadelphia and provided credit to the newly formed government. The creation of the bank came at a time of extreme economic uncertainty, however, as the war's disruption of commerce and the nation's increased debt load caused concerns about government's ability to meet its debt obligations.

The post–Revolutionary War period was also marked by multiple economic recessions and banking crises. Attempts to establish a permanent central bank failed after the First, and then the Second, Bank of the United States charters were allowed to expire. This period was known as the Free Banking Era, a time of extremely loose banking regulation in which a formal banking charter was unnecessary to establish a bank (banks also issued currency/notes, typically secured by bonds issued by the state in which the bank was located).

During this time, bank data were very sparse. Beginning in the early 1800s, banks were required to provide a one-time report on certain balance sheet items. By 1832, Congress passed a resolution that allowed the Treasury to collect an annual statement of condition from banks. Many states were already collecting these data annually, as the states were majority stakeholders in the banks.

The Free Banking Era ended with the National Banking Act of 1864, which established the Office of the Comptroller of the Currency (OCC). With the OCC, bank reporting started to evolve, as national banks began reporting assets, liabilities, and income on a quarterly basis. Additionally, the OCC began to conduct regular exams of both national and state banks. Although the establishment of the OCC was a positive step in creating a more stable banking system, it occurred near the end of the Civil War (1861–65), which presented sizable obstacles to achieving banking and economic security.

The banking panic of 1907 led to the creation of the Federal Reserve in 1913 and the beginning of what we will call the Regulatory Era. In addition to its central bank duties, the Federal Reserve was charged with supervising state member banks. The combined requirements of the OCC and the Fed led to more consistent reporting across banks of different charters. The Federal Reserve also created the Federal Reserve Bulletin, an aggregated report of banking industry performance.

Finally, the last event of the Crises Era—and arguably one of the most severe events—was Black Thursday (1929), which led to the Great Depression. In all, there were 79 quarterly recessions between the years 1854 and 1932, representing almost 75 percent of all recorded U.S. recessions (see the chart). With so many recessions and banking crises, in addition to the fact that for part of that time the United States was engaged in, or recovering from, wars on its own soil, the term “Crises Era” seems particularly fitting.

The Regulatory Era (1913–80)

Following the Federal Reserve Act, the next significant banking law was the Banking Act of 1933, also known as the Glass-Steagall Act. Glass-Steagall accomplished many things, including giving the Federal Reserve additional regulatory powers and prohibiting banks from engaging in investment banking activities. It also created the Federal Deposit Insurance Corporation (FDIC), which received authority to provide deposit insurance to commercial banks and supervisory oversight of state nonmember banks. The Federal Credit Union Act of 1934, which established federal credit unions, and the Banking Act of 1935, which permanently established the FDIC as a governmental agency, soon followed.

The next major set of banking laws was designed to address more consumer-oriented issues. The Truth in Lending Act (1968), for example, was established to help protect consumers from unfair credit practices. The 1970s saw passage of the Housing & Community Development Act, the Home Mortgage Disclosure Act (HMDA), and the Community Reinvestment Act (CRA), with all three laws aimed at expanding credit to potentially underserved communities.

During this time, there were many important developments in bank reporting that laid the groundwork for the growth in bank data to come. In 1959, the more modern version of the Call Report was established. (Machine-readable data also began in 1959.) In 1966, the FDIC began reporting its annual Summary of Deposits, which lists the location of a bank's branches along with branches' deposit levels. The FR Y-9C—the Call Report for bank holding companies—began in 1978. In 1980, as a result of the passage of HMDA, home mortgage disclosure data became available.

To be sure, some recessions occurred after the Great Depression and between 1950 and 1980, but they were generally less severe, and often of shorter duration, than those witnessed during the Crises Era. Banking panics were also essentially eliminated, primarily as a result of FDIC deposit insurance, and bank failures overall during this time averaged only five a year.

The Bank Data Era (1980–present)

The period roughly beginning in 1980 can be referred to as the Bank Data Era, and it's easy to see why. Although significant banking legislation has passed since 1980, this is a period marked by advances in bank data. Most of the light blue, unlabeled markers in the above timeline during this period represent major changes to the Call Report (typically of 100 items or more). From a bank analysis perspective, one of the more notable developments was the creation of the Uniform Bank Performance Report (UBPR) in 1984. Supervisors still commonly use the UBPR to help assess bank performance. In addition to providing a consistent and repeatable platform for analysis, another benefit of the UBPR is its public availability.

The Bank Data Era is also a time when the connection between regulatory and economic events and bank reporting becomes more obvious. Examples of these relationships include the establishment of Basel I in 1988 and the subsequent addition of risk-based capital items to the Call Report in 1990. The savings and loan crisis of the 1980s and early 1990s resulted in the second-largest number of bank failures since the Great Depression, and because of concerns over bank exposure to mortgage lending, in 1991 the Call Report was amended to include additional details on real estate lending. More recently, the Dodd-Frank Act of 2010 led to possibly one of the largest data collection efforts by the Federal Reserve through the collection of FR Y-14 Comprehensive Capital Analysis and Review (CCAR), FR Y-16 Dodd-Frank Act (DFA), and FR 2052 Liquidity Monitoring Report data.

Next Phase: The Data-Driven Era

As we have discussed, bank data have evolved alongside both the U.S. economy and banking system. What is also interesting is that the growth in banking data is similar to the growth associated with technology and global data overall. Given this growth and the increased use of technological tools and techniques, regulators have become increasingly focused on the inclusion of data-driven analysis in supervision.

The Federal Reserve's increased attention to the role of data in the supervisory process is highlighted in two recent SR letters. SR 12-17, Consolidated Supervision Framework for Large Financial Institutions, lays out a framework for large institution supervision. Part B.4 describes using various bank data to identify risks to a firm and for overall systemic risk analysis. More recently, and more directly, in SR 15-16, Enhancements to the Federal Reserve System's Surveillance Program, the Federal Reserve provides a brief description of its surveillance program, which uses forward-looking methodologies to help assess bank risk.

Examples of data-driven models

So what does a data-driven supervisory or bank risk model look like? The answer varies depending on what risk, or risks, are being evaluated. As described in SR 15-16, the Fed's SR-SABR model is a logistic regression that predicts the probability of an overall downgrade in a CAMELS (Capital adequacy, Asset quality, Management, Earnings, Liquidity, and Sensitivity to market risk) rating. It also produces a probability of firm failure through its Viability score. Other examples of bank health models that are available in the marketplace include credit ratings from firms like S&P, Moody's, and Fitch, and other CAMELS-type assessment models from IDC, Veribanc, BauerFinancial, and Kroll.
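As a toy illustration of that general model class (and not the Fed's actual SR-SABR specification, inputs, or coefficients), here is a small logistic regression fit on made-up Call Report-style ratios, assuming NumPy and scikit-learn are available:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up features per bank: [tier 1 capital ratio, nonperforming loan ratio, return on assets]
X = np.array([
    [0.12, 0.01,  0.012],
    [0.08, 0.06, -0.004],
    [0.10, 0.03,  0.006],
    [0.07, 0.08, -0.010],
    [0.13, 0.02,  0.010],
    [0.09, 0.05,  0.000],
])
# Made-up labels: 1 = composite CAMELS rating downgraded within the following year
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Estimated downgrade probability for a hypothetical bank
print(model.predict_proba([[0.09, 0.04, 0.002]])[0, 1])
```

The point is simply that the output is a probability between 0 and 1, which supervisors can rank across banks and track over time.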

Along with an assessment of the overall health of a bank, data-driven models can be focused on a particular area of banking risk. One such set of risk measures is the OCC's Canary Report, which seeks to identify risks related specifically to credit risk, interest rate risk, and liquidity risk. Concerning interest rate risk, just as bank managers implement various interest rate risk models to manage market and repricing risk, supervisors use similar models to help with their assessment of these risks. Supervisors have for years used internally developed Earnings at Risk (EAR) and Economic Value of Equity (EVE) models, typically based on a bank's Call Report data. The growth in and availability of loan, deposit, and interest rate data have improved the performance of these models by providing model developers with better details on rate-sensitive product volumes and pricing. Additionally, improved computing power has enabled the creation of more econometric-based models that can generate a prediction of net interest margin performance based on interest rates, asset yields, and deposit costs.
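The gap-based intuition behind a simple Earnings at Risk estimate can be sketched in a few lines. The balance sheet figures and rate shock below are hypothetical, and actual supervisory and bank-run models are considerably more granular:

```python
# Simplified repricing-gap Earnings at Risk (EaR) over a one-year horizon:
# the change in net interest income from an instantaneous parallel rate shock.
rate_sensitive_assets = 800.0       # $ millions repricing within one year (hypothetical)
rate_sensitive_liabilities = 650.0  # $ millions repricing within one year (hypothetical)
rate_shock = 0.02                   # +200 basis point parallel shift in rates

repricing_gap = rate_sensitive_assets - rate_sensitive_liabilities
change_in_net_interest_income = repricing_gap * rate_shock

print(f"Estimated change in net interest income: ${change_in_net_interest_income:.1f} million")
```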

The next frontier of data-driven bank analysis will certainly use big data technologies and analytics. For example, the Office of Inspector General for the Board of Governors of the Federal Reserve System and the Consumer Financial Protection Bureau hosts the Risk Assessment, Data Analysis and Research (RADAR) data warehouse, which stores a wide array of data on various credit-related products. The Kansas City Fed has the Center for the Advancement of Data and Research in Economics, which hosts the High-Performance Computing cluster. Together with the current systems, all these data lay a foundation for big data analysis, and tools such as SAS, R, and Python are starting to help researchers mine that data for bank-specific risks and industry trends.

Summing it up

Bank data have evolved from being a one-time reporting of bank capital to being intricately integrated into bank supervision and economic analysis. The type and purpose of bank data have also historically reflected the state of the banking system, regulatory environment, and economy. If this trend continues, as we believe it will, then the tools and techniques of data-driven bank analysis will start to more closely mirror those being used for other big data analysis and will hopefully be able to mitigate risks—both at the bank level and systemically—and provide useful insight into how the banking system works.

Dean Anderson
Senior technical expert in the Atlanta Fed's supervision and regulation division



The United States might claim a broader democracy than those that prevailed in Europe. On the other hand, European states mobilized their populations with an efficiency that dazzled some Americans (notably Theodore Roosevelt) and appalled others (notably Wilson). The magazine founded by pro-war intellectuals in 1914, The New Republic, took its title precisely because its editors regarded the existing American republic as anything but the hope of tomorrow.

Yet as World War I entered its third year—and the first year of Tooze’s story—the balance of power was visibly tilting from Europe to America. The belligerents could no longer sustain the costs of offensive war. Cut off from world trade, Germany hunkered into a defensive siege, concentrating its attacks on weak enemies like Romania. The Western allies, and especially Britain, outfitted their forces by placing larger and larger war orders with the United States. In 1916, Britain bought more than a quarter of the engines for its new air fleet, more than half of its shell casings, more than two-thirds of its grain, and nearly all of its oil from foreign suppliers, with the United States heading the list. Britain and France paid for these purchases by floating larger and larger bond issues to American buyers—denominated in dollars, not pounds or francs. “By the end of 1916, American investors had wagered two billion dollars on an Entente victory,” computes Tooze (relative to America’s estimated GDP of $50 billion in 1916, the equivalent of $560 billion in today’s money).

That staggering quantity of Allied purchases called forth something like a war mobilization in the United States. American factories switched from civilian to military production; American farmers planted food and fiber to feed and clothe the combatants of Europe. But unlike in 1940-41, the decision to commit so much to one side’s victory in a European war was not a political decision by the U.S. government. Quite the contrary: President Wilson wished to stay out of the war entirely. He famously preferred a “peace without victory.” The trouble was that by 1916, the U.S. commitment to Britain and France had grown—to borrow a phrase from the future—too big to fail.

Tooze’s portrait of Woodrow Wilson is one of the most arresting novelties of his book. His Wilson is no dreamy idealist. The president’s animating idea was an American exceptionalism of a now-familiar but then-startling kind. His Republican opponents—men like Theodore Roosevelt, Henry Cabot Lodge, and Elihu Root—wished to see America take its place among the powers of the earth. They wanted a navy, an army, a central bank, and all the other instrumentalities of power possessed by Britain, France, and Germany. These political rivals are commonly derided as “isolationists” because they mistrusted Wilson’s League of Nations project. That’s a big mistake. They doubted the League because they feared it would encroach on American sovereignty. It was Wilson who wished to remain aloof from the Entente, who feared that too close an association with Britain and France would limit American options. This aloofness enraged Theodore Roosevelt, who complained that the Wilson-led United States was “sitting idle, uttering cheap platitudes, and picking up [European] trade, whilst they had poured out their blood like water in support of ideals in which, with all their hearts and souls, they believe.” Wilson was guided by a different vision: Rather than join the struggle of imperial rivalries, the United States could use its emerging power to suppress those rivalries altogether. Wilson was the first American statesman to perceive that the United States had grown, in Tooze’s words, into “a power unlike any other. It had emerged, quite suddenly, as a novel kind of ‘super-state,’ exercising a veto over the financial and security concerns of the other major states of the world.”

Wilson hoped to deploy this emerging super-power to enforce an enduring peace. His own mistakes and those of his successors doomed the project, setting in motion the disastrous events that would lead to the Great Depression, the rise of fascism, and a second and even more awful world war.

What went wrong? “When all is said and done,” Tooze writes, “the answer must be sought in the failure of the United States to cooperate with the efforts of the French, British, Germans and the Japanese [leaders of the early 1920s] to stabilize a viable world economy and to establish new institutions of collective security. … Given the violence they had already experienced and the risk of even greater future devastation, France, Germany, Japan, and Britain could all see this. But what was no less obvious was that only the US could anchor such a new order.” And that was what Americans of the 1920s and 1930s declined to do—because doing so implied too much change at home for them: “At the hub of the rapidly evolving, American-centered world system there was a polity wedded to a conservative vision of its own future.”

President Woodrow Wilson (far right) stands with other leaders of the Council of Four at the Paris Peace conference in 1919. (Wikipedia)

Periodically, attempts have been made to rehabilitate the American leaders of the 1920s. The most recent version, James Grant’s The Forgotten Depression, 1921: The Crash That Cured Itself, was released just two days before The Deluge: Grant, an influential financial journalist and historian, holds views so old-fashioned that they have become almost retro-hip again. He believes in thrift, balanced budgets, and the gold standard; he abhors government debt and Keynesian economics. The Forgotten Depression is a polemic embedded within a narrative, an argument against the Obama stimulus joined to an account of the depression of 1920-21.

As Grant correctly observes, that depression was one of the sharpest and most painful in American history. Total industrial production may have dropped by 30 percent. Unemployment spiked at perhaps close to 12 percent (accurate joblessness statistics don’t exist for this period). Overall, prices plummeted at the steepest rate ever recorded—steeper than in 1929-33. Then, after 18 months of extremely hard times, the economy lurched into recovery. By 1923, the U.S. had returned to full employment.

Grant presents this story as a laissez-faire triumph. Wartime inflation was halted. Borrowing and spending gave way to saving and investing. Recovery then occurred naturally, without any need for government stimulus. “The hero of my narrative is the price mechanism, Adam Smith’s invisible hand,” he notes. “In a market economy, prices coordinate human effort. They channel investment, saving and work. High prices encourage production but discourage consumption; low prices do the opposite. The depression of 1920-21 was marked by plunging prices, the malignity we call deflation. But prices and wages fell only so far. They stopped falling when they became low enough to entice consumers into shopping, investors into committing capital and employers into hiring. Through the agency of falling prices and wages, the American economy righted itself.” Reader, draw your own comparisons!

Grant’s argument is not new. The libertarian economist Murray Rothbard argued a similar case in his 1963 book, America’s Great Depression. The Rothbardian story of the “good” depression of 1920 has resurfaced from time to time in the years since, most spectacularly when Fox News star Glenn Beck seized upon it as proof that the Obama stimulus was wrong and dangerous. Grant tells the story with more verve and wit than most, and with a better eye for incident and character. But the central assumption of his version of events is the same one captured in Rothbard’s title half a century ago: that America’s economic history constitutes a story unto itself.

America's "forgotten depression" through the lens of Dow Jones industrial averages from 1918 to 1923 (Wikipedia)

Widen the view, however, and the “forgotten depression” takes on a broader meaning as one of the most ominous milestones on the world’s way to the Second World War. After World War II, Europe recovered largely as a result of American aid; the nation that had suffered least from the war contributed most to reconstruction. But after World War I, the money flowed the other way.

Take the case of France, which suffered more in material terms than any World War I belligerent except Belgium. Northeastern France, the country’s most industrialized region in 1914, had been ravaged by war and German occupation. Millions of men in their prime were dead or crippled. On top of everything, the country was deeply in debt, owing billions to the United States and billions more to Britain. France had been a lender during the conflict too, but most of its credits had been extended to Russia, which repudiated all its foreign debts after the Revolution of 1917. The French solution was to exact reparations from Germany.

Britain was willing to relax its demands on France. But it owed the United States even more than France did. Unless it collected from France—and from Italy and all the other smaller combatants as well—it could not hope to pay its American debts.

Americans, meanwhile, were preoccupied with the problem of German recovery. How could Germany achieve political stability if it had to pay so much to France and Belgium? The Americans pressed the French to relent when it came to Germany, but insisted that their own claims be paid in full by both France and Britain.

Germany, for its part, could only pay if it could export, and especially to the world’s biggest and richest consumer market, the United States. The depression of 1920 killed those export hopes. Most immediately, the economic crisis sliced American consumer demand precisely when Europe needed it most. True, World War I was not nearly as positive an experience for working Americans as World War II would be; between 1914 and 1918, for example, wages lagged behind prices. Still, millions of Americans had bought billions of dollars of small-denomination Liberty bonds. They had accumulated savings that could have been spent on imported products. Instead, many used their savings for food, rent, and mortgage interest during the hard times of 1920-21.

But the gravest harm done by the depression to postwar recovery lasted long past 1921. To appreciate that, you have to understand the reasons why U.S. monetary authorities plunged the country into depression in 1920.

Grant rightly points out that wars are usually followed by economic downturns. Such a downturn occurred in late 1918-early 1919. “Within four weeks of the … Armistice, the [U.S.] War Department had canceled $2.5 billion of its then outstanding $6 billion in contracts; for perspective, $2.5 billion represented 3.3 percent of the 1918 gross national product,” he observes. Even this understates the shock, because it counts only Army contracts, not Navy ones. The postwar recession checked wartime inflation, and by March 1919, the U.S. economy was growing again.

As the economy revived, workers scrambled for wage increases to offset the price inflation they’d experienced during the war. Monetary authorities, worried that inflation would revive and accelerate, made the fateful decision to slam the credit brakes, hard. Unlike the 1918 recession, that of 1920 was deliberately engineered. There was nothing invisible about it. Nor did the depression “cure itself.” U.S. officials cut interest rates and relaxed credit, and the economy predictably recovered—just as it did after the similarly inflation-crushing recessions of 1974-75 and 1981-82.

But 1920-21 was an inflation-stopper with a difference. In post-World War II America, anti-inflationists have been content to stop prices from rising. In 1920-21, monetary authorities actually sought to drive prices back to their pre-war levels. They did not wholly succeed, but they succeeded well enough. One price especially concerned them: In 1913, a dollar bought a little less than one-twentieth of an ounce of gold; by 1922, it comfortably did so again.

James Grant hails this accomplishment. Adam Tooze forces us to reckon with its consequences for the rest of the planet.

Every other World War I belligerent had quit the gold standard at the beginning of the war. As part of their war finance, they accepted that their currency would depreciate against gold. The currencies of the losers depreciated much more than those of the winners; among the winners, the currency of Italy depreciated more than that of France, and France’s more than Britain’s. Yet even the mighty pound lost almost one-fourth of its value against gold. At the end of the conflict, every national government had to decide whether to return to the gold standard and, if so, at what rate.

The American depression of 1920 made that decision all the more difficult. The war had vaulted the United States to a new status as the world’s leading creditor, the world’s largest owner of gold, and, by extension, the effective custodian of the international gold standard. When the U.S. opted for massive deflation, it thrust upon every country that wished to return to the gold standard (and what respectable country would not?) an agonizing dilemma. Return to gold at 1913 values, and you would have to match U.S. deflation with an even steeper deflation of your own, accepting increased unemployment along the way. Alternatively, you could re-peg your currency to gold at a diminished rate. But that amounted to an admission that your money had permanently lost value—and that your own people, who had trusted their government with loans in local money, would receive a weaker return on their bonds than American creditors who had lent in dollars.

Britain chose the former course; pretty much everybody else chose the latter.

The consequences of these choices fill much of the second half of The Deluge. For Europeans, they were uniformly grim, and worse. But one important effect ultimately rebounded on Americans. America’s determination to restore a dollar “as good as gold” not only imposed terrible hardship on war-ravaged Europe, it also threatened to flood American markets with low-cost European imports. The flip side of the Lost Generation enjoying cheap European travel with their strong dollars was German steelmakers and shipyards underpricing their American competitors with weak marks.

Such a situation also prevailed after World War II, when the U.S. acquiesced in the undervaluation of the Deutsche mark and yen to aid German and Japanese recovery. But American leaders of the 1920s weren’t willing to accept this outcome. In 1921 and 1923, they raised tariffs, terminating a brief experiment with freer trade undertaken after the election of 1912. The world owed the United States billions of dollars, but the world was going to have to find another way of earning that money than selling goods to the United States.

That way was found: more debt, especially more German debt. The 1923 hyper-inflation that wiped out Germany’s savers also tidied up the country’s balance sheet. Post-inflation Germany looked like a very creditworthy borrower. Between 1924 and 1930, world financial flows could be simplified into a daisy chain of debt. Germans borrowed from Americans, and used the proceeds to pay reparations to the Belgians and French. The French and Belgians, in turn, repaid war debts to the British and Americans. The British then used their French and Italian debt payments to repay the United States, who set the whole crazy contraption in motion again. Everybody could see the system was crazy. Only the United States could fix it. It never did.

Peter Heather, the great British historian of Late Antiquity, explains human catastrophes with a saying of his father’s, a mining engineer: “If man accumulates enough combustible material, God will provide the spark.” So it happened in 1929. The Deluge that had inundated the rest of the developed world roared back upon the United States.

The Great Depression overturned parliamentary governments throughout Europe and the Americas. Yet the dictatorships that replaced them were not, as Tooze emphasizes in The Wages of Destruction, reactionary absolutisms of the kind re-established in Europe after Napoleon. These dictators aspired to be modernizers, and none more so than Adolf Hitler.

From left to right, Britain's Neville Chamberlain, France's Édouard Daladier, Germany's Adolf Hitler, and Italy's Benito Mussolini and Count Ciano prepare to sign the Munich Agreement in 1938. (Wikipedia)

“The United States has the Earth, and Germany wants it.” Thus might Hitler’s war aims have been summed up by a latter-day Woodrow Wilson. From the start, the United States was Hitler’s ultimate target. “In seeking to explain the urgency of Hitler’s aggression, historians have underestimated his acute awareness of the threat posed to Germany, along with the rest of the European powers, by the emergence of the United States as the dominant global superpower,” Tooze writes. “The originality of National Socialism was that, rather than meekly accepting a place for Germany within a global economic order dominated by the affluent English-speaking countries, Hitler sought to mobilize the pent-up frustrations of his population to mount an epic challenge to this order.” Of course, Hitler was not engaged in rational calculation. He could not accept subordination to the United States because, according to his lurid paranoia, “this would result in enslavement to the world Jewish conspiracy, and ultimately race death.” He dreamed of conquering Poland, Ukraine, and Russia as a means of gaining the resources to match those of the United States. The vast landscape in between Berlin and Moscow would become Germany’s equivalent of the American west, filled with German homesteaders living comfortably on land and labor appropriated from conquered peoples—a nightmare parody of the American experience with which to challenge American power.

Could this vision have ever been realized? Tooze argues in The Wages of Destruction that Germany had already missed its chance. “In 1870, at the time of German national unification, the population of the United States and Germany was roughly equal and the total output of America, despite its enormous abundance of land and resources, was only one-third larger than that of Germany,” he writes. “Just before the outbreak of World War I the American economy had expanded to roughly twice the size of that of Imperial Germany. By 1943, before the aerial bombardment had hit top gear, total American output was almost four times that of the Third Reich.”

Germany was a weaker and poorer country in 1939 than it had been in 1914. Compared with Britain, let alone the United States, it lacked the basic elements of modernity: There were just 486,000 automobiles in Germany in 1932, and one-quarter of all Germans still worked as farmers as of 1925. Yet this backward land, with an income per capita comparable to contemporary “South Africa, Iran and Tunisia,” wagered on a second world war even more audacious than the first.

The reckless desperation of Hitler’s war provides context for the horrific crimes of his regime. Hitler’s empire could not feed itself, so his invasion plan for the Soviet Union contemplated the death by starvation of 20 to 30 million Soviet urban dwellers after the invaders stole all foodstuffs for their own use. Germany lacked workers, so it plundered the labor of its conquered peoples. By 1944, foreigners constituted 20 percent of the German workforce and 33 percent of armaments workers (less than 9 percent of the population of today’s liberal and multicultural Germany is foreign-born). On paper, the Nazi empire of 1942 represented a substantial economic bloc. But pillage and slavery are not workable bases for an industrial economy. Under German rule, the output of conquered Europe collapsed. The Hitlerian vision of a united German-led Eurasia equaling the Anglo-American bloc proved a crazed and genocidal fantasy.

Tooze’s story ends where our modern era starts: with the advent of a new European order—liberal, democratic, and under American protection. Yet nothing lasts forever. The foundation of this order was America’s rise to unique economic predominance a century ago. That predominance is now coming to an end as China does what the Soviet Union and Imperial Germany never could: rise toward economic parity with the United States. That parity has not, in fact, yet arrived, and the most realistic measures suggest that the moment of parity won’t arrive until the later 2020s. Perhaps some unforeseen disruption in the Chinese economy—or some unexpected acceleration of American prosperity—will postpone the moment even further. But it is coming, and when it does, the fundamental basis of world-power politics over the past 100 years will have been removed. Just how big and dangerous a change that will be is the deepest theme of Adam Tooze's profound and brilliant grand narrative.


9 of the last 10 US recessions began with a GOP President. Why would anyone trust a Republican with the economy again?

House Minority Leader Kevin McCarthy is not a fan of history, apparently. At least not when history belies the myths Republicans stick in their constituents’ mouths like lightly drugged binkies whenever they want to pretend our country’s economy wasn’t saved by a black man who actually understands trivial matters like economics, foreign policy, diplomacy, and not dribbling a painter’s palette worth of McNugget sauces down the front of one’s torso.

Of course, Republicans are particularly bad at grasping recent economic history—so much so that they’ve almost completely forgotten who was president between Bill Clinton and Barack Obama. The name is on the tip of their tongues, but when the Republican National Convention comes along every four years, they somehow can’t seem to get it out.

That’s why they need to be reminded of their incompetence at every turn, and tweets like this are a great start:

Honestly, this pervasive, lingering myth that Republicans are great for the economy and Democrats are poison never ceases to baffle me.

The opposite is true. On every important measure — GDP growth, job creation, deficit spending, business investment growth — Democrats beat Republicans’ brains out, and they have for decades.

This only makes sense, of course. Democrats want to invest broadly in our economy, whereas Republicans love to balloon the deficit and hand fistfuls of cash to obscenely wealthy plutocrats just because.

So what would a “socialist Democrat” do? Probably invest in infrastructure and a forward-looking green economy; unshackle financially strapped workers who are burdened with crushing student loan debt; free would-be entrepreneurs who are scared to leave their jobs because they can’t lose their insurance; put more money in the hands of poor and middle class workers, who would be more apt to spend it; and protect our air, water, and natural resources, thus ensuring a sustainable future economy.

What won’t they do? Take credit for the previous president’s economy, like McCarthy just did. Why? There’s still a very good chance we’ll be in a recession by January 2021, because you could replace Donald Trump’s brain with a 9-volt battery plugged into a pimento loaf and very few people would be able to tell the difference.

Also, on economic policy, Trump has pretty faithfully followed the script of our last Republican president—whoever that might be—and that dude left the economy in a smoldering Yucatan crater before skipping off to paint cat pictures or some shit.

Which begs the question: Given all of this thoroughly documented historical data, why would anyone trust a Republican with the economy ever again?


Recessions and Unemployment Rate Trends

Figure 2 depicts a series of computations that result in a view of the alignment between recessions and the unemployment rate. This view of the alignment (panel D) highlights the intuition that frequent recessions, separated by short expansions, are associated with upward drift in the unemployment rate, while infrequent recessions, separated by long expansions, are associated with downward drift.

Note: Gray bars indicate recession periods.
Sources: US Bureau of Labor Statistics, Unemployment Rate [UNRATE], retrieved from FRED, Federal Reserve Bank of St. Louis (https://fred.stlouisfed.org/series/UNRATE); NBER-based Recession Indicators for the United States from the Period following the Peak through the Trough [USREC], retrieved from FRED, Federal Reserve Bank of St. Louis (https://fred.stlouisfed.org/series/USREC); and author's calculations.

Panel A of figure 2 shows the cumulative sum of the National Bureau of Economic Research's (NBER's) recession months from January 1948 to October 2020. I define a recession as starting in the month following the NBER peak and ending in the month of an NBER trough. For the current period, the NBER announced a business cycle peak in February 2020 but has not announced a subsequent trough. In figure 2, I treat March and April 2020 as recession months. 10

In panel B of figure 2, I fit a linear time trend to the cumulative sum of the NBER recession months with ordinary least squares. This time trend gives an estimate of how quickly recessions accumulate on average. Then in panel C, I remove the linear time trend from the cumulative sum and show a detrended cumulative sum of NBER recession months. This detrended cumulative sum shows when recessions have accumulated more quickly and less quickly than average.

The detrended cumulative sum in panel C rises at a constant rate (one minus the monthly slope of the fitted trend) in every recessionary month and falls at a constant but slower rate (the slope itself) in every expansionary month. This structure implies that the variable may not fall back to its previous low point if a recession cuts an expansion short. As a result, frequent recessions, separated by short expansions, can cause the detrended cumulative sum to drift up over time. This upward drift occurs with the four recessions that begin in 1948, 1953, 1957, and 1960 and again with the four recessions that begin in 1970, 1973, 1980, and 1981. In other words, both 1948 to 1960 and 1970 to 1982 are 13-year periods in which recessions accumulated more quickly than average. In contrast, recessions accumulated less quickly than average during the long expansions that have occurred mostly since 1983, as well as in the 1960s. During these periods, the detrended cumulative sum falls below its low point from previous expansions, creating downward drifts in the series.

The periods of rapid recession accumulation, 1948 to 1960 and 1970 to 1982, are also periods when the unemployment rate trend rises in figure 1. In contrast, periods when recessions accumulate less quickly than average, the 1960s, 1983 to 2000, and the 2010s, are all periods when the unemployment rate trend falls in figure 1. To make this comparison between the accumulation of recessionary months and the unemployment rate more explicit, panel D shows the detrended cumulative sum of NBER recession months (left axis) along with the unemployment rate (right axis). The two series move closely together and have a correlation of about 0.7, even including the unusually large spike in the unemployment rate in April 2020.
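The computation behind panels A through D is simple enough to reproduce from the FRED series cited in the sources note. Below is a minimal sketch in Python, assuming the numpy, pandas, and pandas_datareader packages are available; it mirrors the steps described here rather than reproducing the author's own code, and current vintages of USREC mark March and April 2020 as recession months, consistent with the treatment above.

```python
# A minimal sketch (not the author's original code) of the figure 2 steps,
# assuming the numpy, pandas, and pandas_datareader packages are installed.
# The series IDs UNRATE and USREC are the FRED series cited in the sources note.
import numpy as np
import pandas as pd
from pandas_datareader import data as pdr

start, end = "1948-01-01", "2020-10-01"
usrec = pdr.DataReader("USREC", "fred", start, end)["USREC"]     # 1 in recession months, else 0
unrate = pdr.DataReader("UNRATE", "fred", start, end)["UNRATE"]  # unemployment rate, percent

# Panel A: cumulative sum of NBER recession months.
cum_rec = usrec.cumsum()

# Panel B: linear time trend fitted by ordinary least squares.
t = np.arange(len(cum_rec))
slope, intercept = np.polyfit(t, cum_rec.to_numpy(), 1)

# Panel C: detrended cumulative sum of NBER recession months.
detrended = pd.Series(cum_rec.to_numpy() - (intercept + slope * t), index=cum_rec.index)

# Panel D: compare the detrended sum with the unemployment rate.
print("Correlation:", round(detrended.corr(unrate), 2))  # roughly 0.7, per the text
```

Plotting detrended and unrate on twin axes reproduces the panel D alignment described in the text.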

A positive correlation between the frequency of recessionary months and the unemployment rate is not surprising. The NBER's Business Cycle Dating Committee uses labor market variables when assigning business cycle peaks and troughs. 11 However, what is surprising about panel D is how closely the unemployment rate follows the detrended cumulative sum of recessionary months for such a long time, from 1948 to 2020. 12 This is surprising because the US labor market has been driven by a variety of economic shocks along with changing government policies, labor market regulations, and demographics; yet the unemployment rate closely tracks the stable and linear structure of the detrended cumulative sum of recessionary months. As with the detrended cumulative sum of recessionary months, the unemployment rate rises quickly in recessions but falls slowly in expansions, and these features cause the unemployment rate to trend up with frequent recessions and trend down with infrequent recessions.


The Modern Cycle Of Economic Boom And Bust

The charge about the old days of the American economy—the nineteenth century, the “Gilded Age,” the era of the “robber barons”—was that it was always beset by a cycle of boom and bust. Whatever nice runs of expansion and opportunity that did come, they always seemed to be coupled with a pretty cataclysmic depression right around the corner. Boom and bust, boom and bust—this was the necessary pattern of the American economy in its primitive state.

Look at that Great Moderation: US Annualized Real GDP Growth between 1950 and 2010, showing the period of the Great Moderation. (Photo credit: Wikipedia)

In the modern era, all this was smoothed out. There were busts, above all the Great Depression, but these represented the last gasp of the old order. Since the rise of the governmental sector as a major component of the economy at the time of President Franklin D. Roosevelt, federal institutions, ranging from the income tax to the spending authority to the Federal Reserve, could ensure, through “counter-cyclical” policy, that the natural tendency of capitalism toward boom and bust could be smoothed out.

Such was the prevailing view for so long among the paladins of economics and economic policy. If there was one figure who made his living reiterating this theme, it was Yale economist Arthur Okun, who was Chairman of the Council of Economic Advisers under President Lyndon B. Johnson. “Okun’s law” (a relationship between growth and unemployment) remains a mainstay in the research shops of the Fed and the White House, and other establishmentarian places, to this day.
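For readers who want the rule of thumb itself, one commonly cited textbook form of Okun's law is sketched below; the -0.5 coefficient and the 2-3% potential-growth figure are rough illustrative values that vary by sample and author, not numbers taken from this article.

```latex
% A rule-of-thumb form of Okun's law; the coefficient and the potential-growth
% figure are illustrative approximations, not estimates from this article.
\Delta u \;\approx\; -0.5\,\bigl(g_{\mathrm{GDP}} - g_{\mathrm{potential}}\bigr),
\qquad g_{\mathrm{potential}} \approx 2\text{--}3\% \text{ per year}
```

Here Δu is the change in the unemployment rate in percentage points and g_GDP is real GDP growth, so growth running about one point above trend is associated with roughly a half-point decline in unemployment.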

Yet with the Great Recession by no means put out to pasture five years in—GDP growth for this year was just revised down last week to an anemic 1.8%—we have to start wondering if it is rather the modern era that is afflicted with an unholy cycle of boom and bust.

The initial thinking in this direction came from Stanford economist John B. Taylor, who for some years now has been pointing out that since World War II, one period was notably better and more stable, from a macroeconomic perspective, than the others. This was the generation preceding the Great Recession, 1982-2007, a span which, following Prof. Taylor’s work, is often referred to as the “Great Moderation.”

What put the great in the great moderation was not necessarily the rate of growth. Growth was just fine in this era—3.3% per year in real terms, exactly the rate of growth that prevailed in the long post-World War II run that is so famous, 1945-73. What was great in the quarter century after 1982 was the sparseness of the recessions. When they came, you barely noticed them. There were only two, 1990-91 and 2000-01, the latter not even counting as a recession by the old rule of needing two consecutive quarters of negative growth.

After 1945, recessions were not only more frequent but also steeper. There were recessions in 1945, 1949, 1953, 1957, and 1960 (1957 having a deeper two-quarter decline than seen in our Great Recession), before finally a nine-year run came sans recessions.

But then stagflation came, with double-dip recessions in 1969-70, 1973-75, and 1980-82. Against conventional metrics, it remains questionable to say that the Great Recession was worse than what hit in 1973-75 and 1980-82. The president seems bent on this interpretation, given his penchant for saying ours is the worst economic crisis since the Great Depression. The only thing we can be sure of is that the recovery since 2009 has been worse than the recoveries from every one of the various recessions listed above.

The ready lesson to be taken from the Great Moderation is that the single greatest era of good, stable growth in the modern era occurred in an environment where the governmental sector was, if not in retreat, then in a posture of self-criticism and restraint. The Federal Reserve followed rule-based behavior that probably included making the price of gold a primary determinant of monetary policy. Tax rates went down in major fashion in 1981 and 1986, with the increases of the 1990s clawing back only a small bit. Spending (as a share of GDP) peaked in 1983 and fell remorselessly through the Bill Clinton presidency.

In its latter stages (those of the George W. Bush presidency), the Great Moderation saw some further tax cuts, but a complete reversal on spending. The level of federal spending in 2000 (18% of GDP) turned out to be a trough, rising first by two more percentage points under W. and then by a further 25% under Obama. This is not to mention a Fed that has cottoned to experimentation like never before.

Whatever you want to say about the nineteenth century, you have to look far and wide for Great Depressions, serial recessions, stagflations, and Great Recessions. The Great Moderation must have been too close for us to realize its unique excellence. For the reality just might be that in an epoch of history when the government thinks it can run the economy, extended underperformance will be the rule, and the continued march of prosperity the exception.


After the fastest recession in U.S. history, the economic recovery may be fizzling

United Airlines announced plans to lay off more than one-third of its 95,000 workers. Brooks Brothers, which first opened for business in 1818, filed for bankruptcy. And Bed Bath and Beyond said it will close 200 stores.

If there were still hopes of a “V-shaped” comeback from the novel coronavirus shutdown, this past week should have put an end to them. The pandemic shock, which economists once assumed would be only a temporary business interruption, appears instead to be settling into a traditional, self-perpetuating recession.

When states and cities began closing most businesses in March, the idea was to smother the virus and buy time for the medical system to adapt. Jared Kushner, the president’s son-in-law and a senior White House adviser, spoke of hopes “that by July the country’s really rocking again.”

But without a uniform federal strategy, many governors rushed to reopen their economies before bringing the virus under control. Now states such as Florida, California, Texas and Arizona are setting daily records for coronavirus cases and more than 70 percent of the country has either paused or reversed reopening plans, according to Goldman Sachs.

After two surprisingly strong months, the economy could begin shedding jobs again this month and in August, Morgan Stanley warned Friday. Many small businesses that received forgivable government loans have exhausted their funds while some larger companies are starting to thin their payrolls in preparation for a longer-than-expected downturn.

Fresh labor market weakness would represent a profound disappointment for millions of American workers and President Trump, who is eager to highlight economic progress with only a few months remaining before the November election.

“ ‘Stalling’ is the word I’m using,” said Jim O’Sullivan, chief U.S. macro strategist for TD Securities. “But the risk there is that the numbers start turning negative again.”

Several regional Federal Reserve officials last week expressed concerns about the recovery petering out. Raphael Bostic, president of the Federal Reserve Bank of Atlanta, warned that economic activity “is starting to level off.” Thomas Barkin, who heads the Richmond Fed, cited “air pockets” in new business orders.

At the White House on Friday, however, the president insisted that his plans were on track.

“I created the greatest economy we’ve ever had. And now we’re creating it again,” he said before leaving for Florida.

A day earlier, he told a group of Hispanic leaders he had launched “the fastest economic comeback in history.”

The economy did regain a total of 7.5 million jobs in May and June, faster than many economists anticipated. But that was just one-third of the number lost to the pandemic.

In a worrisome sign, more than two months after states like Georgia lifted their shelter-in-place orders, layoffs are spreading beyond companies that provide services requiring direct human contact. As disruption from the pandemic lingers, this could mean that the job loss is starting to feed on itself in a classic recessionary spiral, economists said.

Harley-Davidson last week said it was eliminating 700 jobs as part of a restructuring plan it described as unrelated to fallout from the pandemic. In April, the company said: “The crisis has provided an opportunity to reevaluate every aspect of our business and strategic plan. We have determined that we need to make significant changes to the company.”

Indeed, the pandemic, which marked an abrupt end to a nearly 11-year expansion, is prompting companies to rethink their operations and to trim fat that accumulated while the economy was growing. Wells Fargo, the nation’s fourth-largest bank, is drawing up plans to cut “tens of thousands” of jobs later this year, according to a Bloomberg News report.

“While the timing seems to coincide with the stalling of the economic reopening process in over 30 states, there may well be something more strategic in play — that is, pressure on a growing segment of corporate America to ‘right size’ for what increasingly looks like a longer road back to full economic recovery,” said Mohamed El-Erian, chief economic adviser at Allianz, via email.

The unemployment rate never reached the Depression-caliber level of 20 percent that many economists had feared in March, topping out so far at 14.7 percent. But layoffs that initially were described as a temporary response to a health crisis are hardening into something more permanent, leaving millions of workers scrambling to regain their footing in a changed economy.


Recession of 1957

In August 1957, the country slipped into a recession that would push unemployment to about 7 percent and reduce corporate profits by 25 percent by April 1958. One of the reasons the President had promoted the Interstate System was just such a situation: he would have a public works program that could be expanded or contracted to help steer the economy.

The recession of 1958 helped Democrats win a sweeping victory in the congressional elections, increasing their number in the Senate from 49 to 65. Lyndon Johnson quickly discovered that a large majority would be harder to keep unified than a narrow one. Younger liberal senators were challenging his leadership.

During the 1950s, the ghost of the '29 crash remained as both an economic and political presence that tested the limits of fiscal orthodoxy. In that period, unemployment grew more rapidly than total employment. In 1947, with employment at 60,168,000, the proportion of unemployed was 3.9% of the work force. While employment increased to 64,520,000 by 1960, unemployment had risen to 5.7%. The experience of the recessions of 1953-1954 and 1957-1958 pointed to serious defects in the American economy. In both cases, despite general recovery as measured by increases in the Gross National Product, personal income, factory production, and manufacturing orders, unemployment failed to decline to pre-recession levels. A further indication of the problem was the increase in the duration of unemployment. In 1947, 7% of the unemployed remained out of work 27 weeks or more. By 1960, 11% were unemployed 27 weeks or more.

The phenomenon of prices continuing to rise during a recession represented a sharp break with the past. But it set the pattern for the future. In the three recessions since the 1957-58 downturn, prices continued to rise (though at a diminished rate). Thus, today there is no longer any expectation that a recession will bring inflation to a complete halt. But in 1959 the failure of the recession to end inflation was regarded as a novel and ominous development. In the first 5 months of 1958 a great deal of attention and debate, both within the administration of President Eisenhower and in the Congress, was devoted to the possibility of an anti-recessionary tax cut. Although the unemployment rate remained high, the recession ended officially in April 1958, and a measure proposed by Senator Paul H. Douglas in June to reduce individual income taxes was defeated 65-23.

Tight monetary policy contributed to the weakness of the 1959 recovery. Despite open-market purchases in 1959, the increase in the System's holdings of securities was not enough to offset the gold outflow. In addition, the discount rate was raised and interest rates followed. A tight policy continued into 1960.

The trough of the recession had been reached in April 1958. Although System holdings of securities rose throughout the year, short- and medium-term interest rates rose very sharply between July and September, and in the latter month the discount rate was raised. The fact that open-market purchases continued during 1959, as evidenced by the 2.7 percent increase in System holdings of securities, might lead one to question whether a tight money policy was being followed. But declines in the Treasury's gold stock were continuing due to the balance-of-payments deficit.

The recession in 1957-1958 revived interest in area redevelopment legislation for both political and economic purposes. At that time, Douglas gained important new support from Republican Senator Payne of Maine. The Douglas-Payne bill differed little from Douglas' original measure, and in 1958 it passed both the House and Senate, with impressive bipartisan support.

Eisenhower was not convinced of the desirability of this legislation, and vetoed the bill. He objected to those features that served to limit local responsibility and to increase unwarranted government expenditures. He specifically opposed the 100% grant for public facilities, the loosely drawn criteria for eligibility, the inclusion of rural districts, the inclusion of long-term loans, the high loan limit, and the low interest rates. 32 Eisenhower and his economic advisors were not unsympathetic to the hardships of depressed areas and the country's need for economic growth. But they believed that breaking the rules of community responsibility and fiscal conservatism was too great a price to pay. Moreover, the dominant thinking in the Administration emphasized aggregate rather than structural considerations.

Despite the presidential veto, legislators continued to introduce bills dealing with depressed areas. After the 1958 election, partisan lines had solidified to the point where a compromise bill introduced by Senator Hugh Scott (R-PA) and another Administration bill failed to make any headway. Without the support of Payne, whom Edmund Muskie had unseated in Maine, Douglas reintroduced his bill with minor changes. Despite political wrangling, the bill passed both Houses, and in 1960 reached the President's desk. In spite of exhortations from Cabinet members, including Secretary Mitchell and Vice President Richard Nixon, Eisenhower again vetoed the bill. The climax of the saga of area redevelopment legislation awaited the outcome of the 1960 presidential election.

Although it was later determined that the peak of the business cycle had been reached in August 1957, and that in hindsight action to fight a recession was called for, the Federal Reserve raised the discount rate by one-half percentage point in that month, to 3.5 percent. Four months later the error was apparent and the discount rate was lowered to 3 percent. By April 1958 four more reductions had brought the rate down to 1.75 percent. In the same period, required reserve ratios at member banks in New York and Chicago were reduced in four steps, from 0.20 to 0.18.

A deliberately tight fiscal policy slowed the growth of expenditures in 1959 and 1960, contributing to the weak recovery from the 1957-58 recession. Government purchases of goods and services actually declined because the Eisenhower Administration and many in Congress wanted a large actual budget surplus.

Their desire, however, was not prompted by any simplistic notion that deficits are always bad and surpluses always good (and the larger the better). Rather, four complex factors came into play. First, inflation was feared, and this fear intensified because, for the first time, prices had continued to rise during the just-ended recession. Second, economic growth as a national objective was receiving much attention, and many felt that a Government budget surplus would facilitate that growth by making resources available for capital formation.

Third, the deficits in the nation's international balance of payments, which had been occurring almost continuously since World War II, became for the first time a matter of rather widespread concern, and many believed that a surplus in the Federal budget would, in ways that were never specified, alleviate the problem. Fourth, many held that fiscal parameters should be set so as to produce deficits in recession and large surpluses as recovery continued.

Because the recovery from the 1957-58 downturn was very sluggish (a new recession began in April 1960), actual revenues did not grow sufficiently to produce the hoped-for large surpluses. Even so, the full-employment surplus increased between 1958 and 1960; had the unemployment rate been around 4 percent, the 1960 budget would have shown a surplus of $12.8 billion.

