Why the 2000s Were a Lost Decade for American Manufacturing

Originally posted on IndustryWeek.

In my inaugural article, I tried to make the case for why Washington should care about manufacturing. (Reason: it’s the key traded sector for the U.S. economy.)

But once one accepts the importance of manufacturing, the next question is: how is the manufacturing sector doing? Is U.S. manufacturing healthy, with no need for a national manufacturing policy, or is it in trouble and in need of smarter policies?

One key indicator to answer this question is change in the number of manufacturing jobs.

America lost 5.7 million, or 33%, of its manufacturing jobs in the 2000s. This is a rate of loss unprecedented in U.S. history—worse than in the 1980s, when BusinessWeek warned of deindustrialization, and worse than the rate of manufacturing job loss experienced during the Great Depression.

While U.S. manufacturing has clawed back, regaining about half a million of those lost manufacturing jobs since 2010, there’s little doubt that the 2000s constituted the worst decade for manufacturing employment in the Republic’s history.

Moreover, the recovery of manufacturing jobs has actually been weaker than in most prior recoveries.

In response to these statistics, official Washington has a ready-made excuse: these job losses are a sign of strength, not weakness, as they simply reflect superior manufacturing productivity performance.

For example, William Strauss, a senior economist at the Federal Reserve Bank of Chicago, holds that rapid productivity growth has been the core driver of manufacturing job losses and should in fact be seen as a key metric of manufacturing success.

This perspective is shared by both the right and the left. Robert Reich, President Clinton’s Labor Secretary, explains that, “The majority of manufacturing job losses is due to productivity increases,” while the National Review’s Kevin Williamson argues that “the fall in [manufacturing] employment in America and elsewhere should be seen as a good thing.” This has now become the accepted wisdom.

However, a closer analysis of the data reveals that this “wisdom” is fundamentally flawed.

If the unprecedented loss of manufacturing jobs was due to superior productivity, why did manufacturing productivity grow at analogous rates between 1990 and 1999 and between 2000 and 2009—56% and 61%, respectively—while manufacturing employment declined just 3% in the former decade, but 33% in the latter?
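The arithmetic behind this comparison can be made explicit with the simple identity employment = output / productivity (a stylized sketch; official indices differ in details such as hours versus headcount). Given the productivity growth and employment change figures above, it shows how much slower output must have grown in the 2000s for the productivity explanation to hold:

```python
def implied_output_growth(employment_change_pct, productivity_growth_pct):
    """Implied real-output growth (%) from the identity output = employment * productivity.

    A stylized calculation, not the BLS/BEA methodology.
    """
    emp_ratio = 1 + employment_change_pct / 100
    prod_ratio = 1 + productivity_growth_pct / 100
    return (emp_ratio * prod_ratio - 1) * 100

# 1990s: productivity +56%, employment -3%
out_1990s = implied_output_growth(-3, 56)   # roughly +51% output growth
# 2000s: productivity +61%, employment -33%
out_2000s = implied_output_growth(-33, 61)  # roughly +8% output growth
```

With nearly identical productivity growth in the two decades, the collapse in employment implies that output growth, not productivity, is what changed.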

In addition, as William Nordhaus has noted, increases in the rate of manufacturing productivity were actually associated with increases in the rate of job growth during the period from 1948 to 2003. So the loss of so many manufacturing jobs in the last decade cannot automatically be attributed to superior productivity.

Nor was it a reflection of some natural and inexorable loss of manufacturing jobs, as some assert. To say that manufacturing jobs have gradually shrunk over the last 30 years, and that recent losses therefore simply reflect this trend, misses the point.

There’s a critical difference between gradual decline in manufacturing jobs in the 1980s and 1990s and catastrophic loss in the 2000s.

The real reason the U.S. lost 5.7 million manufacturing jobs in the last decade was the decline in manufacturing output, which in turn was caused by U.S. manufacturing losing out in global competition.

During the 2000s, 13 of the 19 aggregate-level U.S. manufacturing sectors, which employed 55% of manufacturing workers in 2000, experienced absolute declines in real output.

For example, motor vehicle output decreased 45%, textiles 47%, and apparel 40%. In other words, manufacturing establishments were producing less, and so of course they employed fewer workers.

But the real numbers are actually worse than the official government figures. The current assessment tools used to measure manufacturing output are skewed due to the massive overestimation of output of the computer and electronics industry (NAICS 334) in the 2000s.

According to official Bureau of Economic Analysis (BEA) data, real output in NAICS 334 in 2010 was more than 5.17 times its 2000 level, an increase of 417 percent. If these numbers are accurate, then close to 15% of total U.S. GDP growth in the 2000s came from this one sector, despite the fact that sector employment declined 43% and sector shipments decreased 25% over the same period.
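As a quick sanity check on the figures above, note that a level that is N times its base corresponds to an (N − 1) × 100 percent increase, so a 5.17× level is a 417% increase (the helper name here is illustrative, not a BEA term):

```python
def multiple_to_pct_increase(multiple):
    """Convert a growth multiple (e.g. 5.17x) to a percent increase (e.g. 417%)."""
    return (multiple - 1) * 100

naics334_growth = multiple_to_pct_increase(5.17)  # 417.0
```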

What’s going on? The principal reason for this disparity is the massive quality improvements seen in the computing sector during the 2000s. “Moore’s Law” enabled massive increases in computer power, performance and storage, which the current BEA analytical model measures as growth in real output.

As a result, when measured properly, total manufacturing output actually fell by 11% during the 2000s (a period when measured GDP increased by 16%)—likely the only decade in American history (other than perhaps the Great Depression) in which absolute U.S. manufacturing output fell.

Accounting for this mismeasurement, ITIF estimates that over 60% of U.S. manufacturing job losses in the 2000s were due to competitiveness challenges rather than productivity gains.

While this was occurring, and while our leaders could not agree on whether it was a problem, other nations such as China and India were greatly increasing market share in the same industrial sectors, through coordinated national efforts to expand innovation, productivity and exports.

And these nations are only increasing their efforts to further advance their economies at the expense of U.S. manufacturers and workers.

We can expect overall manufacturing output, and the jobs that are based on it, to continue to recede unless we address the real problems we face. Namely, how do we make American firms more globally competitive to increase output, production and real growth? But that will be the topic for another day.
