During the 2000s, globalization took millions of jobs from the United States. Some have been quick to associate this job loss with the technology that ostensibly made it possible, chiefly the adoption of ICT that allowed for global connectivity. So, would the United States have been better off if it had simply never invested in ICT in the first place?
There are those who would love to somehow put the technology introduced by the ICT revolution back in the box. But a new study shows that doing so would have a detrimental impact on the economy. Yes, in some cases ICT investment introduced the tools that allowed companies to outsource jobs. But, as a new paper, Does ICT Investment Spur or Hamper Offshoring?, finds, the same ICT investment enabled productivity gains that kept companies at home.
Of course, it is empirically difficult to determine whether ICT investments increase the likelihood of offshoring, because correlation alone cannot establish causality. To address this problem, authors Luigi Benfratello, Tiziano Razzolini, and Alessandro Sembenelli examined small and medium-sized Italian manufacturing firms with varying access to local broadband facilities, an essentially random source of variation that was used as an instrument to identify the effect of ICT investment on offshoring decisions.
In recent years, what was once seen as crackpot economics has come close to becoming conventional wisdom: the notion that productivity growth costs jobs. Economists call this belief the lump of labor fallacy: the mistaken idea that there is only a fixed amount of work to go around. As ITIF has written here, here, and here, it’s clear that today’s jobs problem has nothing to do with productivity and that we should not worry about productivity reducing the number of jobs.
But that has not stopped many talking heads and experts from opining that yes, indeed, productivity kills jobs. One graph that has received, and continues to receive, widespread attention comes from Andrew McAfee and Erik Brynjolfsson’s book, The Second Machine Age, which shows that “productivity and employment have become decoupled.” [i]
But as any first-year statistics course will teach you, correlation does not prove causation. In fact, it is easy to get spurious correlations. Here’s one: The divorce rate in Maine is almost perfectly correlated with the per capita consumption of margarine.
In Brynjolfsson’s case, the relationship being examined merely shows two variables that both happen to be increasing from 1970 to 2000; there is no plausible underlying argument about how one causes the other.
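To see how easily that kind of spurious result arises, consider a minimal sketch in Python (the numbers below are invented purely for illustration; they are not the actual Maine or margarine data, nor the series from the book). Any two series that each drift upward over time for entirely unrelated reasons will register a correlation close to 1:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2001)

# Two made-up series that each trend upward for completely unrelated reasons.
series_a = 50 + 1.5 * (years - 1970) + rng.normal(0, 2, len(years))  # pretend "productivity index"
series_b = 20 + 0.8 * (years - 1970) + rng.normal(0, 1, len(years))  # pretend "employment index"

r = np.corrcoef(series_a, series_b)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # close to 1.0, despite no causal link
```

A chart built from those two lines would look every bit as compelling as the decoupling graph, which is exactly why a correlation between two trending variables, on its own, tells us nothing about causation.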
Politicians talk frequently about job creation. But what actually creates jobs is a subject of intense debate. Do we need more public spending? Less? Fewer regulations? Smarter regulations? The answer usually depends on the audience and ignores the deeper questions: What kinds of jobs are we creating? Do other jobs get destroyed? Would a high-skill immigrant take a job from an American or create a new one for himself or herself?
A recent report, Technology Works: High-Tech Employment and Wages in the United States, from the Bay Area Council Economic Institute, a trade organization from a region that knows a thing or two about facilitating economic growth, sheds light on these questions by highlighting a tried and true method for creating jobs: attracting and employing technology workers. When a city, community, or region employs a technology worker, it engenders a multiplier effect on employment in the local economy. In fact, the Bay Area Council’s study finds that each job in the high-tech sector—defined as the occupations most closely related to science, technology, engineering, and math (STEM) fields—leads directly to 4.3 jobs in goods and services industries across the local economy.
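To make the arithmetic behind that multiplier concrete, here is a small Python sketch. Only the 4.3 figure comes from the study; the 1,000-job expansion is a hypothetical example invented for illustration:

```python
# The study's estimate: local goods and services jobs supported per high-tech job.
TECH_MULTIPLIER = 4.3

def total_local_jobs(new_tech_jobs: int, multiplier: float = TECH_MULTIPLIER) -> float:
    """Direct high-tech jobs plus the local jobs they are estimated to support."""
    return new_tech_jobs * (1 + multiplier)

# Hypothetical example: an expansion adding 1,000 high-tech jobs.
print(total_local_jobs(1000))  # 5300.0 jobs in total (1,000 direct + 4,300 supported)
```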
I expected that there would be a lot of critical responses to my argument that Emmanuel Saez had “cooked the numbers” in his study of income inequality to show that virtually all of the income growth during the “recovery” after the Great Recession went to the wealthiest 1 percent. I had a strong feeling that most people would miss my narrowly framed argument and think that I was belittling the negative effects of inequality on our population. Despite attempts to inoculate myself against this criticism by showing the relatively low share of income held by the top 1 percent in 1979, various commentators have criticized me on several grounds: not discussing wealth inequality; not seeing the long rise in inequality; picking selective years to make my points; helping the right wing; overemphasizing the effect of transfers because of the rise of Social Security and Medicare; failing to appreciate the difficulties of middle-class people; and exaggerating the effects on the rich.
I have an odd intellectual history in that I was one of the first researchers to report on rising inequality in the late 1970s and 1980s, yet have for the
Senator and likely presidential hopeful Marco Rubio (R-FL) appeared on last Tuesday’s The Daily Show with Jon Stewart, promoting his new book and weathering an endless stream of jokes about his home state of Florida. While the discussion covered a range of policy ground, we wanted to highlight one comment by Senator Rubio that showed an all too common misunderstanding of innovation and automation.
Rubio said, “The concern I have about the minimum wage increase is that we have been told by the CBO and independent analysts that it will cost certain jobs. And that happens when some businesses will decide that well, you’ve now made our employees more expensive than machines so we’re going to automate. So in 5-10 years it’s going to happen anyway but this will accelerate this process, when you go to a fast food restaurant it will not be a person taking your order, there will be a touchscreen there that you will order from and when you get your order it will be right. [uneasy laughter] But the point is, if you make that person now more expensive than that new technology, they’re going
Ask any economist why some countries are poor and some countries are rich, and they will probably answer, “productivity”. Essentially, this means that people in rich countries are rich because they are able to create more wealth with less effort. But how do they do this? One of the primary ways is through better technology.
Unfortunately, instead of being recognized for its contribution to wealth, better technology is all too often demonized as a threat to employment, particularly in low-income countries without social safety nets. Intuitively, people care more about the jobs and income streams that already exist than about the potential future gains from automation – a bird in the hand, as they say. But a new paper by Mehmet Ugur and Arup Mitra of the University of Greenwich shows that even in very poor countries, technology is far less threatening than it may appear.
We have argued here before that robots are not taking our jobs: in the long run, at the macro level, productivity increases have no relationship with either the total number of people employed or the level of unemployment. This is because when automation or other productivity-enhancing technology reduces the amount of labor needed to produce something, the savings flow back into the economy as lower prices, higher wages, or higher profits, and that money is spent elsewhere, creating demand for workers in other firms and industries.
U.S. productivity growth is stagnating, and if the trend continues it could have a drastic impact on the U.S. economy. Without increasing productivity, the only way for a country to get richer is by working more or borrowing more. Furthermore, productivity is a crucial part of international competitiveness, because it is only by increasing our productivity that we can compete with other countries on cost.
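A simple accounting identity makes the point concrete. In the rough sketch below, Y is output, H is total hours worked, and POP is population; this is a textbook decomposition, not a formula from any particular BLS release:

```latex
\frac{Y}{POP}
  = \underbrace{\frac{Y}{H}}_{\text{labor productivity}}
    \times
    \underbrace{\frac{H}{POP}}_{\text{hours worked per person}}
\qquad\Longrightarrow\qquad
\%\Delta\left(\frac{Y}{POP}\right)
  \approx \%\Delta\left(\frac{Y}{H}\right) + \%\Delta\left(\frac{H}{POP}\right)
```

If output per hour stops growing, income per person can rise only if hours worked per person rise, which is just another way of saying we must work more.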
A recent BLS news release does a good job of showing the worrying trends. Productivity growth has been abnormally low since approximately 2006, plummeting through the Great Recession, recovering slightly immediately afterward, and slowing considerably since 2010.
The first graph below (Chart 1) provides historical context back to 2000. There is a clear decline in labor productivity growth (the dark blue line) and also in multifactor productivity growth (light blue). These are the two most common ways of measuring productivity: labor productivity estimates how much output each hour of work produces, and multifactor productivity tells us how much each worker and unit of capital can together produce.
Looking back a bit further in time, the next graph (Chart 2) estimates the amount that different factors contributed to total productivity growth.
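For readers who want the definitions behind the two charts, the standard growth-accounting relationships look roughly like the following. This is a simplified textbook sketch using a Cobb-Douglas form, with Y as output, H as hours worked, K as capital, and α as capital’s share of income; the BLS’s actual methodology is more detailed:

```latex
LP = \frac{Y}{H}, \qquad
MFP = \frac{Y}{K^{\alpha} H^{1-\alpha}}, \qquad
\Delta \ln LP
  = \underbrace{\alpha \, \Delta \ln\left(\frac{K}{H}\right)}_{\text{capital deepening}}
  + \underbrace{\Delta \ln MFP}_{\text{technology and efficiency}}
```

Read this way, a decomposition like the one in Chart 2 splits productivity growth into the part explained by giving workers more and better capital to work with and the residual, multifactor productivity, which captures technology and efficiency gains.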
In 2004, the Department of Veterans Affairs was forced to scrap a multimillion-dollar computer system that was designed to streamline the agency’s operations and reduce its costs. Ironically, the project cost taxpayers $265 million, and it is one of many examples of federal IT projects that go massively over budget and underdeliver. Part of the reason for these failures is that the last time we made significant changes to how our government acquires its own IT was the Clinger-Cohen Act of 1996. This law was enacted the year before Google.com was registered as a domain name, back when Windows 95 was the new big thing. Almost two decades later, while innovation has continued to press forward, our government’s ability to efficiently acquire new IT has lagged miserably behind.
Luckily, a few lawmakers are trying to remedy that. In March 2013, Congressmen Darrell Issa (R-CA) and Gerry Connolly (D-VA) introduced H.R. 1232, the Federal Information Technology Acquisition Reform Act (FITARA), to overhaul the federal government’s approach to acquiring IT. The bill seeks to designate clear responsibility and authority over federal IT investment, enhance the government’s ability to get good IT, strengthen the federal IT
Techno-utopianism seems to be a particularly American phenomenon. As I argued in The Past and Future of America’s Economy, it seems that about every half century – usually, as it turns out, right before a big structural slowdown in technological innovation – pundits and scholars start to go overboard about how great the techno-enabled future will be. A case in point was the 1967 book The Year 2000, written by Herman Kahn, the noted futurist and founder of the Hudson Institute. Kahn relied on the new “science” of forecasting and ended up with a book that had the tone of “you ain’t seen nothing yet.” He wrote:
This seems to be one of those quite common situations in which early in the innovation period many exaggerated claims are made, then there is disillusionment and swing to over conservative prediction and a general pessimism and skepticism, and then when a reasonable degree of development has been obtained and a learning period navigated, many – if not all – of the early ‘ridiculous’ exaggerations are greatly exceeded. It is particularly clear that if computers improve by five, ten or more orders of magnitude over the
In gloomy economic times such as these, we naturally look around for sources of blame. Former saviors make easy targets.
The tech boom of the late 1990s was great for the U.S. economy: GDP rose, unemployment dropped, and median incomes even made their most significant gains since the 1970s. Most people understood this success was due to new technology–and to information technology in particular–and they expected IT to be a main driver of the economy for years to come. Our bold New Economy had arrived, with all the convenience and style of America Online.
But the 2001 recession shook our faith in technology, and in the aftermath of the 2008 financial crisis many have turned on our would-be robotic saviors. Their disillusionment takes on the forms of disappointment, fear, or both.
The disappointed see our IT revolution, chock full of smartphones and big data, and ask: what good has it done in the real world? Recent technologies have changed our lives, certainly, but not with the productive power of previous advances. Instead, we order takeout via the internet instead of the phone; we watch YouTube at work in