Politicians talk frequently about job creation. But what actually creates jobs is a subject of intense debate. Do we need more public spending? Less? Fewer regulations? Smarter regulations? The answer usually depends on the audience and ignores the deeper questions. What kind of jobs are we creating? Do other jobs get destroyed? Would high-skill immigrants take jobs from Americans or create new ones for themselves?
A recent report, Technology Works: High-Tech Employment and Wages in the United States, from the Bay Area Council Economic Institute, a trade organization from a region that knows a thing or two about facilitating economic growth, sheds light on these questions by highlighting a tried-and-true method for creating jobs: attracting and employing technology workers. When a city, community, or region employs a technology worker, that worker generates a multiplier effect on employment in the local economy. In fact, the Bay Area Council’s study finds that each job in the high-tech sector—defined as the occupations most closely related to science, technology, engineering, and math (STEM) fields—leads directly to 4.3 jobs in local goods and services industries across all
I expected that there would be a lot of critical responses to my argument that Emmanuel Saez had “cooked the numbers,” in his study of income inequality, to show that virtually all income growth during the “recovery” after the Great Recession went to the wealthiest 1 percent. I had a strong feeling that most people would miss my narrowly framed argument and think that I was belittling the negative effects of inequality on our population. Despite my attempts to inoculate myself against this criticism by showing the relatively low share of income held by the top one percent in 1979, various commentators have criticized me on several grounds: not discussing wealth inequality; not seeing the long rise in inequality; picking selective years to make my points; helping the right wing; overemphasizing the effect of transfers because of the rise of Social Security and Medicare; failing to appreciate the difficulties of middle-class people; and exaggerating the effects on the rich.
I have an odd intellectual history in that I was one of the first researchers to report on rising inequality in the late 1970s and 1980s, yet have for the
Senator and likely presidential hopeful Marco Rubio (R-FL) appeared on last Tuesday’s The Daily Show with Jon Stewart, promoting his new book and weathering an endless stream of jokes about his home state of Florida. While the discussion covered a range of policy ground, we wanted to highlight one comment by Senator Rubio that showed an all too common misunderstanding of innovation and automation.
Rubio said, “The concern I have about the minimum wage increase is that we have been told by the CBO and independent analysts that it will cost certain jobs. And that happens when some businesses will decide that well, you’ve now made our employees more expensive than machines so we’re going to automate. So in 5-10 years it’s going to happen anyway but this will accelerate this process, when you go to a fast food restaurant it will not be a person taking your order, there will be a touchscreen there that you will order from and when you get your order it will be right. [uneasy laughter] But the point is, if you make that person now more expensive than that new technology, they’re going
Ask any economist why some countries are poor and some countries are rich, and they will probably answer, “productivity”. Essentially, this means that people in rich countries are rich because they are able to create more wealth with less effort. But how do they do this? One of the primary ways is through better technology.
Unfortunately, instead of being recognized for its contribution to wealth, better technology is all too often demonized as a threat to employment, particularly in low-income countries without social safety nets. Intuitively, people care more about the jobs and income streams that already exist than about the potential future savings from automating those jobs; a bird in the hand, as they say. But a new paper by Mehmet Ugur and Arup Mitra of the University of Greenwich shows that even in very poor countries, technology is far less threatening than it may appear.
We have argued here before that robots are not taking our jobs: in the long run, at the macro level, productivity increases have no relationship with either the total number of people employed or the level of unemployment. This is because when automation or
U.S. productivity growth is stagnating, and if the trend continues it could have a drastic impact on the U.S. economy. Without increasing productivity, the only way for a country to get richer is by working more or borrowing more. Furthermore, productivity is a crucial part of international competitiveness, because it is only by increasing our productivity that we can compete with other countries on cost.
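The stakes here are easy to underestimate, because small differences in annual productivity growth compound into very large differences in living standards. A minimal sketch (the growth rates and the 30-year horizon below are illustrative assumptions, not figures from the BLS data discussed here) makes the point:

```python
def income_after(years: int, growth_rate: float, base: float = 100.0) -> float:
    """Index of output per worker after `years` of constant annual growth,
    starting from an index value of `base`."""
    return base * (1 + growth_rate) ** years

# Two hypothetical trends: a healthy 2% rate vs. a stagnant 0.5% rate.
for rate in (0.02, 0.005):
    print(f"{rate:.1%} annual growth -> index {income_after(30, rate):.1f} after 30 years")
```

Under these assumed rates, three decades of 2 percent growth raise the index by roughly 80 percent, while 0.5 percent growth raises it by only about 16 percent; that gap is the difference between getting richer through productivity and having to work more or borrow more.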
A recent BLS news release does a good job of showing the worrying trends. Productivity growth has been abnormally low since approximately 2006, plummeting through the Great Recession, recovering slightly immediately afterward, and slowing considerably since 2010.
The first graph below (Chart 1) provides historical context back to 2000. There is a clear decline in both labor productivity (the dark blue line) and multifactor productivity (light blue). These are the two most common ways of understanding output growth: labor productivity estimates how much each worker produces, while multifactor productivity tells us how much workers and capital together produce.
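The distinction between the two measures can be made concrete with a toy calculation. Labor productivity is simply output per hour worked; multifactor productivity is measured as a residual, the output growth left over after accounting for growth in both labor and capital inputs (a simplified Solow-residual sketch; the factor shares and growth rates below are illustrative assumptions, not the BLS figures in the charts):

```python
def labor_productivity(output: float, hours: float) -> float:
    """Labor productivity: output per hour worked."""
    return output / hours

def mfp_growth(d_output: float, d_labor: float, d_capital: float,
               labor_share: float = 0.7) -> float:
    """Multifactor productivity growth as a residual: output growth minus
    the share-weighted growth of labor and capital inputs.
    All growth rates are decimal fractions (0.03 = 3%)."""
    return d_output - labor_share * d_labor - (1 - labor_share) * d_capital

# Illustrative year: output grows 3%, hours worked 1%, capital services 2%.
# The residual is the part of output growth neither input explains.
residual = mfp_growth(0.03, 0.01, 0.02)
print(f"MFP growth: {residual:.1%}")
```

With these assumed numbers, inputs account for 1.3 points of the 3 percent output growth, leaving a 1.7 percent multifactor productivity residual; it is this residual that Chart 1 shows declining.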
Looking back a bit further in time, the next graph (Chart 2) estimates the amount that different factors contributed to total productivity growth.
In 2004, the Department of Veterans Affairs was forced to scrap a multimillion-dollar computer system that was designed to cut the agency’s costs. Ironically, the project cost taxpayers $265 million, and it is one of many examples of federal IT projects that go massively over budget and under-deliver. Part of the reason for these failures is that the last time we made significant changes to how our government acquires its own IT was the Clinger-Cohen Act of 1996. This law was enacted the year before Google.com was registered as a domain name, back when Windows 95 was the big new thing. Almost two decades later, while innovation has continued to press forward, our government’s ability to efficiently acquire new IT has lagged miserably behind.
Luckily, a few lawmakers are trying to remedy that. In March 2013, Congressmen Darrell Issa (R-CA) and Gerry Connolly (D-VA) introduced H.R. 1232, the Federal Information Technology Acquisition Reform Act (FITARA), to overhaul the federal government’s approach to acquiring IT. The bill seeks to designate clear responsibility and authority over federal IT investment, enhance the government’s ability to get good IT, strengthen the federal IT
Techno-utopianism seems to be a particularly American phenomenon. As I argued in The Past and Future of America’s Economy, about every half century (usually, as it turns out, right before a big structural slowdown in technological innovation) pundits and scholars start to go overboard on how great the techno-enabled future will be. A case in point is the 1967 book The Year 2000 by Herman Kahn, noted futurist and founder of the Hudson Institute. Kahn relied on the new “science” of forecasting and ended up with a book that had the tone of “you ain’t seen nothing yet.” He wrote:
This seems to be one of those quite common situations in which early in the innovation period many exaggerated claims are made, then there is disillusionment and swing to over conservative prediction and a general pessimism and skepticism, and then when a reasonable degree of development has been obtained and a learning period navigated, many – if not all – of the early ‘ridiculous’ exaggerations are greatly exceeded. It is particularly clear that if computers improve by five, ten or more orders of magnitude over the
In gloomy economic times such as these, we naturally look around for sources of blame. Former saviors make easy targets.
The tech boom of the late 1990s was great for the U.S. economy: GDP rose, unemployment dropped, and median incomes even made their most significant gains since the 1970s. Most people understood this success was due to new technology–and to information technology in particular–and they expected IT to be a main driver of the economy for years to come. Our bold New Economy had arrived, with all the convenience and style of America Online.
But the 2001 recession shook our faith in technology, and in the aftermath of the 2008 financial crisis many have turned on our would-be robotic saviors. Their disillusionment takes the form of disappointment, fear, or both.
The disappointed see our IT revolution, chock full of smartphones and big data, and ask: what good has it done in the real world? Recent technologies have certainly changed our lives, but not with the productive power of previous advances. Instead, we order takeout via the internet instead of the phone; we watch YouTube at work in
Evidence of technological change is all around us—smartphones, self-driving cars, amazing drug discoveries, and even drone warfare. With all of this novelty, many futurists and other pundits breathlessly proclaim that technological change is speeding up. In fact, some go so far as to claim that the pace of innovation is not only accelerating, it is accelerating exponentially (which, as anyone with a rudimentary understanding of exponents can see, is utter nonsense). Peter Diamandis, author of Abundance: The Future Is Better Than You Think, argues that “Within a generation, we will be able to provide goods and services, once reserved for the wealthy few, to any and all who need them. Or desire them.”
But is the rate of technological change really getting faster? Other commentators, including some notable academic economists, actually think the opposite—that we have run out of the “easy” technological advances and new breakthroughs will take much more work.
Questions about the rate of technological change may seem trivial—will I get one hoverboard or two?—but getting a handle on an answer is critical because in economic terms, technological change equals economic growth. And growth has powerful implications for
After already slashing R&D funding, the Sequester is about to deliver another kick in the teeth to American competitiveness: it is going to sharply reduce our ability to measure it. This one comes courtesy of the Bureau of Labor Statistics, which announced last month that sequestration has forced it to eliminate its International Labor Comparisons (ILC) program, a neat little database that adjusts foreign data to a common framework, allowing you to compare the traded-sector health and competitiveness of the United States against that of other countries.
This may not sound like much, but in the nerdy world of competitiveness economics, it’s huge. No one else provides this data to the same extent as the ILC. The OECD does a bit, but its data are rife with warnings about the perils of cross-country comparison among its indicators. Moreover, the OECD has little to no data on the big boys such as China and India, which renders its data useless for any “big picture” comparison of our competitive health. Other organizations, such as the UN Industrial Development Organization, provide competitiveness data that is limited and incomplete.
In contrast, the ILC