In the 1980s, Japan was America’s chief rival in most technology industries. Not only could Japanese firms compete in advanced sectors against U.S. firms, they had an innovation advantage. In fact, research and development (R&D) investments in Japan were 40 percent more productive in producing IT patents than were R&D investments in the United States, implying that Japanese firms were better able to translate research into better goods, products, and processes.
However, in the 1990s this trend reversed. U.S. firms, while less innovative in hardware manufacturing, developed an innovation advantage in software, with R&D spending yielding 60 percent more patents per dollar spent in the United States than in Japan. A recent paper by Ashish Arora, Lee G. Branstetter, and Matej Drev explains why.
Software represented a new frontier for an industry that had previously focused on producing hardware such as semiconductors, televisions, computers, and other advanced machinery. High-tech firms adjusted rapidly to the new challenge, and innovations quickly built on previous innovations: an IT patent filed in 2002 was 10 times more likely to cite a software patent than one filed in 1992.
Advanced technology industries in … Read the rest
Ask any economist why some countries are poor and some countries are rich, and they will probably answer, “productivity”. Essentially, this means that people in rich countries are rich because they are able to create more wealth with less effort. But how do they do this? One of the primary ways is through better technology.
Unfortunately, instead of being recognized for its contribution to wealth, better technology is all too often demonized as a threat to employment, particularly in low-income countries without social safety nets. Intuitively, people care more about the jobs and income streams that already exist than about the potential future savings from automating those jobs (a bird in the hand, as they say). But a new paper by Mehmet Ugur and Arup Mitra of the University of Greenwich shows that even in very poor countries, technology is far less threatening than it may appear.
We have argued here before that robots are not taking our jobs: in the long run, at the macro level, productivity increases have no relationship with either the total number of people employed or the level of unemployment. This is because when automation or … Read the rest
In 1956, an American engineer, William Shockley, realized that silicon could be used to make transistors and founded a company in Mountain View, California. The rest is history. The area experienced explosive growth after the invention of the silicon semiconductor sparked waves of innovation. Other firms sprang up around Shockley’s original company, developing and improving on the invention. Continual support from nearby Stanford University, along with collaboration between local firms, created an innovative environment ideal for fostering growth. By the 1960s, 31 semiconductor firms had been established in the country, of which only five were located outside the region. Smaller firms providing research, specialized services, and other inputs located near the larger companies. Innovation thrived, the local economy boomed, the center of high-tech innovation shifted from the east coast to the west, and Silicon Valley was born.
Silicon Valley is a prime example of how advanced R&D tends to focus in clusters: geographically concentrated industries that maximize spillovers from firm to firm and between public and private researchers. Once research concentrates in an area, it is hard to displace, which is why DOE and other … Read the rest
U.S. productivity growth is stagnating, and if the trend continues it could have a drastic impact on the U.S. economy. Without increasing productivity, the only way for a country to get richer is by working more or borrowing more. Furthermore, productivity is a crucial part of international competitiveness, because it is only by increasing our productivity that we can compete with other countries on cost.
A recent BLS news release does a good job of showing the worrying trends. Productivity growth has been abnormally low since approximately 2006, plummeting through the Great Recession, recovering slightly immediately afterward, and slowing considerably since 2010.
The first graph below (Chart 1) provides historical context back to 2000. There is a clear decline in labor productivity (the dark blue line) and also in multifactor productivity (light blue). These are the two most common ways of understanding output growth: labor productivity estimates how much each worker produces, and multifactor productivity tells us how much workers and capital together produce.
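The difference between the two measures can be sketched numerically. This is a minimal illustration with made-up growth figures, not the BLS methodology or data; the labor income share of 0.7 is an assumed weight for the Solow-residual-style calculation.

```python
# Illustrative sketch of the two productivity measures (hypothetical numbers).

def labor_productivity(output, labor_hours):
    """Output per hour worked."""
    return output / labor_hours

def multifactor_productivity(output, labor_hours, capital, labor_share=0.7):
    """Output per combined unit of labor and capital,
    weighted by their (assumed) income shares."""
    combined_inputs = (labor_hours ** labor_share) * (capital ** (1 - labor_share))
    return output / combined_inputs

# Suppose output grows 3%, hours grow 1%, and the capital stock grows 2%.
lp_growth = labor_productivity(103, 101) / labor_productivity(100, 100) - 1
mfp_growth = (multifactor_productivity(103, 101, 102)
              / multifactor_productivity(100, 100, 100)) - 1
```

Because multifactor productivity also nets out the growth in capital, it attributes less of the output gain to "productivity" than the simple output-per-hour measure does.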
Looking back a bit further in time, the next graph (Chart 2) estimates the amount that different factors contributed to total productivity growth. … Read the rest
A recent NBER working paper offers up some interesting new survey data on innovation in U.S. manufacturing industries. Authors Ashish Arora, Wesley M. Cohen, and John P. Walsh surveyed more than 5,000 U.S. manufacturing firms in 2010, asking whether they had brought new products to market in the previous three years.
Most notably, the data show that the number of truly innovative manufacturing firms is relatively small. In the aggregate, 43 percent of firms introduced new products in the past three years, but only 18 percent introduced products that were wholly new to their market. In other words, one quarter of all firms, and more than half of those introducing new products, introduced “imitation” products following the lead of other companies. The share of firms introducing totally new products varied significantly across industries, from just 10 percent in the “Wood” and “Metals” industries to 44 percent in the “Instruments” industry.
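The arithmetic behind those claims can be checked directly from the two headline percentages (a back-of-the-envelope sketch, using only the aggregate figures quoted above):

```python
# Survey aggregates quoted above (Arora, Cohen, and Walsh).
any_new_products = 0.43   # share of firms introducing any new product
truly_new = 0.18          # share introducing products new to their market

# Imitators are firms with new-to-the-firm but not new-to-the-market products.
imitators = any_new_products - truly_new        # ~0.25: one quarter of all firms
imitator_share = imitators / any_new_products   # ~0.58: over half of product introducers
```

This confirms both framings in the text: 25 percent of all firms, and roughly 58 percent of the firms that introduced new products, were imitating rather than originating.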
The survey also breaks down the results in a number of interesting ways, including where the innovations originated. It finds that the most common source of innovation is customers. This is … Read the rest
Productivity is one of the most fundamental determinants of our income and overall wellbeing, so the question of where productivity growth comes from is extremely important. There are many different ways to increase productivity, but increases that have a continued impact over time are the most important because accumulated productivity increases end up having a much larger impact than one-off changes.
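The point about accumulated increases can be made concrete with a compounding comparison. The numbers here are hypothetical, chosen only to show the mechanism: even a small persistent increase in the growth rate eventually overtakes a much larger one-time level boost.

```python
# Hypothetical comparison: a one-off 5% level boost vs. a persistent
# 0.5-percentage-point increase in the annual growth rate.
base_growth = 0.02   # assumed baseline annual productivity growth
years = 30

one_off = 1.05 * (1 + base_growth) ** years        # single level shift, then baseline growth
persistent = (1 + base_growth + 0.005) ** years    # slightly higher growth every year
```

After 30 years the persistent increase yields a higher productivity level than the one-off boost, and the gap keeps widening every year thereafter.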
Economists have understood for years that R&D is an important source of productivity growth. However, it hasn’t been entirely clear whether R&D affects productivity growth in short, one-time boosts, or whether it raises growth rates for longer periods.
A new paper by Italian economists Antonio Minniti and Francesco Venturini looks at data from the U.S. manufacturing sector and concludes that R&D policies have indeed created “persistent, if not permanent” changes in the rate of productivity growth. It also drills down into the type of R&D spending, finding that only R&D tax credits have a long-term impact on the growth rate while R&D subsidies provide just a temporary boost.
These results are good news for both the economy and for policymakers because they show the powerful impact that innovation policies … Read the rest
For much of the postwar era the United States led the world in technology, which brought significant economic benefits to the nation. That leadership was due in large part to generous federal government funding for R&D, much of it channeled through military spending. That this occurred during the Cold War was no coincidence: as William Janeway argues in Doing Capitalism in the Innovation Economy, nations have historically been unable to muster the political will for significant spending on innovation without it being part of a “national mission,” since such spending means giving up current consumption for uncertain future benefits. In the last half of the 1800s, nation building provided the mission for America, just as it does now for China. But after the late 1940s the animating mission that helped drive technology innovation was winning the Cold War, which we did.
The threat from the Soviet Union meant that Americans were willing to sacrifice present consumption for the good of the nation, in this case keeping the world safe for freedom and democracy. And it meant we did what it took to win, and that meant innovating. The fact that Lockheed’s Skunk … Read the rest
The Washington Post printed a story about how the Campaign for a Commercial-Free Childhood has submitted the opinions of six experts on child development to the Federal Trade Commission in support of CCFC’s complaint against toy maker Fisher-Price for marketing its “Laugh and Learn” app for infants and small children.
One of the six experts, Herbert Ginsburg, writes: “Existing research suggests that infants and very young children are not cognitively ready to learn key abstract ideas about numbers. Although some children at the upper bounds of this age range might learn to parrot some number words they are highly unlikely to learn important concepts of numbers.”
To be sure, I am not a child development expert (although I did study child development in college). I am a parent of a wonderful daughter. When she was 19 months old, I ran across a Fisher-Price online game, “The ABC Game,” which taught infants and toddlers their letters. (This was pre-tablet, so I used a laptop.) My daughter would press keys on my laptop and up would pop a picture of the letter, a picture of an animal whose first … Read the rest
Economist, venture capitalist, and co-founder of the Institute for New Economic Thinking Dr. William Janeway stopped by ITIF this week for a discussion about his new book, Doing Capitalism in the Innovation Economy. Dr. Janeway presented a compelling view of the economy and touched on a number of important issues along the way.
Janeway explained that the government plays a critical role in innovation by providing research funding through institutions such as DARPA and the NIH, by leveraging the purchasing power of the federal government, and by creating policies that encourage business investment in R&D. Economists have long understood that private markets fail to allocate adequate resources to innovation and research: the benefits are too hard for individual firms to capture. For this reason, policies like the R&D tax credit and public investment in basic research have long been uncontroversial.
Contrary to what recent high-profile failures like Solyndra might lead people to believe, government policies to spur innovation in the United States have had great success. This is apparent in the vast amount of money the private sector has poured into IT and Biotech businesses based on initial … Read the rest
Obtaining a drug patent isn’t easy: it requires, on average, 14.6 years and $1.2 billion in pre-approval research, development, and clinical testing. It also requires the developer to meet a set of three internationally accepted conditions. According to the World Trade Organization’s (WTO) Trade-Related Aspects of Intellectual Property Rights (TRIPS) Agreement, in order to obtain a patent, a drug must:
- Be new,
- Involve an inventive step, and
- Be capable of industrial application.
TRIPS also clarifies that “involving an inventive step” and “being capable of industrial application” are synonymous with “non-obvious” and “useful,” respectively. For a WTO legal document, it’s surprisingly clear: be new, be non-obvious, and be useful.
Typically, the patent is filed prior to a drug’s clinical testing, primarily because a commercially viable drug that emerged from testing without patent protection would be vulnerable to theft and copying. In other words, patents are filed upon discovery of a chemical formula, in keeping with the United States Patent and Trademark Office’s “first to file” rule. Without the patent, innovative pharmaceutical companies would not have an incentive to research and develop this formula into … Read the rest