All posts by Daniel Castro
ITIF is counting down the days until the launch of a new report, “How Tech Populism Is Undermining Innovation,” which discusses how recent policy debates on technology issues, including the fights over net neutrality and SOPA, have been dominated by heated, overblown populist rhetoric rather than by fact-based policy analysis that advances the public interest. ITIF argues that an “us vs. them” populism has taken over technology debates in recent years, with a deleterious effect on policymaking.
To help those who might know someone suffering from tech populism (or themselves might be a victim), for the next two weeks ITIF will release a helpful hint each day on how to identify the symptoms of this terrible affliction.
Earlier this year, the National Telecommunications and Information Administration (NTIA) in the U.S. Department of Commerce announced its intention to relinquish oversight of key technical functions of the Internet. Towards this end, NTIA asked the Internet Corporation for Assigned Names and Numbers (ICANN) to convene global stakeholders to develop a proposal to take over the current role played by NTIA in the coordination of the Internet’s domain name system (DNS). This process is currently underway.
As the Information Technology and Innovation Foundation (ITIF) told Congress in testimony earlier this year, the transition away from U.S. oversight creates unique risks and challenges for Internet governance, many of which we may not be able to anticipate today. Without the current oversight provided by the United States, ICANN will not be accountable to anyone and will only be motivated by the interests of those individuals who control the organization. This makes it incumbent on the NTIA, the ICANN leadership, and global Internet stakeholders to insist that a comprehensive set of principles for the responsible management of Internet resources be firmly embedded within ICANN before the transition is allowed to be completed.
In 1895, Lord Kelvin, the renowned physicist, declared that “heavier than air flying machines are impossible” and dismissed those pursuing such research. Had the scientific community heeded his words and those of other skeptics, the advances in aviation that define our modern world would sadly have been held back. Yet some in the scientific community have still not learned the lesson that betting against human ingenuity is a fool’s game. The most recent example comes from Arvind Narayanan and Ed Felten, who in a recent paper declared that de-identification has never worked and never will. (Their paper was intended as a rebuttal to a piece written by Dr. Ann Cavoukian, the former Ontario Privacy Commissioner, and me, which demonstrated that claims made in the popular press about academic research on re-identification methods often overstate the findings or omit important details.)
The authors are making an incredible claim. They are not saying that de-identification sometimes fails (which is painfully obvious to even the casual observer), but rather that there is no such thing as anonymous data. Narayanan and Felten write, “there is no evidence that de-identification works
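For readers unfamiliar with the technique at the center of this debate, the sketch below illustrates one simple de-identification step: generalizing quasi-identifiers (truncating ZIP codes, bucketing ages by decade) and then measuring k-anonymity, the size of the smallest group of records sharing the same generalized values. This is a minimal illustration of the concept, not the method either paper analyzes, and all field names and sample values are hypothetical.

```python
from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: truncate ZIP to 3 digits, bucket age by decade.
    Field names are hypothetical, for illustration only."""
    return (record["zip"][:3] + "**", f"{(record['age'] // 10) * 10}s")

def k_anonymity(records):
    """Smallest equivalence-class size over the generalized quasi-identifiers.
    A dataset is k-anonymous if every combination appears at least k times."""
    counts = Counter(generalize(r) for r in records)
    return min(counts.values())

records = [
    {"zip": "20740", "age": 34},
    {"zip": "20742", "age": 38},
    {"zip": "20740", "age": 31},
    {"zip": "90210", "age": 52},
]
print(k_anonymity(records))  # prints 1: the lone "902**" record is unique
```

The debate between the two camps is precisely over whether steps like these, applied carefully, make re-identification impractical or merely inconvenient.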
The importance of data to the U.S. economy continues to grow. For example, in the United States the economic value from health care data is estimated at $300 billion annually, while $90 billion is generated from global positioning system (GPS) data, and $10 billion from weather data. And these examples just scratch the surface of the potential for data to transform a wide range of sectors including energy, education, finance, health care, manufacturing, and transportation.
Unfortunately, while President Obama has signed an historic executive order on open data and various government agencies have begun to promote data-driven innovation within their communities, such as the successful Health Datapalooza, there is still no federal government agency responsible for developing and implementing a national strategy to promote data-driven innovation across all sectors of the economy. To help fill this void, the Department of Commerce should establish an Office of Data Innovation.
The Office of Data Innovation would be responsible for facilitating data sharing between organizations and reducing barriers to global information flows. It would evaluate the impact of data-related regulations on competition and innovation in different industries, work with other
Some technological changes sneak up on us so quietly that we do not even notice they have happened. A perfect example is video programming. Not long ago, consumers had to drive to the local Blockbuster to rent a movie rather than start streaming one instantly with a few clicks on Netflix. Today the market shows an unprecedented amount of competition as businesses experiment with different business models and technologies to deliver video content to consumers. Both ISPs and over-the-top providers deliver video in a variety of formats, including traditional programming, on-demand, and “on the go” options. In fact, there are so many options—Netflix, Hulu, Amazon, iTunes, Google Play, YouTube, Vudu, HBO Go, Dish Online, Crackle, etc.—that consumers have more choice in video programming than ever before.
These changes are not only occurring in the United States, but also globally. Worldwide there are hundreds of legitimate streaming services that consumers can access. And consumers are accessing this content in new ways. Whereas we used to measure the percent of
Yesterday the United States Court of Appeals for the Ninth Circuit ruled that a lawsuit against Google for illegal wiretapping could proceed.
The case involves Google’s Street View project, which provides online access to panoramic views of public streets in cities around the world. To build the database of images, Google sent vehicles into cities to photograph public streets. At times, these vehicles unintentionally recorded data that users were transmitting over unencrypted wireless networks. The central claim of the lawsuit is that this collection of unencrypted data from wireless networks is a violation of the Wiretap Act. Google argued that the case should be dismissed because the Wiretap Act exempts “electronic communications” that are “readily accessible to the general public.” In its ruling, the Court denied Google’s motion to dismiss.
The basic logic of the Wiretap Act is that if people do not take action to make their communications private, then they do not have an expectation of privacy. For example, if two individuals use unscrambled CB radios to have a conversation, then other radio users are not in violation of the Wiretap Act if they hear this conversation.
In the world of tech, where buzzwords come and go faster than you can say “synergy,” the “Internet of Things” is a bit of an oddity. It is an old concept, first coined in 1999, describing a futuristic world where everyday objects—from toasters to dog collars to running shoes—can communicate electronically with other devices. But whereas many buzzwords die off after a few years of overuse, interest in the Internet of Things has surged of late for the simple reason that the vision is quickly becoming reality.
The emergence of a host of popular consumer products such as FitBit (a personal exercise tracking device), Nest (a smart home thermostat), and Withings “Smart Body Analyzer” (a smart scale that also tracks air quality, heart rate, and body composition) has shown that it is both feasible and useful to embed intelligence in everyday devices. Nest, for example, combines sensors and user feedback to learn the heating and cooling preferences of its users, monitor energy use and environmental conditions over time, and even optimize energy consumption based on price signals from the energy
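To make the idea of price-signal optimization concrete, here is a toy sketch of price-responsive setpoint logic. This is not Nest’s actual algorithm; the function, threshold, and setback values are all hypothetical, chosen only to illustrate the general technique of relaxing a thermostat setpoint when electricity is expensive.

```python
def setpoint(base_temp_f, price_per_kwh, threshold=0.15, setback=3):
    """Toy price-responsive thermostat logic (hypothetical, not Nest's algorithm):
    when the electricity price exceeds a threshold, relax the cooling setpoint
    by a few degrees to reduce consumption during peak-price periods."""
    if price_per_kwh > threshold:
        return base_temp_f + setback
    return base_temp_f

print(setpoint(72, 0.10))  # prints 72: normal price, keep the setpoint
print(setpoint(72, 0.22))  # prints 75: peak price, relax the setpoint
```

A real device would layer occupancy sensing, learned schedules, and utility demand-response programs on top of this kind of rule, but the underlying trade-off between comfort and price is the same.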
In an op-ed in last Friday’s Washington Post, FTC Commissioner Julie Brill bemoaned the data-driven economy, equating the data scientists in Silicon Valley with the spooks at Fort Meade.
Unfortunately, she is not the first to do so. Since the exposure of the government’s PRISM program, veteran privacy activists have been conflating the intelligence community’s questionable, closed-door electronic surveillance program with the voluntary, open, and legitimate collection of personal data by the private sector. Chris Hoofnagle at the Berkeley Center for Law and Technology states, “What’s happening now is the logical outcome of a leave-it-to-the-market public policy agenda, which left the private sector’s hands unbound to collect data for the government.” And John Podesta at the Center for American Progress argues that after Edward Snowden’s revelations, the government “should not only examine NSA surveillance activities and the laws governing them, but also private-sector activities and telecommunications technology more generally.” Some critics have even gone so far as to blame innovation and technology. Writing in Salon, Andrew Leonard placed the blame directly on the technology: “By making it economically feasible to extract meaning from the massive streams of
Hadoop has been the industry standard for scalable data processing applications for several years, so why does a search for “Hadoop” on USAJOBS.gov return zero results?
One reason could be that, given the current budget environment, hiring for IT projects has been suspended. The budget is certainly a factor, although it cannot be the only one, as jobs for SQL, Java, and even COBOL developers can still be found.
Another reason might be that the federal government is simply contracting out this work. Again, this might explain part of the situation, but if so, it reflects poor planning by government agencies: these skills will be increasingly critical to the federal government given the massive amount of information it collects, stores, and processes, and agencies should be cultivating this talent.
A more likely reason is that government agencies have not fully embraced “big data” because government leaders still do not fully understand what it can do or how it can help them operate more efficiently. For example, text mining can be applied to financial fraud detection, research paper classification, student sentiment analysis and smarter search engines for all
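As a small illustration of the kind of text mining described above, a few lines of Python can score documents by keyword evidence. This is a deliberately naive sketch, not any agency’s actual fraud-detection system: the keyword list is hand-picked and hypothetical, whereas a real system would learn term weights from labeled data.

```python
import re

# Hypothetical keyword list for a toy fraud-scoring classifier.
FRAUD_TERMS = {"wire", "offshore", "shell", "invoice", "kickback"}

def tokenize(text):
    """Lowercase the text and split it into alphabetic word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def fraud_score(text):
    """Fraction of tokens that appear in the fraud keyword list."""
    tokens = tokenize(text)
    hits = sum(1 for t in tokens if t in FRAUD_TERMS)
    return hits / len(tokens) if tokens else 0.0

doc = "Route the wire through the offshore shell company and mark the invoice paid."
print(round(fraud_score(doc), 2))  # prints 0.31 (4 of 13 tokens match)
```

Even this crude approach hints at why agencies sitting on large document collections would benefit from staff who can build and scale such pipelines.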
Last month’s international G8 summit produced a declaration with new guidelines for a broad range of policy issues. Included in this declaration was a set of recommendations for open data initiatives, known as the Open Data Charter. The charter represents the first time open data principles have been agreed to in an international forum—not to mention possibly the highest-level declaration of any kind to mention the open source code repository website GitHub—and will likely help shape the future role of government in data. Here are the key facts.
The Group of Eight is a policy forum for the governments of eight of the world’s largest economies (previously six and then seven member states) held annually since 1975. Although the summit will be gradually supplanted by the larger G20, which includes developing economies and non-Western states, the G8 remains a bellwether of international policy. This year’s event was held June 17-18 at the Lough Erne Resort in County Fermanagh, Northern Ireland, and focused on tax policy as well as the ongoing Syrian civil war.
U.K. Prime Minister David Cameron played host to President Barack Obama, German