The FTC Report on Consumer Privacy Misses the Mark


This week the FTC released its much-anticipated report on consumer privacy, “Protecting Consumer Privacy in an Era of Rapid Change”. The report updates the preliminary staff report released in December 2010, which laid out the FTC’s proposed framework for privacy. In the new report, the FTC sets out a set of principles for consumer privacy and calls on Congress to implement privacy legislation based on this framework. While the report does provide a comprehensive discussion of many of the major privacy challenges, too often it sides with privacy advocates at the expense of competition and innovation.

One important change in the new report is that the FTC has proposed that its privacy framework apply to all commercial entities that collect or use consumer data, except those that collect data on fewer than 5,000 consumers. The FTC exempts small businesses from the privacy framework because of the potential burden it would impose on them. However, larger businesses would face similar burdens, and these costs would ultimately be passed on to consumers. Moreover, from a privacy perspective, this is an ineffective policy: the level of harm to a consumer from misuse of his or her personally identifiable information (PII) does not vary based on the size of the organization that misuses the information.

Similarly, the FTC proposed that first-party use of data be treated differently than third-party use. For example, the FTC proposed that first-party marketing be considered a commonly accepted practice for which consumers need not be offered a choice, while third-party marketing would require opt-in consent. This would limit the ability of companies to share data with partners or affiliates, a business practice that may be particularly important to smaller businesses that benefit from strategic partnerships. Not only may this limit some types of innovation, it is also an ineffective safeguard for consumer privacy. Again, harm to users can occur regardless of whether the entity holding the information has a direct or indirect relationship with the consumer. Protecting consumers from misuse of their personal information is more important than restricting how that information is shared or obtained.

The FTC also proposed that the first-party relationship not apply to Internet Service Providers (ISPs) and their users. Such a restriction may limit the potential of ISPs to engage in online behavioral advertising. If ISPs engaged in online behavioral advertising, they could increase their revenue and provide better-quality services and lower prices to consumers. This restriction may also give an unfair competitive advantage to current online advertisers, especially those with a large user base. A dissenting statement by Commissioner Rosch highlighted this point when he noted, “…there is no basis for requiring ISPs to use opt-in choice without requiring opt-in choice for other large platform providers. But that kind of ‘discrimination’ cannot be justified, as the Report says, because ISPs ‘are in a position to develop highly detailed and comprehensive profiles of their customers.’ So does any large platform provider who makes available a browser or operating system to consumers.” It is important that privacy regulations be crafted so as not to limit competition or discriminate against certain technologies, business models, or industries.

The FTC report outlines three main principles in the privacy framework. The first principle is that all companies implement “privacy by design.” The purpose of privacy by design is to integrate privacy into the entire lifecycle of a product or service, rather than try to add it in at the end. This concept has been promoted heavily by Ann Cavoukian, the Information and Privacy Commissioner for Ontario. Unfortunately, it has devolved into little more than a hot buzzword, with many privacy advocates misguidedly espousing it as a substantive solution to current privacy challenges.

Fundamentally, there is nothing wrong with the concept of privacy by design. However, the underlying concept is not new or unique to privacy. Designing a product with all requirements in mind at the beginning of the development process is certainly more likely to yield a product that meets those requirements than trying to add new features at the end. But the reality is that focusing on privacy at the beginning of a product’s development does not always make sense. For example, if eight out of ten start-ups fail in their first three years, does it really make sense for all of these businesses to spend resources designing privacy into products or services that will not succeed?

Privacy by design is also a great example of how buzzwords that oversimplify complex subjects can lead to inappropriate conclusions. For example, even though the concept may be sound, that does not mean it should be legislated. Just as Congress cannot click its heels and say “defense in depth” three times to solve the world’s cybersecurity challenges, neither can it solve the current privacy debate by simply repeating the mantra “privacy by design.” Privacy is just one of many design objectives for businesses, and none of these objectives can be directly legislated, no matter how noble the cause. If legislation could solve problems this easily, we could address a whole host of design challenges simply by passing legislation to require features like “security by design,” “cost-effectiveness by design,” “green by design,” “quality by design,” and “good design by design.”

The second principle in the FTC privacy framework is simplified consumer choice. This is a good principle, but its proposed implementation is questionable. The FTC suggests that practices can be divided into those that require consumer choice and those that do not. This appears to be a roundabout way of proposing that some practices should be opt-in and some opt-out, with the FTC specifying which practices fall into each category. However, many privacy policies are complex because they involve a complex set of data, business practices, privacy protections, and relationships. Simply wishing the world were simpler does not make it so. Nor does requiring that companies pigeonhole themselves into a set of predefined, one-size-fits-all business practices foster innovation. While it may be important to develop a set of commonly accepted practices that use consumer data, we need a more precise and robust model for determining which practices qualify. Developing such a model would be a good area for a multi-stakeholder task force to explore more thoroughly.

Another area of concern within the principle of consumer choice is the FTC’s continued advocacy of Do Not Track. Do Not Track proposals are aimed squarely at reducing targeted advertising, an important and growing part of online advertising that funds much of the free content and services on the Internet. As I have discussed many, many times before, widespread adoption of Do Not Track would significantly harm the current funding mechanism for the Internet economy, resulting in less free content and fewer free services (and lower-quality content and services). In addition, it would result in more intrusive and less relevant advertising for consumers. Moreover, it is important to remember that targeted advertising normally does not harm user privacy in any way. The FTC would better serve consumers by exploring ways to ensure that consumer privacy is not violated rather than by promoting less use of advertising practices that benefit consumers.

The third principle in the FTC privacy framework is transparency. Again, the basic principle is valid, but the proposed implementation put forward in the report overemphasizes privacy and underemphasizes costs. One specific example is the proposal that companies provide consumers access to the data held about them. Sometimes this might be appropriate, but often it is not. For example, does every online retailer really need to create an interface to its backend CRM? Does every charity with over 5,000 donors need to provide access to the personally identifiable information it maintains about its supporters? Organizations should not have to create a system or process so that any individual can inspect what data is stored about them. The costs of setting up such a system would be burdensome and unnecessary, ultimately benefiting few at the expense of many. While requiring data to be available to consumers may be appropriate in certain contexts, the burden should be on policymakers to identify the circumstances under which this is appropriate. In most cases the costs probably outweigh the benefits.

In addition to the three principles, the FTC report has other faults. Although the FTC is calling for privacy legislation, it failed to make a strong case for why new legislation is needed. For example, the FTC did not identify any clear gaps in its authority to take action against bad actors that harm consumers with regard to privacy. Instead, the report mostly detailed many of the successful enforcement actions the FTC has taken against businesses that have misused personal data. The closest it came to identifying an ongoing problem was noting that many mobile apps do not have privacy disclosures (which may not be ideal, but is not itself a harm). And even here the FTC noted that it is working with industry to develop best practices to address this concern about mobile app disclosure notices.

The FTC also suggested that legislation should be introduced to give consumers more control over data collected and used about them by “data brokers” (a term not well defined). Again, this framing presents a somewhat naïve view of privacy, suggesting that consumers are more likely to be harmed if their personally identifiable information is held by a data broker rather than by another third party. And here again, the FTC did not identify any specific examples of ongoing harm to users that require new privacy legislation.

The FTC also commented on the “slow pace of self-regulation.” It is unclear what metric (if any) it used to reach this conclusion. The FTC should not conclude that a self-regulatory process is slow merely because it has yielded a different set of rules than the FTC proposed. As I discussed in a recent report, there are many potential benefits to self-regulation of online privacy if done properly. One of these benefits is that self-regulation can typically move much faster than the government regulatory process. Although some privacy advocates claim that self-regulation of online privacy is moving too slowly right now, it is worth noting that the FTC took more than a year just to update its draft privacy framework. During this same period, the Digital Advertising Alliance (DAA) launched its self-regulatory program for online behavioral advertising and committed to preventing the use of consumer data for secondary purposes like credit and employment decisions. At least comparatively, it appears the pace of self-regulation for some types of online privacy remains much faster than government regulation.

Finally, the FTC report does not address the issue of government use of personally identifiable information. If the FTC accepts the premise put forth in its report that consumer trust is necessary to stimulate commerce and the absence of privacy protections erodes consumer trust, then commercial privacy cannot be dealt with separately from government privacy.

Although there are faults with the proposed FTC privacy framework, the Commission should be commended for certain aspects of this report. For example, the report recommends avoiding duplication with existing law and applying rules consistently across both offline and online data. The report also suggests that the FTC should remain involved in industry self-regulatory efforts and that it intends to convene and facilitate new efforts at self-regulation for emerging challenges.

One positive development was the FTC’s recommendation on de-identified data. The FTC set out a three-part test for determining when non-public data cannot be reasonably linked to a particular user, computer, or device: ensuring that the data is de-identified, committing not to re-identify the data, and contractually prohibiting third parties from re-identifying the data. The FTC also noted the need for continued research and development in technologies and approaches for de-identifying data, a recommendation which ITIF has proposed and supported.

Another positive development was the FTC’s inclusion of an endorsement of “take-it-or-leave-it” choice (under certain conditions) in its privacy framework. As I have argued before, consumers should be able to choose to use or not use a service. However, a company should not be under any obligation to let consumers opt out of some, but not all, uses of their data. For example, a company may provide a website that is free because it is supported by targeted advertising. The company should not be required to allow users to opt out of targeted advertising and still get access to the content. While consumers should have choice, choices should have consequences, and consumers who opt out of targeted advertising should not receive the same benefits as those who do not. Those who opt out but still want the same benefits are essentially free riders.

Policymakers should remember that protecting privacy is not the same as protecting consumers. Consumers benefit from many protections, including assurance that their data is not misused and that competitive markets produce more innovation, choice, and efficiency. Although the FTC is a regulatory agency and thus should be expected to impose rules to advance its agenda of protecting consumers from unfair and deceptive practices, the Commission should recognize that privacy is but one of many values that consumers hold, and protecting privacy should be balanced against other competing interests. This does not mean the FTC should minimize the importance of privacy, but it does mean the FTC should be cognizant and transparent about the costs of imposing privacy regulations. I hope to see further recognition of these elements in future versions of the FTC’s privacy framework.


About the author

Daniel Castro is a Senior Analyst with ITIF specializing in information technology (IT) policy. His research interests include health IT, data privacy, e-commerce, e-government, electronic voting, information security and accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability Office (GAO) where he audited IT security and management controls at various government agencies. He contributed to GAO reports on the state of information security at a variety of federal agencies. He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security Technology and Management from Carnegie Mellon University.