California AG’s Mobile Ecosystem Report Not the Worst

Today the California Attorney General’s office released “Privacy on the Go: Recommendations for the Mobile Ecosystem,” a report that lays out a set of recommended practices for app developers, app platform providers, ad networks, and others that it believes will enhance privacy for consumers. This comes on the heels of last year’s agreement with Amazon, Apple, Google, HP, Microsoft, and RIM to ensure that all mobile apps that collect personally identifiable information have a privacy policy. Readers of this blog will know that I am a frequent critic of proposals to impose additional privacy rules and regulations when industry-led self-regulatory efforts would be more efficient and effective. So in this case I am rather pleased to see that the California AG has steered away from additional lawsuits (at least for now) and instead has offered voluntary best practices for consumer privacy in mobile apps. Many of these recommended practices are relatively simple, commonsense ideas, such as “use encryption in the transit and storage of personally identifiable data” and “post or link the [privacy] policy on the app platform page.”
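
To make the first of those recommendations concrete, here is a minimal sketch, in Kotlin, of what encrypting personally identifiable data at rest might look like. The encryptPii helper and the sample e-mail address are my own illustrations, not anything from the report, and encryption in transit would typically just mean using HTTPS endpoints.

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Minimal sketch: encrypt a piece of personally identifiable data with
// AES-GCM before writing it to local storage. A real app would keep the
// key in the platform keystore rather than generating it in memory.
fun encryptPii(plaintext: ByteArray, key: SecretKey): Pair<ByteArray, ByteArray> {
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) } // fresh nonce per record
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(128, iv))
    return iv to cipher.doFinal(plaintext)
}

fun main() {
    val key = KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()
    val (iv, ciphertext) = encryptPii("jane@example.com".toByteArray(), key)
    println("stored ${ciphertext.size} encrypted bytes with a ${iv.size}-byte IV")
}
```

Even this toy version shows that a one-line recommendation carries real implementation work, a point I return to under Myth #3.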

In many ways this is a step in the right direction. If privacy advocates want all developers to implement more privacy controls, one way to incentivize this is to reduce the costs involved in doing so. And while I don’t agree with all of the recommendations, by providing some official guidance to developers, the report may help lower the cost of creating mobile apps with additional privacy controls. (Although it may just further muddy the waters, since various other groups, including the W3C, GSMA, CDT, and EFF, have put out their own sets of mobile privacy best practices.)

But in other ways these recommendations repeat the same fallacies frequently put forward by privacy advocates. These are worth calling out, because the strategy of privacy advocates often seems to be that if they repeat something enough times, people will start to accept it as fact. Let me highlight some of the main problems I see in this report.

Myth #1: Privacy controls are needed to protect economic growth

Privacy advocates often argue that privacy controls are necessary to restore, improve, or sustain consumer trust and thereby ensure economic growth online. This report trots out this familiar argument in its opening pages when it claims that these recommended privacy practices are needed to “foster trust and confidence in this market.” As evidence, the report cites the recent Pew study, which found that approximately half of Internet users have either uninstalled or decided not to install an app because of the type of data it collected.

I find it troubling that this is cited as evidence of a problem. First, if a large percentage of users are not installing certain apps because of the type of data they collect, this means that users are willing and able to send clear signals to developers about their privacy preferences. That looks like a healthy example of the free market at work. If half of car buyers reported that they did not buy a particular car, or later traded it in, because of how much gas it used, would we treat that as a problem for government to solve?

Second, it’s not clear what type of response privacy advocates would not read as evidence of a problem. After all, the same survey could be interpreted as saying that almost half of respondents have never uninstalled or declined to install an app because of privacy concerns. (And for those who responded positively to the question, we don’t know whether this is something that happened rarely, sometimes, or frequently.)

Moreover, privacy rules and controls are often a barrier to growth in the Internet economy. For example, we know that privacy rules for online advertising in Europe have had a negative impact on ad effectiveness, particularly on general-content sites like news and web services. This in turn reduces the revenue available to these websites. As MIT economist Catherine Tucker has noted, “regulation is a trade-off between the benefits of consumer privacy and the benefits to consumers of a potentially broader, less obtrusive advertising-supported Internet.”

Myth #2: Advertising is not part of the app’s basic functionality

For all of the talk in this report about a “mobile ecosystem,” the report’s authors do not seem to understand that this ecosystem depends on revenue. Even as the report acknowledges that “a common business model for mobile apps today is based on delivering targeted advertising,” it argues that targeted advertising is not part of an app’s basic functionality, noting that the same argument is made in the FTC’s most recently revised COPPA rules. However, many mobile apps are ad-supported software (i.e., adware). As acknowledged in the report’s endnotes, approximately a quarter of app revenue comes from advertising, with the rest coming from app sales (about a quarter) and in-app purchases (the remaining half). Since targeted advertising is an integral part of how apps produce revenue, it is simply ridiculous to claim that the targeted-advertising portion of the software is not part of the app’s basic functionality. This would be like claiming that in-app purchases are not part of a mobile app’s basic functionality, or that digital rights management (DRM) technology, used to prevent piracy and to ensure that software is only used by licensed users, is not part of paid software. Yet the report claims that only contextual advertising and payment processing should be considered part of an app’s basic functionality, and it demands different treatment for data used in targeted advertising.

Myth #3: Additional privacy controls have no economic cost

The report also ignores the costs of privacy controls: not only the cost of compliance but also the broader economic cost to the mobile ecosystem. Anyone who has ever done any kind of software development knows that every feature in an app must be paid for; code does not get written for free. So it is telling that the California AG’s report makes no mention of the costs involved in meeting its various “best practices.” Not only does this minimize the serious concerns that developers may have about the costs of compliance, it raises the question of whether the AG even considered the economic impact of these proposals. For example, although the AG calls these best practices, there is no evidence that the proposed recommendations are actually the most efficient way to protect consumer privacy.

The report also unintentionally shows how ever more complex privacy rules are placing additional demands on developers, to the point that anyone who plans to be a developer should probably first get a degree in privacy law. For example, in addition to calling for every employee of a company making mobile apps to receive annual training on privacy practices, the report states that developers should put someone on the team in charge of keeping up with all new privacy laws and regulations, both domestic and international. This might be feasible for a larger company that develops apps, but it is asking a lot of the one-person shops, part-timers, hobbyists, and students who create many of the interesting apps that make up the mobile ecosystem.

Finally, the California AG’s report completely ignores the additional privacy challenges that developers face in providing apps for children. As I have noted in my COPPA filing, the current set of rules for children’s privacy has had a detrimental effect on the availability of high-quality, ad-supported content for children on the Internet. The latest rules threaten to take this further and diminish the availability of ad-supported mobile apps. Rather than provide any meaningful suggestions, the report blandly states: “If your app is directed to children under the age of 13 or if you know that you are collecting personal information from children under the age of 13, you may have additional obligations under federal law.” If the purpose of this report was to make it easier for developers to build apps that comply with privacy rules, it leaves a gaping hole in its mission by omitting guidance for apps directed at children.
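
Consider how little the report’s one sentence gives a developer to work with. Even the most basic first step, a neutral age gate that keeps data collection switched off for young users, is left entirely to the developer. Here is a hypothetical sketch of such a gate in Kotlin; the names and logic are my own, and real COPPA compliance involves far more, such as verifiable parental consent.

```kotlin
// Hypothetical age gate, not from the report: ask for a birth year up
// front and keep data collection and targeted ads disabled for users
// under 13. A real COPPA strategy involves much more (e.g., verifiable
// parental consent), which is exactly the burden the report glosses over.
const val COPPA_AGE_THRESHOLD = 13

fun dataCollectionAllowed(birthYear: Int, currentYear: Int): Boolean =
    currentYear - birthYear >= COPPA_AGE_THRESHOLD

fun main() {
    // An 11-year-old user: keep analytics and targeted ads switched off.
    val allowed = dataCollectionAllowed(birthYear = 2002, currentYear = 2013)
    println(if (allowed) "enable analytics and targeted ads" else "child mode: no data collection")
}
```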

In summary, the California AG’s report is a step in the right direction: it offers recommendations that can serve as a useful reference for developers rather than mandates that constrain innovation. However, it is important that some of the hidden arguments in the report be understood and debated rather than accepted at face value.

About the author

Daniel Castro is a Senior Analyst with ITIF specializing in information technology (IT) policy. His research interests include health IT, data privacy, e-commerce, e-government, electronic voting, information security and accessibility. Before joining ITIF, Mr. Castro worked as an IT analyst at the Government Accountability Office (GAO) where he audited IT security and management controls at various government agencies. He contributed to GAO reports on the state of information security at a variety of federal agencies. He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security Technology and Management from Carnegie Mellon University.