Criticism of Facebook reached an all-time high this past week as privacy advocates fanned the flames of discontent among Facebook users, some of whom are confused and upset by recent changes in the service and new features for sharing data. This criticism centers on two new features Facebook debuted at its F8 Developer Conference in April: instant personalization and social plugins. The first, instant personalization, allows certain partner sites to use data from a Facebook user's profile to customize that user's online experience. For example, if a Facebook user visits Pandora, a customizable Internet radio website, instant personalization allows Pandora to create a custom radio station based on the likes and dislikes in the user's Facebook profile. The second, social plugins, allows developers to place a Facebook widget on their websites so that visitors can "Like" a page or post comments. These interests can then appear in a user's news feed, where friends can see them. It is important to note that websites, like the Washington Post, that use Facebook's social plugins do not see any of the user's personal information; Facebook users, meanwhile, benefit by receiving recommendations about web pages their friends like. Users can opt out of both features.
Much of the frustration among users seems to stem from a feeling that they must continually monitor their privacy settings as Facebook introduces new features, rather than having a "set it and forget it" privacy option. However, some users paradoxically complain both that the privacy controls are too numerous and confusing and that they do not have enough control over their personal information. Some users are also unhappy with recent changes that make certain personal information public if they choose to share it. Certainly Facebook could have done a better job of explaining its recent changes and helping users update their privacy settings, but many companies struggle with this challenge, and Facebook seems to be getting better at it over time.
Other users are simply confused about what Facebook is doing. Contrary to some misconceptions, Facebook is not selling consumer data; it is selling targeted advertising. Targeted advertising works by matching ads to users based on the information in their profiles. So, for example, a wedding photographer in Dallas can pay Facebook to serve an ad to everyone in Dallas who switches their relationship status from "single" to "engaged." Facebook does this without ever revealing any personal information to the advertiser. This benefits everyone: the photographer gets more clients, users get relevant ads, and Facebook is better able to fund its free services.
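To make the distinction concrete, the matching described above can be sketched in a few lines of Python. This is a minimal illustration, not Facebook's actual system: the function and field names are hypothetical. The point it demonstrates is that the advertiser supplies only targeting criteria, and the platform does the matching internally; no user records ever leave the platform.

```python
def match_ad(targeting, profiles):
    """Return how many users an ad would reach, not the users themselves.

    The advertiser never sees the profiles; the platform runs the match
    and reports only aggregate reach.
    """
    matched = 0
    for profile in profiles:
        # A profile matches only if it satisfies every targeting criterion.
        if all(profile.get(field) == value for field, value in targeting.items()):
            matched += 1
    return matched


# Hypothetical profiles held by the platform (never shared with advertisers).
profiles = [
    {"city": "Dallas", "relationship": "engaged"},
    {"city": "Dallas", "relationship": "single"},
    {"city": "Austin", "relationship": "engaged"},
]

# The wedding photographer targets engaged users in Dallas.
reach = match_ad({"city": "Dallas", "relationship": "engaged"}, profiles)
print(reach)  # 1 matching user; only this count is reported back
```

Selling access to an audience in this way is categorically different from selling the `profiles` list itself, which is the distinction the paragraph above draws.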
Let's face it: when you have 400 million users, you are not going to be able to make everyone happy. But privacy fundamentalists, those individuals who value personal privacy above all other values, do not want to set the privacy rules just for themselves; they want to set them for everyone else. For example, Danah Boyd, a fellow at Harvard's Berkman Center for Internet and Society, claims that Facebook is a utility and should be regulated like one. Others, like the developers of Diaspora, are trying to build an open-source social networking tool in which data is decentralized and in the hands of private citizens rather than controlled by corporations. And still others want the government to pass more regulations on how companies can use consumer data. All of these individuals share at least one thing in common: they want a world where for-profit companies are not providing largely unregulated services in the information economy. While they may see this as a noble goal, the end result for the average user would be less innovation and fewer free services. Facebook, and many other useful social networking tools, would not exist today if they had to rely on donors and grants instead of investors and venture capital, or if their innovations were dead on arrival because of strict privacy regulations.
If privacy fundamentalists only cared about their own privacy, they could simply opt not to use Facebook. Instead, these people see a world in which everyone else needs to be saved from having Facebook and its army of evil programmers sell off their personal data to the highest bidder (something Facebook is not doing). Consider a recent statement by Chris Conley at the ACLU, who said, "People are not necessarily thinking about how long this information will stick around, or how it could be used and exploited by marketers." It is this type of paternalistic view of Internet users that is at the heart of arguments in favor of government regulation to protect consumers from themselves.
However, as we’ve written previously, Facebook is neither a right nor a necessity. Certainly others agree with this sentiment. As Betty White recently joked on SNL, “Facebook… sounds like a huge waste of time. When I was young we didn’t have Facebook, we had phone book, but you wouldn’t waste an afternoon with it.” It may be fun, useful and part of the daily routine for 50 percent of its 400 million users, but its popularity should not be grounds for government intervention.
Government intervention is not needed because there is already a simple solution: if you don't like Facebook's policies or services, don't use it. Or use it, but use Facebook's privacy options to avoid sharing certain information. Facebook offers consumers a trade-off: consumers can use its free service, and in return, Facebook can sell targeted advertising based on user data (which, again, is fundamentally different from selling consumers' data). Consumers face trade-offs all the time. I personally don't like paying $10 for popcorn at the movie theater, so I usually skip the popcorn line. Sure, I would be happier with the popcorn, but I'm not willing to make the trade-off. The trade-off Facebook offers is no different, and consumers who do not want to make the trade can simply not use the service. This same idea was echoed last week by Facebook executive Elliot Schrage, who (perhaps more tactfully) stated, "We are not forcing anyone to use it."
However, for all the media attention and criticism Facebook has received over the past few weeks, most people seem content to stay. Some people will leave Facebook; indeed, some already have. For example, Leo Laporte, the host of This Week in Technology, generated much media attention by deleting his Facebook account during last Wednesday's podcast. But this does not seem to be catching on: the "Quit Facebook Day" website has attracted fewer than 3,000 pledges. And while some individuals (mostly privacy fundamentalists) will choose to delete their accounts (if they were using Facebook to begin with), for most users the benefits of Facebook still outweigh the costs of using it. Why? Because it is a great tool for making and keeping connections, and most users get a lot of value out of it.
This is certainly not the first time Facebook has faced criticism of its service. In 2006, Facebook users protested the introduction of the News Feed, which showed status updates from Facebook friends; yet a few years later, the News Feed is a standard feature embraced by users. Even one of the leaders of the 2006 protest has come to Facebook's defense, arguing that "Facebook…is the wrong target for our anger. It has done more to bring people together than any technology of the last five years, and the good it has brought far outweighs the bad. We made the decision to turn our personal information over to a private company, and for the most part Facebook made good use of it."
Social media is encouraging people to be more open about their lives. Some social networking tools, like Twitter, even make data public by default. But individuals still control what information they share and with whom they share it. Moreover, no one is forced to use a social networking tool. Facebook should not be criticized for trying to monetize its business model, which, yes, involves serving users targeted ads based on their profiles and personal data and attracting as many users as possible by offering innovative new features. But it is not selling user data to advertisers, and it does not plan to.
Moreover, Facebook is not the enemy. With or without Facebook, people will continue to share information and use personal data for work and for play, and users will have to learn to be responsible for their own actions and aware of what they do online. Some users may not like some of Facebook's recent changes, but the company is not the bogeyman privacy fundamentalists have made it out to be, and it should not be regulated like one. By and large, the changes Facebook has made have been designed to create a more useful and interesting Internet experience for its users. Before rushing to regulate social networks, policymakers should remember not only that applications like Facebook provide users with real value in ways that do not violate their privacy, but also that users have many ways of controlling their privacy on these networks.
Privacy fundamentalists will continue to insist that the government implement data privacy regulations, partly on the grounds that consumers need to be protected from their own choices and that the kind of mass customization new Facebook applications enable is not needed. But this kind of paternalism is unwarranted. People have learned to protect their privacy in the offline world (e.g., they close the drapes at night before undressing), and they are learning to do so in the online world by not sharing sensitive information and by controlling their privacy settings (and yes, exhibitionists both offline and online can share what you and I might consider too much information). Moreover, stringent new privacy regulations that effectively neuter all useful information sharing on the Internet would not just hurt Facebook (a U.S. company, it should be noted, that sells its services around the globe, creating jobs and export revenue for the U.S. economy); they would also stall future innovation and possibly eliminate many of the useful applications people use today.