The Wall Street Journal reported Wednesday that FCC Chairman Tom Wheeler was set to circulate a draft of proposed rules to codify net neutrality. This initial story prompted a sudden outpouring of inaccurate reporting and misplaced vitriol. There are a number of unfortunate misunderstandings ITIF would like to help clear up, in the hope that Chairman Wheeler is not deterred from what is a very reasonable approach to a difficult policy problem. The Chairman took to the FCC blog on Thursday to try to “set the record straight,” but with today’s powerful echo chambers and viral proliferation of overreactions, setting the record straight is a very difficult task.
Chairman Wheeler explained that he intends to propose rules that will allow for a case-by-case analysis of traffic management, allowing practices that are “commercially reasonable” and available to all on reasonable terms. Any practice that harms competition or consumers through abuse of market power would be prohibited. ITIF has long advocated these types of restrictions on broadband providers.
We believe the general direction of Wheeler’s proposal strikes a good balance between protecting consumers and allowing innovative business models and communication applications. Of course, the actual notice of proposed rulemaking has not yet been released, and the devil is always in the details, but a case-by-case approach that allows reasonable prioritization of traffic is to be preferred.
Below we correct a number of key misunderstandings that persist despite the Chairman’s clarifications.
1. Internet “Throttling” – Some stories claim the proposal would result in a network where some traffic flows smoothly while other traffic faces “frozen or frustrating” video streams. But there is no reason to think that broadband ISPs have any interest in actively slowing Internet traffic, and even less reason to think such a practice would be allowed under the proposed rules. There is an idea circulating that the proposed rules would turn the Internet into one big toll road, allowing broadband providers to slow traffic to a crawl unless services pay up – this is absolutely false. All consumers would continue to enjoy, at minimum, what is known today as “best efforts” Internet service.
Consumers demand a free and open Internet when they sign up for broadband, and ISPs want to meet that demand. The proliferation of new services like streaming video is what drives customers to these high-speed networks, and operators have no reason to stifle this demand.
Furthermore, the Commission’s proposed rules would certainly prevent such behavior. Any reduction in performance for a particular website or service would be considered anti-competitive and not “commercially reasonable,” and therefore prohibited. Similarly, an ISP would not be allowed to favor its own traffic over that of others. In fact, there have been only four reported cases of such a practice in the United States, with only one of them involving actual misbehavior (the Madison River case). The FCC immediately resolved these problems, even without formal net neutrality rules in place, suggesting that this concern is a red herring.
In short, “pay-to-play” would not be allowed – no one would have to pay a fee in order to get their content or service to consumers. What would be allowed is “pay-to-improve” – under these proposed rules, Internet services would only be allowed to get better, not worse.
2. Prioritized Content – Many reports indicate that ISPs will require payment for access to their “fast lanes” and that any traffic stuck in the “slow lane” will suffer. This is inaccurate. The vast majority of Internet traffic will not need to be prioritized and will continue to enjoy the current “best efforts” Internet. Services like email, web browsing, video downloads, and most video streaming will likely be unchanged. The applications that will benefit most from the new rules are real-time, latency-sensitive communications like video conferencing. Regular video streaming, like that provided by Netflix or YouTube, is unlikely to require any kind of prioritization. But some content providers could conceivably choose to pay extra to offer extraordinarily high-resolution video streaming with minimal buffering.
Yet opponents argue that the FCC must prevent a variety of service levels on the Internet. In fact, that is functionally what we have always had. Many sites pay content delivery networks (CDNs) to have their data stored near the end consumer, usually because their applications are bandwidth-intensive. Other sites, like ITIF.org, do not. But if the Heritage Foundation or the Center for American Progress wanted to, they could pay to have their PDF reports housed on servers close to end users around the nation. The reason they don’t is that no one really cares whether a PDF loads in 3 seconds or 5 seconds. Moreover, we already have fast and slow lanes: most consumers can subscribe to a variety of speed tiers from their ISP, paying more for faster ones and less for slower ones. All the Chairman’s proposal would do is allow some traffic to perform better, not allow some traffic to perform worse.
3. Favoring some traffic means shunting other traffic to the slow lane – This is simply not true and reflects ignorance of how the Internet’s architecture is designed. Network management is not simple: we are not talking about having a “fast lane” and a “slow lane.” When the network itself can correctly determine an application’s needs, it can allocate resources to produce the greatest good for all applications. Allowing the last-mile network to understand the technical requirements of an application’s packets would improve the efficiency of the network overall. To claim that some traffic would be “left buffering” betrays a total misunderstanding of how packet discrimination would function and the efficiencies it would enable.
4. The past and present Internet treats all packets the same – Advocates of net neutrality correctly stress that Type of Service – a field within each packet that can tell the network what type of application the packet belongs to – has not, in practice, been a major element of carriage agreements among network operators. However, organizations often employ Type of Service (and its successor, Differentiated Services) inside their own private networks, while passing packets to other networks under the default class known as “best efforts.” There has been a historic difference between Internet architecture and Internet practice, with practice lagging behind architecture.
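To make the mechanism concrete: the Differentiated Services field is just a few bits in each IP packet’s header, and any application can set it today using standard operating-system APIs. The short Python sketch below (our illustration, not drawn from any FCC or ITIF material) marks a UDP socket’s outgoing packets with the “Expedited Forwarding” class conventionally used for latency-sensitive traffic such as voice; whether networks along the path honor the marking is, as discussed above, a matter of carriage agreements, not of protocol capability.

```python
import socket

# DSCP "Expedited Forwarding" (EF) is the Differentiated Services
# code point conventionally used for latency-sensitive traffic.
# The DSCP occupies the upper six bits of the old ToS byte, so the
# value passed to the socket option is shifted left by two.
DSCP_EF = 46
tos_byte = DSCP_EF << 2  # 0xB8

# Mark an illustrative UDP socket's outgoing packets with EF.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos_byte)

# The kernel reports the marking back; routers beyond the local
# network may simply ignore or reset it.
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
sock.close()
```

The point of the sketch is that differentiation is already built into the Internet’s architecture at the packet level; the policy question is only about whether and how networks honor those markings when traffic crosses between them.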
Advocates of strong net neutrality regulations insist that there is no good reason to permit network interconnection agreements to honor Type of Service in the future because it hasn’t been necessary in the past. In many instances, “preserving the Internet’s openness” means little more than “banning Type of Service differentiation for a fee.” One problem with such a ban on using Type of Service when handing off traffic is that it would limit the Internet to a set of requirements established by the applications and routing technology of the past. A web page that loads in two seconds is generally considered successful; for some other types of applications (such as real-time securities trading systems and certain types of device control), an end-to-end delay of more than a few milliseconds per packet is considered fatal. With new cloud-based services coming online every day, the number of applications that would benefit from specialized networking will continue to increase.
This need for prioritization and differential treatment is exactly why leading IT networking companies (not ISPs) are working on home wireless routers and gateways that let consumers prioritize the traffic running inside their homes. Allowing differentiation based on type of service in the backhaul and inside the home, but banning it in the last mile, makes little sense.
5. Consumer Harm – Some opponents argue that these changes “don’t bode well for consumers.” One reason is that they mistakenly believe the rules will lead to throttling or a reduction of best-efforts Internet service, which, as we describe above, is simply false. The second reason they assert is that if application providers choose to pay for better service, they will simply pass these costs on to consumers. In competitive markets, of course they will – there is no free lunch – but they will also be passing on higher quality, and only to the consumers who choose to buy the higher-quality services. This is a bit like saying that selling two types of televisions – a smart TV for a bit more money alongside a regular TV – hurts consumers because the smart TV costs more. If consumers choose to pay more, it is because they value the superior service. Notably, these so-called “consumer advocates” insist that the companies charged for better service will pass along the costs to consumers, yet they reject the idea that the companies receiving the money – the ISPs – will use these revenues to reinvest in the network and provide even faster, higher-quality broadband to enable these services.
Again, if ISPs were found to be abusing these rules to hurt consumers, the rules are clearly designed to give the FCC the authority to step in and stop any business practice that actually harms consumers or competition.
6. Squash Innovation – A common argument against the proposal is that allowing any kind of packet discrimination will squash innovation, preventing the next Google or Netflix. In fact, the opposite is true. First, as noted above, the only types of companies likely to avail themselves of differentiated services are those providing latency-sensitive applications or very high-definition, high-bandwidth streaming. The vast majority of applications from startups (or other companies) would do just fine using best-efforts Internet, just as they do today. But for those companies wanting to innovate and provide a new service that requires low latency (or low jitter), these rules would actually enhance innovation, because they would enable the company to get the kind of delivery it needs to make its application work well.
In addition, while a differentiated pricing model is likely to raise the price of voluntarily chosen expedited delivery from the ISP, it wouldn’t necessarily raise the end-to-end price (taking alternatives such as CDNs and overlay networks into account) for Internet traffic generally; it is also likely to lower the price for many applications. The Internet isn’t a single, uniform network in which all fees are collected by the same firm; it is a federation of networks, many of which compete with each other for IP transport. There are multiple providers of expedited IP carriage today, and allowing ISPs to enter this market – offering a service that application or content providers voluntarily choose to purchase – increases consumer choice. Fees charged by ISPs to application providers are only part of the overall pricing schemes that pertain to the Internet; content delivery networks don’t give their services away for free.
Finally, for the reasons outlined above, the types of prioritization arrangements that would pass muster under a “commercially reasonable” test would be overwhelmingly welfare-maximizing, potentially unlocking totally new services. And again, any such service would likely have to be made available to competitors on similar terms.
7. The complaint process will be arduous – Some reports speculate that the enforcement process would be costly and difficult for regular consumers or small businesses to navigate. We do not yet know what kind of enforcement mechanism the Commission would put in place, but it is very unlikely that the complaint process would be burdensome or require a formal lawsuit. Instead, the Commission would probably handle complaints internally. Considering the political sensitivity of the issue, the Commission is likely to be very vigilant in policing these rules. Furthermore, under the sharp glare of the media spotlight, and with millions of consumers concerned about these issues, legitimate grievances will surface quickly and demand swift resolution.
8. Better to go back to Title II – Rather than accept innovation in the core network, opponents of the rule would like to ban it, returning broadband to Title II. But this ignores the fact that Title II services also allowed providers to purchase better-quality services, like Frame Relay. Moreover, classifying broadband as a common carrier service would be an extreme and unwise step backwards. Title II classification would necessitate abandoning inter-modal competition and force providers into an arcane rate-setting process, inevitably reducing investment in our networks and the potential for innovative new broadband technologies. There would be serious repercussions to classifying broadband as a common carrier, and the Chairman is right to avoid this route.
Not only that, but Title II would likely be worse at protecting consumers from anti-competitive practices than the proposed rules. As Wheeler points out in his blog, Title II “only bans ‘unjust and unreasonable discrimination’” – the Commission would have to show that a discrimination practice is “unjust and unreasonable” or gives “undue or unreasonable” preferences (47 U.S.C. 202). Wholesale Internet access would likely still be considered an information service under Title II, and the same sort of price discrimination and two-sided markets some fear would be allowed. The Commission would have less flexibility in crafting an appropriate balance of beneficial traffic management and consumer protection. Enforcement under this standard would likely be much more arduous than under the Commission’s new rules.
In short, the Internet is a content network today, but we need it to become a truly general-purpose network in the future. We should want an Internet that is as efficient at transferring files as it is at providing extra-low-delay packet transport to video callers, television channel surfers, telemedicine participants, gamers, and security service providers. The correct way to draft a non-discrimination rule, then, is to focus on discriminatory market practices, as Chairman Wheeler has in fact done, not on engineering practices that are discriminatory or not depending on one’s viewpoint on nuanced aspects of network design and operation.
The Commission’s proposed rules will likely allow for a much more subtle and tailored approach that can better protect consumers. At the very least, we should all calm down until we get the actual proposal. There will be plenty of time to work out the details in the coming months.