Captive Policy

Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age, Susan Crawford’s long-awaited tome on the Comcast/NBC merger, will be officially released tomorrow in what may be the publishing industry’s biggest anticlimax of the year. The book has an engaging, novelistic style, offering a gritty description of the atmosphere and the players at a Senate Antitrust Subcommittee hearing on the merger in early 2010, but the gripping drama of the Congressional hearing doesn’t excuse Crawford’s shallow analysis of its subject matter.

Beyond the lively exposition of the hearing, the book relies on a series of anecdotes and strained historical analogies – and precious little hard data – to make a case for Crawford’s pet policy prescription, “broadband unbundling.” Unbundling would establish a new status quo, reducing broadband service providers to the smallest possible role in communications services: that of a wiring-plant maintainer lacking the power to move a single bit on its own. This would put both the financial and the technical bases of broadband in a severe bind, of course. Experiments with unbundling have shown that it reduces consumer prices only temporarily, at the expense of ongoing investment in upgrades to network wiring and electronics. Deferred investment creates a series of deferred crises that can only be resolved by repeated taxpayer intervention on an uncertain schedule dictated by political factors.

The book illustrates the pitfalls inherent in communicating through ink and paper in the fast-moving broadband world: Since the book was committed to print, the Europeans that Crawford holds out as models of the benefits of unbundling are reported to be raising prices for wholesale broadband to stimulate investment in faster networks in order to catch up with the U.S.:

Only 2 percent of the households in the European Union have access to broadband download speeds of 100 megabits per second or greater. In the United States, by comparison, at least 50 million homes, or nearly half, are connected to networks with speeds of at least 100 megabits per second. By 2020, the commission’s goal is for half of E.U. households to have access to this service. Meanwhile, only half of E.U. households have service at 30 megabits per second. By 2020, the commission wants all E.U. households to have this as an option.

This comes as no surprise, given that European telecom operators sought to sweeten the 2012 WCIT treaty with fee-capturing features to stem their financial losses. At the urging of equipment vendors, Europe is also considering the outright abandonment of unbundling in favor of the facilities-based competition policy we have in the United States:

In the latest interview, [Alcatel-Lucent CEO Ben] Verwaayen said a number of factors would need to change to enable Europe to get up to speed with the U.S. and other regions in terms of telecom technology…

“We need to bring competition to where it really matters and that’s in choice and at the moment the regulator decides my choice and I don’t think that’s great,” he said…

He added that Europe lacked an ability to harness competition and talent and this was leaving it in the slow lane.

One of the issues the Europeans have discovered is the difficulty of deploying next-generation 100 Mbps Vectored DSL over unbundled wires:

[Deutsche Telekom] has lost billions on expansion into Greece, just wrote off eight billion from T-Mobile USA, and has huge losses, easily five billion and probably more, expanding into computer services (T-Systems). DT is demanding an end to unbundling, claiming it’s technically impossible to vector if some lines are unbundled.

Vectored DSL raises issues similar to the ones that arise for unbundled cable: bit transmission is highly dependent on the behavior of adjacent channels or wire pairs in a cable sheath, so inserting a service firewall slows access for everyone. Competition between networks bound to particular cable plants (DSL vs. cable DOCSIS) is feasible, but sharing a cable plant among multiple ISPs requires a very high degree of technical and regulatory coordination, which becomes increasingly infeasible as DSL and cable technologies, riding Moore’s Law, grow more sophisticated. As with wireless, we can have sharing or we can have high performance, but we can’t have both.
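To make the coordination problem concrete, here’s a toy sketch in Python/NumPy (not a line model; the binder size, channel matrix, and coupling values are invented for illustration) of why vectoring needs every pair in the binder under a single coordinator: crosstalk from a pair outside the vectoring group can’t be precoded away, so it lands as noise on the coordinated lines.

```python
# Toy illustration of vectored DSL coordination (illustrative numbers only).
# Vectoring cancels far-end crosstalk (FEXT) by jointly precoding the signals
# on all pairs in a binder. A pair driven by a different, unbundled operator
# can't be included in the precoder, so its crosstalk into the vectored pairs
# remains uncancelled.
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 4                                   # copper pairs in one binder
H = 0.15 * rng.standard_normal((n_pairs, n_pairs))
np.fill_diagonal(H, 1.0)                      # direct channels ~ unity gain

def residual_crosstalk(vectored):
    """Average crosstalk power hitting the vectored lines when only the
    pairs listed in `vectored` are jointly precoded."""
    idx = list(vectored)
    P = np.zeros((n_pairs, n_pairs))
    # Zero-forcing precoder over the coordinated sub-binder only.
    P[np.ix_(idx, idx)] = np.linalg.inv(H[np.ix_(idx, idx)])
    # Uncoordinated pairs just transmit directly.
    for k in range(n_pairs):
        if k not in idx:
            P[k, k] = 1.0
    effective = H @ P                         # end-to-end channel after precoding
    crosstalk = effective - np.diag(np.diag(effective))
    return float(np.mean(np.abs(crosstalk[idx, :]) ** 2))

print("all 4 pairs vectored:         ", residual_crosstalk(range(4)))
print("1 pair unbundled, 3 vectored: ", residual_crosstalk(range(3)))
```

Run it and the fully coordinated case shows essentially zero residual crosstalk, while leaving one pair outside the group leaves its coupling into the other three intact, which is exactly the situation unbundled lines create for a vectoring operator.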

Crawford argues that cable has achieved a monopoly position over DSL by virtue of higher speed, but also claims that cable isn’t fast enough. She’d like to see each American home equipped with fiber optics providing 1 Gigabit/second in both the upload and download directions, something Google doesn’t even offer in Kansas City:

In most of Comcast’s market territories, it was the only high-speed access provider selling services at speeds that would be sufficient to satisfy Americans’ requirements in the near future. But the access Comcast sold was less useful than it could have been because the network had been designed to be contested among users in the same neighborhood (making speeds unreliable) and favored passive consuming uses (downloads) far more than active uploads. Meanwhile, the service that all Americans would need within five years (truly high-speed Internet access ranging from 100 Mbps, or megabits per second, to gigabit speeds over fiber-optic lines), the service that would allow symmetrical (same-speed) uploads and downloads and extensive use of online streaming video for a host of educational, medical, and economic purposes, was routinely available in other countries but could not be purchased at all in most parts of the United States. [Crawford, Susan P. (2013-01-08). Captive Audience (Kindle Locations 50-57). Yale University Press. Kindle Edition.]

Crawford is clearly incorrect in asserting that the service she wants to see is “routinely available in other countries,” of course. Even worse than the wishful thinking is her studied refusal to name the actual applications that symmetrical gigabit service would enable but that can’t run on today’s networks.

There’s clearly no shortage of streaming video on the Internet: most studies find that video streaming accounts for roughly half of American Internet traffic during prime time. We’ve had gigabit networks in offices, homes, and campuses for nearly ten years, and we have yet to see a new application too demanding for a common 100 Mbps network but satisfied by a gigabit connection.

In real life, the requirements of video streaming are many times more modest than Crawford imagines: Netflix reports that its average streaming rate over Google’s gigabit Kansas City network is a meager 2.5 Mbps, a figure that’s well within the capability of all but the poorest DSL connections and no problem at all for Verizon’s FiOS, cable’s DOCSIS, AT&T’s and CenturyLink’s VDSL2, or even 4G LTE wireless.
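The arithmetic is easy to check. The sketch below (Python; the tier speeds are illustrative round numbers, not measurements of any particular provider) counts how many simultaneous streams at Netflix’s reported 2.5 Mbps average each class of connection could carry.

```python
# Back-of-the-envelope stream counts at Netflix's reported 2.5 Mbps average.
# Tier speeds are illustrative round numbers, not provider measurements.
STREAM_MBPS = 2.5

tiers_mbps = {
    "basic DSL": 6,
    "VDSL2": 25,
    "cable DOCSIS / FiOS": 100,
    "gigabit fiber": 1000,
}

for name, speed in tiers_mbps.items():
    streams = int(speed // STREAM_MBPS)
    print(f"{name:>20}: ~{streams} simultaneous 2.5 Mbps streams")
```

Even the slowest tier handles a couple of simultaneous streams at that rate; a 100 Mbps connection handles about forty.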

Consumers in most parts of America can in fact purchase 100 Mbps service over FiOS and cable today, and should be able to buy it from Vectored DSL services in fairly short order, but they’re still bewildered about what they’re going to do with it. The likely answers will come from corners that aren’t too hard to imagine, such as high-resolution multiparty video calls, virtual classrooms, and even holographic conferencing. But these aren’t pressing concerns today and they won’t be for many years to come; they’re certainly unlikely to jump to the top of consumer concerns within five years, but if they do, we have the technology to address them on the networks that are currently installed.

American broadband policy does need to be continually reviewed, updated, and revised. The rates of computer ownership and literacy are still too low, consumer awareness of the benefits of broadband is still too narrowly distributed, broadband service in rural areas is still too spotty, and it may be too difficult for upstart broadband suppliers to obtain access to rights of way (although metro Ethernet doesn’t seem to have much of a problem with it).

Policies that were considered, tried, and discarded fifteen to twenty years ago won’t alleviate these problems, however, and by now Crawford should know that. It’s particularly disappointing that Crawford refuses to acknowledge America’s leadership in the adoption of next-generation LTE wireless. She sneers at mobile as an inferior substitute for cable without giving any credit to the benefits of mobile computing. It’s as if she doesn’t want her foreordained conclusion perturbed by mere facts.

About the author

Richard Bennett is an ITIF Senior Research Fellow specializing in broadband networking and Internet policy. He has a 30-year background in network engineering and standards. He was vice-chair of the IEEE 802.3 task group that devised the original Ethernet over Twisted Pair standard, and has contributed to Wi-Fi standards for fifteen years. He was active in OSI, the instigator of RFC 1001, and founder, along with Bob Metcalfe, of the Open Token Foundation, the first network industry alliance to operate an interoperability lab. He has worked for leading applied research labs, where portions of his work were underwritten by DARPA. Richard is also the inventor of four networking patents and a member of the BITAG Technical Working Group.
  • Michael Elling (http://www.ivpcapital.com/blog)

    Richard, what if you are both wrong? Perhaps what we’re analyzing is a 20- or 50-year struggle between vertical integration and horizontal scaling. I think her horizontal model and approach are half-baked. She needs to at least develop it across the lower, middle, and upper layers. The best examples of horizontal winning over vertical are TCP vs. OSI, Ethernet in Layer 2, and 802.11 vs. LTE, when viewed from a functionality and performance/price perspective. Unfortunately, people are confused by semantics and inherent biases toward vertical integration.

    The latter actually does not work, especially in networks, because vertical integration does not efficiently clear rapidly obsoleting supply across constantly shifting and growing demand. Google Fiber should prove the value of scaling at every vertical layer and horizontal boundary point so that pricing reflects marginal cost.

    What is not apparent to many, because it is the least obvious and visible, is the settlement, control, and security systems in the middle layers. Google may or may not get it right, as they have a bill-and-keep mentality at both the top and bottom layers. Balanced settlements, from upper to lower layers and across networks, are critical to new service introduction and rapid scaling. Bill-and-keep is a recipe for stagnation and monopoly/incumbent control. Unfortunately, both the service provider and Internet incumbents, and increasingly the regulators, are keen on the latter.