Sparks Will Fly at Spectrum Incentive Auctions Hearing

Cold Dead Hands

Tomorrow the House Energy and Commerce Committee will hold a hearing with all five FCC commissioners to examine the upcoming spectrum incentive auction. The committee memo on the hearing says the two main issues to be examined are “unlicensed spectrum and bidder eligibility,” two areas of perpetual friction between the Committee’s Republican leadership and the Democratic majority at the FCC. The auction was authorized by the 2012 Public Safety and Spectrum Act, so the Commission is required to abide by its conditions, and the Committee isn’t at all convinced that the FCC has its heart in the right place.

The two key issues are among the most contentious in U.S. spectrum policy because they concern, on the one hand, the first-order allocation of civilian spectrum between free, unlicensed uses and fee-based, licensed ones, and on the other, limits on licensed spectrum holdings. The structure of the auction will therefore determine how much of the spectrum relinquished by TV broadcasters goes to licensed uses and who can buy the licenses.

The FCC is under pressure to increase allocations for unlicensed use beyond the dictates of the Act. There’s a strong contingent of White Spaces advocates, firm believers in the notion that innovation depends far more on unlicensed spectrum than on licensed, who look forward to a vast new expanse of creative opportunity if firms can build large-scale public networks without paying license fees.

The idea is appealing: if a network service provider can spend all of its money on equipment and personnel, it can in principle build a better system than it can by dividing its funds among equipment, personnel, and license fees. On the other hand, there is strong reason to question whether firms will spend much on infrastructure if they must perpetually contend for spectrum access, second by second, with other firms. The business case for White Spaces is strongest in rural areas, where each network provider can be reasonably confident of exclusive use of the spectrum. There is also something to be said for using it to provide narrowband service for machine-to-machine applications, but only so long as the operator can assume reliable access. This niche was identified by the UK telecom regulator, Ofcom, and is now being exploited by former Ofcom staff in their new lives in the private sector.

The FCC has taken to describing White Spaces as a “Wi-Fi like use,” which is unfortunate because it doesn’t do justice to either Wi-Fi or White Spaces. Wi-Fi is a short-range local area network technology that’s entirely suitable for replacing an Ethernet cable in a home or office, while White Spaces aspires to fill a completely different niche: wide-area networks that provide machine-to-machine communication over narrow bands of spectrum that aren’t suitable for other purposes.

This concept holds promise for a set of applications that aren’t well served by other networks. It’s primarily a means of making productive use of spectrum that would otherwise go to waste as “guard bands” between licensed uses. The key insight behind White Spaces is that small bands of spectrum can be used at relatively low power levels, compared to licensed uses, for systems with modest bandwidth requirements. White Spaces is new, however, so it may come to serve other applications as well.
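As a rough illustration of why narrow bands at low power can still carry useful machine-to-machine traffic, consider the textbook Shannon capacity bound. The channel widths and signal-to-noise figures below are illustrative assumptions, not regulatory limits or measured values.

```latex
\[
  C = B \log_2\!\left(1 + \mathrm{SNR}\right)
\]
% A full 6 MHz TV channel at 15 dB SNR (about 31.6 in linear terms):
\[
  C \approx 6\,\mathrm{MHz} \times \log_2(1 + 31.6) \approx 30\ \mathrm{Mbit/s}
\]
% A 100 kHz sliver at a modest 0 dB SNR:
\[
  C = 100\,\mathrm{kHz} \times \log_2(1 + 1) = 100\ \mathrm{kbit/s}
\]
```

The point is simply that a telemetry link needing tens or hundreds of kilobits per second can live comfortably in slivers of spectrum and power budgets that would be useless to a broadband carrier.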

There was once a belief that White Spaces would provide a “third pipe” for home Internet connections, but that idea has pretty much fallen by the wayside as LTE has emerged as the potential third, fourth, fifth, and sixth pipe for residential broadband once the Big Four national cellular carriers complete their rural rollouts. But White Spaces is still a promising idea that doesn’t need to be oversold or over-allocated.

Meanwhile, it’s certainly the case that Wi-Fi is an important element of the broadband ecosystem, one that’s going to serve an increasing number of users in an increasing number of places. We currently experience Wi-Fi congestion in places like apartment buildings, stadiums, conference centers, and airports because Wi-Fi systems don’t coordinate well with each other. The solution to this problem isn’t allocating even more spectrum to Wi-Fi; it already has access to roughly 400 MHz, more than the Big Four cellular networks have.

The solution lies in better systems of technical coordination. Part of this requires a means of restricting Wi-Fi use to modern systems that use spectrum more efficiently than the legacy Wi-Fi systems of the 1990s. The White Spaces database can fill a role here by providing a means to restrict access to systems with the appropriate level of technology. There’s no need for more spectrum for Wi-Fi; there’s a need for less waste.
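To make the coordination idea concrete, here is a minimal sketch, in Python, of how a geolocation database might gate channel access by location and device class. The grid cells, device classes, channels, and power limits are invented for illustration; they are not the FCC database’s actual schema, rules, or API.

```python
# Illustrative sketch of a White Spaces-style geolocation lookup.
# All data and rules below are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class ChannelGrant:
    channel: int          # TV channel number left vacant in this area
    max_eirp_dbm: float   # permitted transmit power for this device class

# Toy availability table keyed by a coarse location grid cell.
AVAILABILITY = {
    "grid-1042": {  # hypothetical rural cell: several vacant channels
        "fixed":    [ChannelGrant(21, 36.0), ChannelGrant(27, 36.0)],
        "portable": [ChannelGrant(21, 20.0)],
    },
    "grid-0007": {  # hypothetical urban cell: little or nothing free
        "fixed":    [],
        "portable": [],
    },
}

MODERN_DEVICE_CLASSES = {"fixed", "portable"}  # legacy gear gets nothing

def channels_for(grid_cell: str, device_class: str) -> list[ChannelGrant]:
    """Return the channels a device may use at this location, or an empty list.

    The policy point: because the database is a gatekeeper, it can refuse
    service to device classes that don't meet a minimum efficiency bar.
    """
    if device_class not in MODERN_DEVICE_CLASSES:
        return []  # e.g., spectrum-wasting legacy radios are excluded
    return AVAILABILITY.get(grid_cell, {}).get(device_class, [])

if __name__ == "__main__":
    print(channels_for("grid-1042", "fixed"))     # rural fixed link: 2 channels
    print(channels_for("grid-0007", "portable"))  # urban device: nothing free
    print(channels_for("grid-1042", "legacy"))    # old gear: refused outright
```

The design point is that a database-driven gatekeeper can enforce a minimum bar of technical efficiency as a condition of access, something a purely contention-based regime cannot do.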

The question of caps on carriers’ spectrum holdings is a hard one to resolve. On the one hand, policy makers want to create the conditions for a robust, competitive marketplace for cellular and mobile broadband service by ensuring that the Big Four and their regional complements have enough spectrum to provide reasonable service. On the other, there’s no denying that the firms that invest most heavily in networks and advanced handsets are going to win more customers, so restricting the spectrum available to the most successful competitors penalizes the very behavior we want to encourage: investment, technology upgrades, and low prices.

Competition policy is in fact focused on these three things, so it can’t be sensible to artificially restrict the spectrum available to Verizon and AT&T in the name of consumer welfare to such an extent that consumers suffer poor performance and high prices because of spectrum scarcity. This is a question of balance, and the metrics that count are measurable indexes of consumer welfare. That analysis suggests that spectrum screens in local markets are the key to allocation among firms, with a secondary market and a roaming market to deal with fluctuations in usage.
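For a sense of how a market-by-market screen might work in practice, here is a minimal sketch assuming a hypothetical one-third threshold; the threshold, band total, and holdings figures are illustrative assumptions, not the FCC’s actual screen.

```python
# Minimal sketch of a local-market spectrum screen.
# The one-third threshold and the market figures are hypothetical.

def screen_triggered(holdings_mhz: float, acquisition_mhz: float,
                     market_total_mhz: float, threshold: float = 1 / 3) -> bool:
    """Return True if the post-transaction share exceeds the screen.

    A screen is not a hard cap: exceeding it flags the deal for closer
    competitive review rather than blocking it outright.
    """
    post_share = (holdings_mhz + acquisition_mhz) / market_total_mhz
    return post_share > threshold

# Hypothetical market: 580 MHz deemed suitable for mobile broadband.
# A carrier holding 150 MHz bids on incentive-auction licenses.
if __name__ == "__main__":
    print(screen_triggered(150, 60, 580))   # 210/580 ~ 0.36 -> True, flagged
    print(screen_triggered(150, 30, 580))   # 180/580 ~ 0.31 -> False, clears
```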

This hearing takes place under a cloud. Its focus is the various ways the FCC can divvy up the market for civilian spectrum, but it ignores the fact that something like 60% of the spectrum that could be used for mobile broadband, Wi-Fi, and White Spaces is currently in government hands. We had hoped that the PCAST report on spectrum use by the federal government would put us on the path to the kind of “Digital Dividend” we enjoyed when TV was converted from analog to digital broadcasting. The digital conversion gave us better picture quality and more programming in half the spectrum footprint of the analog TV system, and created the opportunity for White Spaces and 4G networks in the vacated spaces.

There is every reason to believe a similar “Digital Dividend” is possible when federal spectrum systems are upgraded and rationalized, a process that would certainly move a number of government systems onto commercial networks. The Defense Department is fighting the second digital dividend because it has its own ideas about how to use the airwaves, of course. Unfortunately, the PCAST advisors buckled under DoD pressure and gave the Pentagon the report it wanted instead of giving us civilians the one we needed.

So the battle for civilian spectrum will continue to rage, and tomorrow’s hearing will be intense. But we can’t help but wonder how much better this discussion would go if we were able to pry a sizable amount of spectrum from the cold, arthritic fingers of the Department of Defense, to use a figure of speech, instead of fighting over the table scraps that fall from the Pentagon’s spectrum banquet.


About the author

Richard Bennett is an ITIF Senior Research Fellow specializing in broadband networking and Internet policy. He has a 30-year background in network engineering and standards. He was vice-chair of the IEEE 802.3 task group that devised the original Ethernet over Twisted Pair standard, and has contributed to Wi-Fi standards for fifteen years. He was active in OSI, the instigator of RFC 1001, and founder, along with Bob Metcalfe, of the Open Token Foundation, the first network industry alliance to operate an interoperability lab. He has worked for leading applied research labs, where portions of his work were underwritten by DARPA. Richard is also the inventor of four networking patents and a member of the BITAG Technical Working Group.