FCC Recognizes the Importance of Receivers

Early this week, the FCC held a workshop on receiver-oriented radio regulation that could mark a turning point in the way the agency looks at spectrum. By way of background, the traditional way the FCC (as well as its counterparts in other nations) regulates spectrum is by imposing rules and conditions on the signals a service is allowed to transmit. This approach has simplicity in its favor, as the nature of transmitted signals is easy to measure.

A transmit testbench simply measures the energy levels and signal characteristics the service emits and compares them to the regulations. If the signal stays within its permitted power limits at its assigned frequencies, and leaks little enough energy outside them, the service is considered lawful and everyone is happy. Well, everyone used to be happy. We’ve found that this approach only works when the transmitter rules correctly ensure that nearby services aren’t actually affected by the signal in question.
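To make the transmit-side test concrete, here is a minimal sketch in Python of what such a compliance check amounts to. The band edges and power limits are hypothetical illustrations (loosely modeled on the MSS and GPS neighborhood), not actual FCC rules, and the measurement is a toy FFT-based power estimate rather than calibrated lab instrumentation.

```python
# A minimal sketch of a transmit-mask compliance check.
# All band edges and limits are hypothetical, not actual FCC rules.
import numpy as np

# Hypothetical emission mask: (low_MHz, high_MHz, max_power_dBm) per band.
EMISSION_MASK = [
    (1525.0, 1559.0, 30.0),   # assigned band: normal operating power allowed
    (1559.0, 1610.0, -70.0),  # neighboring band: emissions must be negligible
]

def band_power_dbm(samples, fs_hz, low_mhz, high_mhz):
    """Estimate the emitted power falling inside one frequency band."""
    spectrum = np.fft.rfft(samples)
    freqs_mhz = np.fft.rfftfreq(len(samples), d=1.0 / fs_hz) / 1e6
    in_band = (freqs_mhz >= low_mhz) & (freqs_mhz < high_mhz)
    power_watts = np.sum(np.abs(spectrum[in_band]) ** 2) / len(samples) ** 2
    return 10.0 * np.log10(power_watts / 1e-3 + 1e-30)  # watts -> dBm

def transmitter_is_lawful(samples, fs_hz):
    """Pass only if measured power is under the mask limit in every band."""
    return all(
        band_power_dbm(samples, fs_hz, lo, hi) <= limit
        for lo, hi, limit in EMISSION_MASK
    )
```

Notice what this test never asks: who the neighbors are, or how well their receivers reject out-of-band energy. If the transmitter stays under the mask, it passes.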

The problem we’ve come to appreciate, in cases where a traditional application such as mobile satellite services (MSS) is replaced by a new service such as LTE mobile broadband, is that radio interference has much less to do with what’s transmitted than with what’s received.

Nobody likes a noisy neighbor, and we have well-established systems for dealing with them. If the person in the next apartment is playing loud music after midnight, a call to the apartment manager will get the volume turned down. The acceptable noise level is assumed to be some number of decibels that would interfere with the average person’s ability to sleep without earplugs, as judged by the apartment manager’s own ears.

In radio systems, it can be difficult to determine this level, however. If your neighbors are deaf, you can crank up the volume and nobody complains. But what happens if your neighbor has especially sensitive hearing? You listen to music at a level that you believe won’t disturb most people, yet the apartment manager is at your door with an angry look in his eyes demanding you turn it down. If it turns out that your neighbor has a glass to the wall so he can eavesdrop on you, it might not take much volume to disturb his hearing.

As absurd as this sounds, a similar case was recently decided by the FCC in which the complaining parties (certain GPS companies) were actually eavesdropping on spectrum they weren’t supposed to be receiving. The consequences are potentially very severe for the transmitting party, LightSquared, because the decision marks the difference between a potentially viable business and a multi-million dollar bankruptcy.

The discourse on this case uses the term “interference,” but the term is misleading. LTE and similar mobile broadband systems are operated worldwide without any credible complaints about “interference” to neighboring services. Any LTE service should be allowed to transmit at essentially the same signal level wherever it’s located in the frequency chart, regardless of who its neighbors are and how acute their hearing is, and every non-LTE service should be tolerant of a neighboring LTE service.

This hasn’t always been the case, and it won’t always be the case in the future, but for the next ten years it’s a pretty good bet. So how do we achieve this through regulation?

It’s not easy. Regulating transmitters is straightforward, which is why it’s the current norm. The problem with receiver regulation in the abstract is that the number of potential neighboring services is practically unlimited, so it’s impossible to construct a fully general compliance test. The practical approach is to list the kinds of services that might exist in a neighboring frequency or physical space and construct a test of a given receiver’s ability to function in each scenario. What we’re measuring, then, isn’t transmit power; it’s the resilience of a receiver to a particular noise environment.
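In code, the shape of such a receiver test might look like the following sketch. The scenario list, the toy link model, and the 90% pass threshold are all hypothetical stand-ins for whatever a real certification suite would specify; a lab version would drive actual hardware with calibrated signal generators.

```python
# A minimal sketch of receiver-resilience testing: enumerate plausible
# neighbor scenarios, inject each one, and verify the receiver still works.
# All scenarios, thresholds, and the link model are hypothetical.
import numpy as np

# Hypothetical neighbor scenarios: (label, interferer power in dB
# relative to the desired signal at the antenna).
SCENARIOS = [
    ("adjacent-band LTE base station", 40.0),
    ("adjacent-band LTE handset uplink", 30.0),
    ("strong terrestrial transmitter next door", 80.0),
]

def packet_success_rate(desired_snr_db, interferer_db):
    """Toy link model: degrade SNR by whatever interference leaks past
    an assumed 50 dB of receiver front-end filtering."""
    FILTER_REJECTION_DB = 50.0
    leaked_db = max(interferer_db - FILTER_REJECTION_DB, 0.0)
    effective_snr_db = desired_snr_db - leaked_db
    # Crude SNR-to-success mapping, for illustration only.
    return float(np.clip(effective_snr_db / 20.0, 0.0, 1.0))

def receiver_is_resilient(desired_snr_db=20.0, pass_rate=0.9):
    """The receiver passes only if every scenario stays above pass_rate."""
    for label, interferer_db in SCENARIOS:
        rate = packet_success_rate(desired_snr_db, interferer_db)
        print(f"{label}: success rate {rate:.2f}")
        if rate < pass_rate:
            return False
    return True
```

With the assumed 50 dB of front-end rejection, this toy receiver passes the first two scenarios but fails the strong-transmitter case; a receiver designed with that scenario on its list from day one would budget more rejection. That’s the point of the approach: the compliance question becomes whether the receiver tolerates its plausible neighbors, not whether the transmitter is quiet enough for the most sensitive eavesdropper imaginable.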

It turns out digital system engineers are very good at this sort of thing. Every networking standard I’ve contributed to, from the mid-‘80s to the present, has started with just such a list of potential sources of energy. This may be one of the key differences between the way computer networks are designed and the way older analog radio systems were designed.

The FCC’s workshop covered two full days of technical and policy discussion about how to commit this notion to regulation, and the webcasts are well worth watching if you’d like to understand how detailed this can get. You can see the webcasts here (Day One and Day Two), or you can wait for the next blog post on this issue.

About the author

Richard Bennett is an ITIF Senior Research Fellow specializing in broadband networking and Internet policy. He has a 30-year background in network engineering and standards. He was vice-chair of the IEEE 802.3 task group that devised the original Ethernet over Twisted Pair standard and has contributed to Wi-Fi standards for fifteen years. He was active in OSI, was the instigator of RFC 1001, and founded, along with Bob Metcalfe, the Open Token Foundation, the first network industry alliance to operate an interoperability lab. He has worked for leading applied research labs, where portions of his work were underwritten by DARPA. Richard is also the inventor of four networking patents and a member of the BITAG Technical Working Group.