Recently, FCC Chairman Tom Wheeler gave a speech arguing that “A 25 Mbps connection is fast becoming ‘table stakes’ in 21st century communications,” with the implication that anything less than 25 Mbps is not really broadband.
This is an odd sort of statement, as it appears to be based not on any real analysis but simply on the Chairman’s opinion. He tried to provide some rationale for the number when he stated: “It’s not uncommon for a U.S. Internet connected household to have six or more connected devices – including televisions, desktops, laptops, tablets, and smartphones. When these devices are used at the same time, as they often are in the evenings, it’s not hard to overwhelm 10 Mbps of bandwidth.” I don’t know about you, but I am rarely using even two devices at once. And as the Census Bureau reports, the average U.S. household has 2.58 people, and the median is lower still. So the majority of households are not overwhelming 10 Mbps of bandwidth.
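To make the arithmetic behind this argument concrete, here is a minimal sketch that sums assumed per-device bandwidth demands against a 10 Mbps connection. The per-device rates are illustrative ballpark figures of my choosing, not numbers from the speech or the Census Bureau:

```python
# Hedged illustration: does a household's concurrent usage exceed 10 Mbps?
# Per-device rates below are assumed ballpark figures, not measurements.
HD_STREAM = 5.0   # Mbps, assumed HD video stream
SD_STREAM = 2.0   # Mbps, assumed SD video stream
BROWSING = 0.5    # Mbps, assumed web browsing / email session

def total_demand(active_devices):
    """Sum the assumed bandwidth of devices that are active at the same time."""
    return sum(active_devices)

# A two- or three-person household: one HD stream plus one browsing session.
typical = total_demand([HD_STREAM, BROWSING])
print(f"Typical evening demand: {typical} Mbps")  # 5.5 Mbps, well under 10

# The Chairman's six-device scenario, with all six active simultaneously.
six = total_demand([HD_STREAM, HD_STREAM, SD_STREAM,
                    SD_STREAM, BROWSING, BROWSING])
print(f"Six concurrent devices: {six} Mbps")  # 15.0 Mbps, over 10
```

Under these assumptions, the 10 Mbps ceiling is only breached when most of the six devices are pulling video at once, which is the crux of the disagreement: how often that actually happens in a 2.58-person household.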
So, if sub-25 Mbps Internet connections are not really broadband, what does this mean in terms of what nations have
The FCC released an important new report Tuesday, Measuring Broadband America, which shows how actual broadband speeds compare to advertising claims. You can read the report and download the data the FCC collected here. The report is the result of a year of work by the FCC, its contractor SamKnows, and a diverse group of people from industry, universities, and public interest advocacy. It follows a year after a quick snapshot of broadband speeds conducted during the development of the National Broadband Plan that used a different (and inferior) methodology. This report is significant because it’s both comprehensive and rigorous, as I said at the release event at Best Buy in Washington on Tuesday.
It’s also significant because the methodology was hammered out by the stakeholder group and the raw data is public, including the source code for the measurement devices. The system was developed in public, the data is public, and the code is public. There can’t be any legitimate doubt as to the accuracy and reliability of the data, certainly not by people in Washington who were free to work with the
It was fun, in a wonky but not a theatrical sense, listening to Tuesday’s FCC meeting on Universal Service and Intercarrier Compensation Reform. All five Commissioners eloquently described the need for reform. The framework, which came out of the National Broadband Plan (which borrowed much from the work over the last decade by many others, including Chairmen Kennard, Powell and Martin) is commendable. It’s progress that concepts, such as using reverse auctions or setting up a cap-ex fund, once controversial, are garnering a consensus. While I haven’t read the NPRM, I know the staff filled in many of the details the Plan did not cover and made adjustments as the math merited; both to their credit. It’s a big step forward.
But like the dog in the Sherlock Holmes story whose failure to bark was the critical clue, what didn’t happen was the most interesting part. I heard what each Commissioner thought was wrong, but I didn’t hear how they would each make the necessary trade-offs or prioritize addressing different needs. To illustrate my point, take Commissioner McDowell’s statement that we have to shrink the size of the fund. Given
While many from the FCC are headed to Vegas to see the latest, coolest devices displayed at the Consumer Electronics Show, I am heading to wintry New York for a conference of state PUC officials. I’m going to talk about the National Broadband Plan and universal service fund (USF) and intercarrier compensation (ICC) reform. Oddly, I think I will have more fun. I guess there is no accounting for taste.
Most of what I will say relates to what we analyzed through the planning process: that USF and ICC are the two largest revenue streams controlled by the FCC affecting the broadband ecosystem, that the logic and financial underpinnings of both are rapidly breaking down, and that neither is designed to fill the critical gaps in that ecosystem. To make matters worse, the most prevalent idea in broadband policy – that the primary metric by which a nation’s broadband policy should be judged is the speed of the wireline network to the most rural of residents – is both profoundly wrong and badly affects the way many think about USF and ICC. That idea, applied as many propose, would lead us backwards, slowing down
Susan Crawford rings in the New Year in the Yale Law and Policy Review with an article (The Looming Cable Monopoly) that illustrates a prime reason that it’s nearly impossible to have a discussion about Internet policy in the United States without a food-fight breaking out. Crawford tells us, unequivocally, that a cable company takeover of the Internet is imminent, according to no less an authority than the National Broadband Plan. The Plan doesn’t say this, of course, but Crawford twists it into a pretzel to make it seem so. First she cites some text from the National Broadband Plan that speculates about one possible broadband future and finds a dark cloud around the silver lining of higher speeds:
Analysts project that within a few years, approximately 90% of the population is likely to have access to broadband networks capable of peak download speeds in excess of 50 Mbps as cable systems upgrade to DOCSIS 3.0. About 15% of the population is likely to be able to choose between two robust high-speed service services [sic]—cable with DOCSIS 3.0 and upgraded services from telephone companies offering
After five years of bickering, the FCC passed an Open Internet Report & Order on a partisan 3-2 vote this week. The order is meant to guarantee that the Internet of the future will be just as free and open as the Internet of the past. Its success depends on how fast the Commission can transform itself from an old school telecom regulator wired to resist change into an innovation stimulator embracing opportunity. One thing we can be sure about is that the order hasn’t tamped down the hyperbole that’s fueled the fight to control the Internet’s constituent parts for all these years.
Advocates of net neutrality professed deep disappointment that the FCC’s rules weren’t more proscriptive and severe. Free Press called the order “fake net neutrality,” Public Knowledge said it “fell far short,” Media Access Project called it “inadequate and riddled with loopholes,” and New America Foundation accused the FCC of “caving to telecom lobbyists.” These were their official statements to the press; their Tweets were even harsher.
Free marketers were almost as angry: Cato denounced the order as “speech
The Washington Post’s lead gadget writer, Rob Pegoraro, graced us with the benefit of his expertise yesterday in a column on the FCC’s Open Internet order (“FCC votes for a half-measure on net neutrality”). In short, he’s not happy. His FCC column is actually more closely related to the frustrations expressed in a preceding post reviewing the video calling service provided by a game controller, the Xbox’s Video Kinect, than he apparently realizes.
Pegoraro fails to find satisfaction with video calls over the Kinect:
The major disappointment here was the horrendous quality of the video, considering that I had about 5 million bits per second of upstream bandwidth at each end of the test. The footage looked unmistakably pixelated.
He notes that fellow gamers experience video problems as well. He might try video Skype as an alternative, but alas, that allegedly peer-to-peer service is down today due to a system-wide software problem with the “super-nodes” that run its directory service. When Skype’s on vacation and Kinect’s too pixelated to be of any use, the would-be video caller might fall back on the Cisco Umi system built on