
MIT Research Asks, Just How Fast Is U.S. Broadband?

In the race for Internet speed, the United States could definitely be called a slacker. According to data from the Organisation for Economic Co-operation and Development, in average advertised broadband download speed this country falls somewhere between those little-heralded hotbeds of high tech, Italy and Greece. And whereas the United States average is about 15 megabits per second (Mbps), Korea offers 53 Mbps, France delivers 55 Mbps, and Japan tops the list with a whopping 108 Mbps.

Even so, the advertised 15 Mbps for the United States seems dazzling compared to actual user experience. These words are being written on a system tapped into "premium" AT&T DSL that is currently delivering a download speed of about 5 Mbps. So it probably surprised few experts when the Federal Communications Commission released its National Broadband Plan, which reported that "the actual download speed experienced on broadband connections in American households is approximately 40-50% of the advertised 'up to' speed to which they subscribe."

However, a new study by Massachusetts Institute of Technology (MIT) researchers calls that finding into question. Their point: the common methods for measuring Internet data rates underestimate the speed of the "access network"--the part of the Internet that Internet service providers control. Those measurements are affected, the researchers wrote, by several variables, including the number of devices accessing a home wireless network, the internal settings of a home computer, and the location of the test servers sending data to that computer.

The research came out of the MIT Internet Traffic Analysis Study (MITAS), a new project at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). The project is getting input and financial support from a number of Internet service providers (ISPs) serving customers both in the United States and abroad. The researchers expect to publish papers, develop a traffic data repository, and hold workshops and other outreach activities to help others understand broadband traffic growth as it bears on issues such as traffic management, network investment, and architecture.

The researchers have submitted their first report to both the FCC and the Federal Trade Commission and will present a version of it at the Telecommunications Policy Research Conference at George Mason University School of Law in Arlington, VA, in October.

"If you are doing measurements, and you want to look at data to support whatever your policy position is, these are the things that you need to be careful of," said Steve Bauer, the technical lead on MITAS. "For me, the point of the paper is to improve the understanding of the data that's informing those processes."

The researchers analyzed six different systems for measuring the speed of Internet connections, from free applications on popular Web sites to commercial software licensed by most major ISPs. In each case, the underestimation of the access network's speed had a different cause. The study that the FCC relied upon, for instance, analyzed data for broadband subscribers with different "tiers of service": subscribers paid differing fees for differing data rates. But the analysts didn't know which data corresponded to which tier of service, so they assumed that the subscription tier could be inferred from the maximum measured rate. The MITAS researchers show that, in fact, subscribers in lower tiers sometimes ended up getting higher data rates than they had paid for. In the study cited by the FCC, exceptionally good service for a low tier may therefore have been misclassified as exceptionally bad service for a higher tier.
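
To see how that inference can skew the results, consider a minimal sketch (in Python, with invented tier speeds and measurements, not figures from the study):

```python
# Hypothetical illustration of the tier-inference problem described above.
# Advertised tiers in Mbps -- invented values, not the study's actual tiers.
ADVERTISED_TIERS = [3, 6, 15]

def infer_tier(max_measured_mbps):
    """Guess the subscription tier from the highest rate ever measured --
    the assumption the MITAS researchers question."""
    for tier in ADVERTISED_TIERS:
        if max_measured_mbps <= tier:
            return tier
    return ADVERTISED_TIERS[-1]

# A 6 Mbps subscriber whose line briefly delivered better-than-advertised speed.
actual_tier = 6
max_measured = 7.2       # one unusually good measurement
typical_measured = 5.8   # typical performance, close to advertised

inferred = infer_tier(max_measured)            # -> assigned to the 15 Mbps tier
apparent = typical_measured / inferred         # -> ~39% of "advertised" speed
true_ratio = typical_measured / actual_tier    # -> ~97% of advertised speed

print(f"inferred tier {inferred} Mbps: apparent ratio {apparent:.0%}")
print(f"actual tier {actual_tier} Mbps: true ratio {true_ratio:.0%}")
```

In this made-up case, one lucky measurement pushes the subscriber into a higher inferred tier, and ordinary performance then looks like severe underdelivery.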

In another case, Bauer was running a popular speed test on his own computer. Much of the time he was getting rates close to those advertised by his ISP, but one afternoon the rate fell dramatically. For days, the test had been pairing Bauer's computer in Cambridge with a test server in New York. But on the afternoon in question, the New York server was overburdened with other requests--likely a temporary condition--so it redirected Bauer to the nearest free server it could find: one in Amsterdam. The long sequence of links, including a transatlantic link, between his computer and the test server probably explains the difference in data rates, Bauer said.
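
One likely mechanism, sketched here with assumed round-trip times and a common default TCP window (none of these numbers come from the article), is that a single TCP connection cannot move data faster than its window size divided by the round-trip time, so a distant test server caps the rate the test can observe:

```python
# Back-of-the-envelope: one TCP connection's throughput is bounded by
# window_size / round_trip_time, no matter how fast the access link is.
# RTT values below are rough assumptions for illustration only.

WINDOW_BYTES = 64 * 1024  # a common default TCP receive window (64 KB)

def max_throughput_mbps(rtt_seconds):
    return (WINDOW_BYTES * 8) / rtt_seconds / 1_000_000

rtt_nearby = 0.020    # ~20 ms, roughly Cambridge to a New York test server
rtt_overseas = 0.100  # ~100 ms, roughly Cambridge to an Amsterdam test server

print(f"nearby server cap:   {max_throughput_mbps(rtt_nearby):.1f} Mbps")   # ~26 Mbps
print(f"overseas server cap: {max_throughput_mbps(rtt_overseas):.1f} Mbps") # ~5 Mbps
```

Under those assumptions, simply being routed to a faraway server can make the same access line appear several times slower.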

This points, the researchers said, to the difficulty of using a single data rate to characterize a broadband network's performance. "If you're watching lots of movies, you're concerned about how much data you can transfer in a month and that your connection goes fast enough to keep up with the movie for a couple hours," said William Lehr, an economist who is part of the research effort. "If you're playing a game, you care about transferring small amounts of traffic very quickly. Those two kinds of users need different ways of measuring a network."
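
A rough worked example, using assumed bit rates and packet sizes rather than figures from the study, shows why a single number can't serve both users:

```python
# Illustration only: the movie watcher cares about sustained throughput and
# monthly volume; the gamer cares about delay for tiny transfers.
# All numbers are assumptions chosen for the sake of the example.

# Movie watcher: a 2-hour stream at an assumed 5 Mbps.
stream_mbps, hours = 5, 2
gb_per_movie = stream_mbps * 1_000_000 * hours * 3600 / 8 / 1e9
print(f"one movie: ~{gb_per_movie:.1f} GB; 30 movies a month: ~{30 * gb_per_movie:.0f} GB")

# Gamer: a 200-byte game update even on a 1 Mbps link takes only ~1.6 ms to send;
# what matters is the round-trip delay to the game server, not raw bandwidth.
update_bytes, link_mbps = 200, 1
transmit_ms = update_bytes * 8 / (link_mbps * 1_000_000) * 1000
print(f"transmit time for a {update_bytes}-byte update at {link_mbps} Mbps: {transmit_ms:.1f} ms")
```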

The FCC hasn't yet uttered its last word on the average broadband speed currently being delivered in the United States. The agency is working on a new study to measure broadband speeds in 10,000 homes, using dedicated hardware that bypasses confounding factors such as TCP settings or the limited capacity of home wireless networks. The agency's hope, according to a spokesman, is to obtain better data by which to formulate policy and plans.

The full MIT report can be downloaded as a PDF here.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
