Federal Communications Commission FCC 20-188
APPENDIX D
Ninth
Measuring Broadband America
Fixed Broadband Report
A Report on Consumer Fixed Broadband Performance
in the United States
TABLE OF CONTENTS
Ninth Measuring Broadband America Fixed Broadband Report Federal Communications Commission
List of Charts
Chart 1: Weighted average advertised download speed among the top 80% service tiers offered by each ISP .... 11
Chart 2: Weighted average advertised download speed among the top 80% service tiers based on technology .... 12
Chart 3: Consumer migration to higher advertised download speeds .... 13
Chart 4: The ratio of weighted median speed (download and upload) to advertised speed for each ISP. Note Verizon advertises a speed range for both its download and upload DSL tier and hence appears as a range in this and other charts .... 14
Chart 5: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed .... 15
Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed .... 16
Chart 7: Latency by ISP .... 17
Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% and 1%, or greater than 1% .... 18
Chart 9: Average webpage download time, by advertised download speed .... 19
Chart 10: Weighted average advertised upload speed among the top 80% service tiers offered by each ISP .... 25
Chart 11: Weighted average advertised upload speed among the top 80% service tiers based on technology .... 26
Chart 12.1: The ratio of median download speed to advertised download speed .... 26
Chart 12.2: The ratio of median upload speed to advertised upload speed .... 27
Chart 13: The percentage of consumers whose median upload speed was (a) greater than 95%, (b) between 80% and 95%, or (c) less than 80% of the advertised upload speed .... 28
Chart 14.1: Complementary cumulative distribution of the ratio of median download speed to advertised download speed .... 29
Chart 14.2: Complementary cumulative distribution of the ratio of median download speed to advertised download speed (continued) .... 29
Chart 14.3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by technology .... 30
Chart 14.4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed .... 31
Chart 14.5: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed (continued) .... 31
Chart 14.6: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed, by technology .... 32
Chart 15.1: The ratio of weighted median download speed to advertised download speed, peak hours versus off-peak hours .... 32
Chart 15.2: The ratio of weighted median upload speed to advertised upload speed, peak versus off-peak .... 33
Chart 16: The ratio of median download speed to advertised download speed, Monday-to-Friday, two-hour time blocks .... 34
Chart 17.1: The ratio of 80/80 consistent upload speed to advertised upload speed .... 35
Chart 17.2: The ratio of 70/70 consistent download speed to advertised download speed .... 35
Chart 17.3: The ratio of 70/70 consistent upload speed to advertised upload speed .... 36
Chart 18: Latency for Terrestrial ISPs, by technology and by advertised download speed .... 36
Chart 19.1: The ratio of median download speed to advertised download speed, by ISP (0-5 Mbps) .... 37
Chart 19.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps) .... 38
Chart 19.3: The ratio of median download speed to advertised download speed, by ISP (12-20 Mbps) .... 38
Chart 19.4: The ratio of median download speed to advertised download speed, by ISP (25-30 Mbps) .... 39
Chart 19.5: The ratio of median download speed to advertised download speed, by ISP (40-50 Mbps) .... 39
Chart 19.6: The ratio of median download speed to advertised download speed, by ISP (60-75 Mbps) .... 40
Chart 19.7: The ratio of median download speed to advertised download speed, by ISP (100-150 Mbps) .... 40
Chart 19.8: The ratio of median download speed to advertised download speed, by ISP (200-300 Mbps) .... 41
Chart 20.1: The ratio of median upload speed to advertised upload speed, by ISP (0.384-0.768 Mbps) .... 41
Chart 20.2: The ratio of median upload speed to advertised upload speed, by ISP (0.896-1.5 Mbps) .... 42
Chart 20.3: The ratio of median upload speed to advertised upload speed, by ISP (2-5 Mbps) .... 42
Chart 20.4: The ratio of median upload speed to advertised upload speed, by ISP (10-20 Mbps) .... 43
Chart 20.5: The ratio of median upload speed to advertised upload speed, by ISP (30-75 Mbps) .... 43
Chart 20.6: The ratio of median upload speed to advertised upload speed, by ISP (100-150 Mbps) .... 44
Chart 21.1: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed, by service tier (DSL) .... 47
Chart 21.2: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed (cable) .... 48
Chart 21.3: The percentage of consumers whose median download speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised download speed (fiber) .... 49
Chart 22.1: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed (DSL) .... 50
Chart 22.2: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed (cable) .... 51
Chart 22.3: The percentage of consumers whose median upload speed was greater than 95%, between 80% and 95%, or less than 80% of the advertised upload speed (fiber) .... 52
Chart 23.1: Average webpage download time, by ISP (1-5 Mbps) .... 54
Chart 23.2: Average webpage download time, by ISP (6-10 Mbps) .... 55
Chart 23.3: Average webpage download time, by ISP (12-20 Mbps) .... 55
Chart 23.4: Average webpage download time, by ISP (25-30 Mbps) .... 56
Chart 23.5: Average webpage download time, by ISP (40-50 Mbps) .... 56
Chart 23.6: Average webpage download time, by ISP (60-75 Mbps) .... 57
Chart 23.7: Average webpage download time, by ISP (100-150 Mbps) .... 57
Chart 23.8: Average webpage download time, by ISP (200-300 Mbps) .... 58
List of Tables
Table 1: The most popular advertised service tiers .... 10
Table 2: Peak-period median download speed, by ISP .... 48
Table 3: Complementary cumulative distribution of the ratio of median download speed to advertised download speed, by ISP .... 55
Table 4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload speed, by ISP .... 56
1. Executive Summary
The Ninth Measuring Broadband America Fixed Broadband Report (“Ninth Report” or “Report”) contains
validated data collected in September and October 20181 from fixed Internet Service Providers (ISPs) as
part of the Federal Communications Commission's (FCC) Measuring Broadband America (MBA) program.
This program is an ongoing, rigorous, nationwide study of consumer broadband performance in the
United States. The goal of this program is to measure the network performance delivered on selected
service tiers to a representative sample set of the population. Thousands of volunteer panelists are drawn
from subscribers of Internet Service Providers serving over 80% of the residential marketplace.2
The initial Measuring Broadband America Fixed Broadband Report was published in August 2011,3 and
presented the first broad-scale study of directly measured consumer broadband performance throughout
the United States. As part of an open data program, all methodologies used in the program are fully
documented, and all data collected is published for public use without any restrictions. Including this
current Report, nine reports have now been issued.4 These reports provide a snapshot of fixed broadband
Internet access service performance in the United States. These reports present analysis of broadband
information in a variety of ways and have evolved to make the information more understandable and
useful, as well as to reflect the evolving applications supported by the nation's broadband infrastructure.
C. MAJOR FINDINGS OF THE NINTH REPORT
The key findings of this report are:
• The maximum advertised download speeds among the service tiers offered by ISPs and measured by the FCC ranged from 24 Mbps to 1 Gbps for the period covered by this report.
• The weighted average advertised speed of the participating ISPs was 123.3 Mbps, representing a 96%
increase from the previous year.
• For most of the major broadband providers that were tested, measured download speeds were 100% of or better than advertised speeds during the peak hours (7 p.m. to 11 p.m. local time).
1 The actual dates used for measurements for this Ninth Report were September 25 – October 25, 2018 (inclusive).
2 At the request of, and with the assistance of, the State of Hawaii Department of Commerce and Consumer Affairs (DCCA), the State of Hawaii was added to the MBA program in 2017. The ISPs whose performance was measured in the State of Hawaii were Hawaiian Telcom and Oceanic Time Warner Cable (which is now a part of Charter Spectrum).
3 All reports can be found at [Link]
4 The First Report (2011) was based on measurements taken in March 2011, the Second Report (2012) on measurements taken in April 2012, and the Third (2013) through Eighth (2018) Reports on measurements taken in September of the year prior to the reports' release dates. To avoid confusion between a report's release date and its measurement dates, last year we shifted to numbering the reports. Thus, this year's report is termed the Ninth MBA Report instead of the 2019 MBA Report. Going forward, we will continue with the numbered approach, and the next report will be termed the Tenth Report.
• Eleven ISPs were evaluated in this report. Of these, AT&T, Cincinnati Bell, Frontier, and Verizon employed multiple broadband technologies across the USA. Overall, 14 different ISP/technology configurations were evaluated in this report; ten performed at or better than their advertised speed, and only one delivered less than 90% of its advertised download speed.
• In addition to providing download and upload speed measurements of ISPs, this report also provides a measure of how consistently ISPs deliver their advertised speeds, using our "80/80" metric. The 80/80 metric measures the percentage of the advertised speed that at least 80% of subscribers experience at least 80% of the time over peak periods. Ten of the 14 ISP/technology configurations provided better than 70% of advertised speed to at least 80% of panelists for at least 80% of the time.
These and other findings are described in greater detail within this report.
D. SPEED PERFORMANCE METRICS
Speed (both download and upload) performance continues to be one of the key metrics reported by the
MBA. The data presented includes ISP broadband performance as a median5 of speeds experienced by
panelists within a specific service tier. These reports mainly focus on common service tiers used by an
ISP’s subscribers.6
Additionally, consistent with previous Reports, we also compute ISP performance by weighting the
median speed for each service tier by the number of subscribers in that tier. Similarly, in calculating the
overall average speed of all ISPs in a specific year, the median speed of each ISP is used and weighted by
the number of subscribers of that ISP as a fraction of the total number of subscribers across all ISPs.
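As a concrete illustration, the tier weighting described above can be sketched as follows. The tier speeds and subscriber counts in this sketch are invented for illustration and are not MBA data.

```python
# Sketch of the weighting described above: each service tier's median
# speed is weighted by that tier's share of the ISP's subscribers.
# Tier speeds and subscriber counts below are illustrative only.

def weighted_median_speed(tiers):
    """tiers: list of (median_speed_mbps, subscriber_count) pairs."""
    total_subs = sum(subs for _, subs in tiers)
    return sum(speed * subs / total_subs for speed, subs in tiers)

# Hypothetical ISP with three surveyed tiers:
tiers = [(98.0, 50_000), (196.0, 30_000), (480.0, 20_000)]
print(round(weighted_median_speed(tiers), 1))  # → 203.8
```

The same weighting is applied a second time at the all-ISP level, with each ISP's median weighted by its share of total subscribers.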
In calculating these weighted medians, we have drawn on two sources for determining the number of
subscribers per service tier. ISPs may voluntarily contribute their data per surveyed service tier as the
most recent and authoritative data. Many ISPs have chosen to do so.7 When such information has not
been provided by an ISP, we instead rely on the FCC’s Form 477 data.8 All facilities-based broadband
providers are required to file data with the FCC twice a year (Form 477) regarding deployment of
5 We first determine the mean value over all the measurements for each individual panelist's "whitebox." (Panelists are sent "whiteboxes" that run pre-installed software on off-the-shelf routers that measure thirteen broadband performance metrics, including download speed, upload speed, and latency.) Then, for each ISP's speed tiers, we choose the median of the set of mean values for all the panelists/whiteboxes. The median is the value separating the top half of values in a sample set from the lower half; it can be thought of as the middle (i.e., most typical) value in an ordered list of values. For calculations involving multiple speed tiers, we compute the weighted average of the medians for each tier, with weightings based on the relative subscriber numbers of the individual tiers.
6 Only tiers that contribute to the top 80% of an ISP's total subscribership are included in this report.
7 The ISPs that provided SamKnows, the FCC's contractor supporting the MBA program, with weights for each of their tiers were: Cincinnati Bell, CenturyLink, Charter, Comcast, Cox, Frontier, Hawaiian Telcom, Optimum, and Verizon.
8 For an explanation of Form 477 filing requirements and required data, see: [Link] (Last accessed 5/2/2018).
broadband services, including subscriber counts. For this report, we used the June 2018 Form 477 data.
It should be noted that the Form 477 subscriber data values are for a month that generally lags the
reporting month, and therefore, there are likely to be small inaccuracies in the tier ratios. It is for this
reason that we encourage ISPs to provide us with subscriber numbers for the measurement month.
As in our previous reports, we found that for most ISPs the actual speeds experienced by subscribers
either nearly met or exceeded advertised service tier speeds. However, since we started our MBA
program, consumers have changed their Internet usage habits. In 2011, consumers mainly browsed the
web and downloaded files; thus, we reported average broadband speeds since these average speeds were
likely to closely mirror user satisfaction. By contrast, in September-October 2018 (the measurement
period for this report) consumer internet usage had become dominated by video consumption, with
consumers regularly streaming video for entertainment and education.9 Both the median measured
speed and consistency in service are likely to influence the perception and usefulness of Internet access
service. Therefore, our network performance analytics have been expanded to better capture this.
Specifically, we use two kinds of metrics to reflect the consistency of service delivered to the consumer:
First, we report the percentage of advertised speed experienced by at least 80% of panelists during at
least 80% of the daily peak usage period (“80/80 consistent speed” measure). Second, we show the
fraction of consumers who obtain median speeds greater than 95%, between 80% and 95%, and less than
80% of advertised speeds.
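A minimal sketch of the 80/80 consistent speed measure follows. It assumes the metric is computed as a 20th percentile taken first within each panelist's peak-hour samples and then across panelists; the program's exact procedure is defined in its Technical Appendix, and the sample data here are invented.

```python
# Sketch of the "80/80 consistent speed" measure: the speed that at
# least 80% of panelists attain at least 80% of the time during peak
# hours, expressed as a fraction of the advertised speed.
# The nearest-rank percentile below is one of several conventions.

def pct20(values):
    """20th percentile via a simple nearest-rank rule (an assumption)."""
    s = sorted(values)
    idx = max(0, int(0.2 * len(s)) - 1)
    return s[idx]

def consistent_80_80(panelist_samples, advertised_mbps):
    """panelist_samples: per-panelist lists of peak-hour speed tests (Mbps)."""
    # Speed each panelist reaches at least 80% of the time:
    per_panelist = [pct20(samples) for samples in panelist_samples]
    # Speed that at least 80% of panelists reach:
    return pct20(per_panelist) / advertised_mbps

samples = [
    [95, 97, 99, 96, 98],   # consistent panelist
    [60, 92, 94, 95, 93],   # one slow peak-hour test
    [88, 90, 91, 89, 92],
]
print(round(consistent_80_80(samples, 100.0), 2))  # → 0.6
```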
E. USE OF OTHER PERFORMANCE METRICS
Although download and upload speeds remain the network performance metrics of greatest interest to consumers, we also spotlight two other key network performance metrics in this report: latency and
packet loss. These metrics can significantly affect the overall quality of Internet applications.
Latency is the time it takes for a data packet to travel across a network from one point on the network to
another. High latencies may affect the perceived quality of some interactive services such as phone calls
over the Internet, video chat and video conferencing, or online multiplayer games. All network access
technologies have a minimum latency that is largely determined by the technology. In addition, network
congestion will lead to an increase in measured latency. Technology-dependent latencies are typically
small for terrestrial broadband services and are thus unlikely to affect the perceived quality of
applications. Additionally, for certain applications the user experience is not necessarily affected by high
latencies. As an example, when using entertainment video streaming applications, because the data can
be cached prior to display, the user experience is likely to be unaffected by relatively high latencies.
Packet loss measures the fraction of data packets sent that fail to be delivered to the intended destination.
Packet loss may affect the perceived quality of applications that do not request retransmission of lost
packets, such as phone calls over the Internet, video chat, some online multiplayer games, and some video
streaming. High packet loss also degrades the achievable throughput of download and streaming
applications. However, packet loss of a few tenths of a percent are unlikely to significantly affect the
9 The sum of all forms of IP video, which includes Internet video, IP video-on-demand (VoD), video files exchanged through file sharing, video-streamed gaming, and video conferencing, will continue to be in the range of 80 to 90 percent of total IP traffic. Globally, IP video traffic will account for 82 percent of traffic by 2022. See Cisco Visual Networking Index: Forecast and Methodology, 2017-2022 White Paper, [Link] [Link] (Last accessed Dec. 12, 2019).
perceived quality of most Internet applications and are common. During network congestion, both
latency and packet loss typically increase.
The Internet is continuing to evolve in its architectures, performances, and services. Accordingly, we will
continue to adapt our measurement and analysis methodologies to help consumers understand the
performance characteristics of their broadband Internet access service, and thus make informed choices
about their use of such services.
*Tiers that lack sufficient panelists to meet the program’s target sample size.
** Although Verizon Fiber's 940/880 Mbps service tier was among the top 80% of Verizon's offered tiers by subscription numbers, it is not included in the report charts because technical procedures for measuring speeds near 1 Gbps and above have not yet been established for the MBA program.
Chart 1 (below) displays the weighted (by subscriber numbers) mean of the top 80% advertised download
speed tiers for each participating ISP for September-October 2018 as well as September 2017, grouped
by the access technology used to offer the broadband Internet access service (DSL, cable or fiber). In
September-October 2018, the weighted average advertised download speed was 123.3 Mbps among the measured ISPs, which represents a 96% increase over the September 2017 average of 62.9 Mbps.
Chart 1: Weighted average advertised download speed among the top 80% service tiers offered by each
ISP
Among participating broadband ISPs, only AT&T IPBB10, Cincinnati Bell, Hawaiian Telcom fiber, Frontier, and Verizon use fiber as the access technology for a substantial number of their customers, with maximum speed offerings ranging from 150 Mbps to 1 Gbps. A key difference between the fiber providers and providers using other technologies is that, with the exception of Cincinnati Bell, fiber providers generally advertise symmetric upload and download speeds. This is in sharp contrast to the asymmetric offerings of all the other technologies, where advertised upload speeds are typically 5 to 10 times lower than advertised download speeds.
It should be noted that there is also considerable difference in the weighted average advertised speeds offered across technologies. Chart 2 plots the weighted average of the top 80% ISP tiers by technology for both September 2017 and September-October 2018. As the chart shows, all technologies showed increases in advertised download speeds. For the September-October 2018 period, the weighted mean advertised speed for DSL technology was 50 Mbps, which lagged considerably behind the weighted mean advertised download speeds for cable and fiber technologies, at 139 Mbps and 251 Mbps respectively. Fiber technology showed the greatest increase in speed offerings in 2018 compared to 2017, with its weighted mean going up from 70 Mbps to 251 Mbps, a 258% increase. In comparison, DSL and cable technologies showed 96% and 64% increases from 2017 to 2018.
10 Although AT&T IPBB has been characterized here as a DSL technology, it actually includes a mix of ADSL2+, VDSL2, [Link] and Ethernet technologies delivered over a hybrid of fiber optic and copper facilities.
Chart 2: Weighted average advertised download speed among the top 80% service tiers based on
technology.
Chart 3 plots the migration of panelists to a higher service tier based on their access technology.11
Specifically, the horizontal axis of Chart 3 partitions the September 2017 panelists by the advertised
download speed of the service tier to which they were subscribed. For each such set of panelists who
also participated in the September-October 2018 collection of data,12 the vertical axis of Chart 3 displays
the percentage of panelists that migrated by September-October 2018 to a service tier with a higher
advertised download speed. There are two ways that such a migration could occur: (1) if a panelist
changed their broadband plan during the intervening year to a service tier with a higher advertised
download speed, or (2) if a panelist did not change their broadband plan but the panelist’s ISP increased
the advertised download speed of the panelist’s subscribed plan.13
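The migration percentage plotted in Chart 3 can be sketched as a simple per-panelist comparison. The panelist records below are invented for illustration.

```python
# Sketch of the migration calculation described above: for panelists
# present in both collection periods, count the share whose advertised
# download speed was higher in 2018 than in 2017. As in the report, no
# attempt is made to distinguish a plan change from an ISP-initiated
# speed increase; both appear simply as a higher advertised speed.

def migration_rate(panelists):
    """panelists: (advertised_2017_mbps, advertised_2018_mbps) pairs for
    panelists who participated in both periods."""
    moved = sum(1 for s17, s18 in panelists if s18 > s17)
    return moved / len(panelists)

# Hypothetical returning panelists:
panelists = [(50, 100), (100, 100), (25, 25), (50, 75)]
print(migration_rate(panelists))  # → 0.5
```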
Chart 3 shows that the percentage of panelists subscribed in September 2017 who moved to higher tiers in September-October 2018 was between 3% and 67% for DSL subscribers, 22% to 100% for cable
11 Where several technologies are plotted at the same point in the chart, this is identified as "Multiple Technologies."
12 Of the 4,545 panelists who participated in the September 2017 collection of data, 4,355 panelists continued to participate in the September-October 2018 collection of data.
13 We do not attempt here to distinguish between these two cases.
subscribers, and 8% to 80% for fiber subscribers. In addition, 1% to 13% of subscribers migrated to a higher speed tier using a different technology from what they had in September 2017.
Chart 4: The ratio of weighted median speed (download and upload) to advertised speed for each ISP. Note
Verizon advertises a speed range for both its download and upload DSL tier and hence appears as
a range in this and other charts.
C. VARIATIONS IN SPEEDS
As discussed earlier, actual speeds experienced by individual consumers may vary by location and time of
day. Chart 5 shows, for each ISP, the percentage of panelists who experienced a median download speed
(averaged over the peak usage period during our measurement period) that was greater than 95%,
between 80% and 95%, or less than 80% of the advertised download speed.
Chart 5: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed
ISPs using DSL technology had between 2% and 73% of their subscribers receiving at least 95% of their advertised download speeds during peak hours. ISPs using cable technology and fiber technology had between 79% and 94%, and between 69% and 98%, respectively, of their subscribers receiving at least 95% of their advertised download speeds.
Though the median download speeds experienced by most ISPs’ subscribers nearly met or exceeded the
advertised download speeds, there are some customers of each ISP for whom the median download
speed fell significantly short of the advertised download speed. Relatively few subscribers of cable or
fiber broadband service experienced this. The best-performing ISPs, when measured by this metric, are Charter, Comcast, Cox, Mediacom, Frontier-Fiber and Verizon-Fiber; more than 80% of their panelists attained an actual median download speed of at least 95% of the advertised download speed.
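The three-way classification used in Chart 5 can be sketched as follows. The handling of measurements that fall exactly on the 80% and 95% boundaries is an assumption here, and the sample ratios are invented.

```python
# Sketch of the Chart 5 classification: each panelist's peak-period
# median download speed is compared to the advertised speed and binned
# as >95%, 80-95%, or <80% of advertised. Boundary handling at exactly
# 80% and 95% is a convention chosen for this sketch.

def classify(median_mbps, advertised_mbps):
    ratio = median_mbps / advertised_mbps
    if ratio > 0.95:
        return ">95%"
    elif ratio >= 0.80:
        return "80-95%"
    return "<80%"

# Hypothetical (median, advertised) pairs in Mbps:
pairs = [(98, 100), (85, 100), (60, 100)]
print([classify(m, a) for m, a in pairs])  # → ['>95%', '80-95%', '<80%']
```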
In addition to variations based on a subscriber’s location, speeds experienced by a consumer may
fluctuate during the day. This is typically caused by increased traffic demand and the resulting stress on
different parts of the network infrastructure. To examine this aspect of performance, we use the term
“80/80 consistent speed.” This metric is designed to assess temporal and spatial variations in measured
values of a user’s download speed.14 While consistency of speed is in itself an intrinsically valuable service
characteristic, its impact on consumers will hinge on variations in usage patterns and needs. As an
example, a good consistency of speed measure is likely to indicate a higher quality of service experience
for internet users consuming video content.
Chart 6 summarizes, for each ISP, the ratio of 80/80 consistent median download speed to advertised
download speed, and, for comparison, the ratio of median download speed to advertised download speed
14 For a detailed definition and discussion of this metric, please refer to the Technical Appendix.
shown previously in Chart 4. The ratio of 80/80 consistent median download speed to advertised
download speed is less than the ratio of median download speed to advertised download speed for all
participating ISPs due to congestion periods when median download speeds are lower than the overall
average. When the difference between the two ratios is small, the median download speed is fairly
insensitive to both geography and time. When the difference between the two ratios is large, there is a
greater variability in median download speed, either across a set of different locations or across different
times during the peak usage period at the same location.
Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed.
Customers of Charter, Comcast, Cox, Mediacom, Optimum, Frontier Fiber and Verizon Fiber (FiOS)
experienced median download speeds that were very consistent; i.e., they provided greater than 90% of
the advertised speed during peak usage period to more than 80% of panelists for more than 80% of the
time. As can be seen in Chart 6, except for AT&T-IPBB, cable and fiber ISPs performed better than DSL
ISPs with respect to their 80/80 consistent speeds. For example, for September-October 2018, the 80/80
consistent download speed for Cincinnati Bell DSL was 54% of the advertised speed.
D. LATENCY
Latency is the time it takes for a data packet to travel from one point to another in a network. It has a
fixed component that depends on the distance, the transmission speed, and transmission technology
between the source and destination, and a variable component that increases as the network path
congests with traffic. The MBA program measures latency by measuring the round-trip time from the
consumer’s home to the closest measurement server and back.
Chart 7 shows the median latency for each participating ISP. In general, higher-speed service tiers have
lower latency, as it takes less time to transmit each packet. The median latencies ranged from 9.5 ms to
36 ms in our measurements (with the exception of Verizon DSL, which had a median latency of 42 ms).
DSL latencies (between 24 ms and 42 ms) were slightly higher than those for cable (15 ms to 27 ms). Fiber
ISPs showed the lowest latencies (10 ms to 15 ms). The differences in median latencies among terrestrial-
based broadband services are relatively small and are unlikely to affect the perceived quality of highly
interactive applications.
E. PACKET LOSS
Packet loss is the percentage of packets that are sent by a source but not received at the intended
destination. The most common causes of packet loss are high latency and congestion encountered along
the network route. A small amount of packet loss is expected; indeed, some Internet protocols use packet
loss to infer congestion and adjust their sending rate to mitigate it. The MBA program considers a packet
lost if the packet's round-trip latency exceeds 3 seconds.
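Under that rule, computing a loss percentage from a list of probe results is straightforward. A minimal sketch; representing a missing reply as `None` is an assumption of this example, not part of the MBA specification:

```python
def packet_loss_pct(rtts_ms, timeout_ms=3000.0):
    """Percentage of probes counted as lost: a probe with no reply
    (represented here as None) or a round-trip time above the
    timeout (3 seconds in the MBA program) is treated as lost."""
    lost = sum(1 for rtt in rtts_ms if rtt is None or rtt > timeout_ms)
    return 100.0 * lost / len(rtts_ms)
```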
Chart 8 shows the average peak-period packet loss for each participating ISP, grouped into bins. We have
broken packet loss into three bands, allowing a more granular view of each ISP network's performance.
The breakpoints for the three bins were chosen with an eye towards balancing commonly accepted packet
loss standards and provider packet loss
Service Level Agreements (SLAs). Specifically, 1% packet loss is commonly accepted as the point at which
highly interactive applications such as VoIP experience significant degradation in quality, according to
international standards documents.15 The 0.4% breakpoint was chosen as a generic threshold between the
highly desirable performance of 0% packet loss described in many documents and the 1% limit considered
unacceptable on the high side. The specific value of 0.4% is a compromise between those two limits and is
generally supported by the SLAs of many major ISPs for network performance. Indeed, most SLAs
guarantee packet loss of 0.1% to 0.3%,16 but these are generally for enterprise-level services, which have
more stringent performance requirements.
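The three-band classification can be written directly. Whether each boundary value (exactly 0.4% or 1%) falls in the lower or upper band is not specified in the text, so the inclusive/exclusive choices below are assumptions:

```python
def loss_band(loss_pct):
    """Classify peak-period packet loss into the report's three bands.
    Boundary handling (exactly 0.4% and 1%) is an assumption."""
    if loss_pct < 0.4:
        return "less than 0.4%"
    if loss_pct <= 1.0:
        return "0.4% to 1%"
    return "greater than 1%"
```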
15
See: [Link] and [Link]
16
See: [Link]
Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% and
1%, or greater than 1%.
Chart 8 shows that ISPs using fiber technology have the lowest packet loss, and that ISPs using DSL
technology tend to have the highest. Within a given technology class, packet loss also varies among ISPs.
F. WEB BROWSING PERFORMANCE
The MBA program also conducts a specific test to gauge web browsing performance. The web browsing
test accesses nine popular websites that include text and images, but not streaming video. The time
required to download a webpage depends on many factors, including the consumer’s in-home network,
the download speed within an ISP’s network, the web server’s speed, congestion in other networks
outside the consumer’s ISP’s network (if any), and the time required to look up the network address of
the webserver. Only some of these factors are under control of the consumer’s ISP. Chart 9 displays the
average webpage download time as a function of the advertised download speed. As shown by this chart,
webpage download time decreases as download speed increases, from about 9.3 seconds at 1.5 Mbps
download speed to about 1.4-1.7 seconds at a 30 Mbps download speed. Subscribers to service tiers
exceeding 25 Mbps see only slightly shorter webpage download times, decreasing to 1.1 seconds at 300
Mbps. These download times assume that only a single user is using the Internet connection when the
webpage is downloaded; they do not account for the more common scenario in which multiple users within
a household simultaneously use the connection for viewing web pages as well as for other applications
such as real-time gaming or video streaming.
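A single-URL timing sketch conveys the idea of such a test, though the actual MBA web browsing test is richer: it loads nine popular pages, including their text and images, from the measurement client. The function below times only one HTTP request and is an illustration, not the MBA tool.

```python
import time
import urllib.request

def page_download_seconds(url, timeout=30.0):
    """Time one HTTP fetch of `url`, reading the full response body.
    Unlike the MBA test, this does not fetch embedded resources."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include body transfer time, not just headers
    return time.monotonic() - start
```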
3. Methodology
A. PARTICIPANTS
Eleven ISPs participated in the Fixed MBA program in September-October 2018.17 They were:
• CenturyLink
• Charter Communications
• Cincinnati Bell
• Comcast
• Cox Communications
• Frontier Communications Company
• Hawaiian Telcom
• Mediacom Communications Corporation
• Optimum
• Verizon
• Windstream Communications
The methodologies and assumptions underlying the measurements described in this Report are reviewed
at meetings that are open to all interested parties and documented in public ex parte letters filed in
GN Docket No. 12-264. Policy decisions regarding the MBA program were discussed at these meetings
prior to adoption, and involved issues such as inclusion of tiers, test periods, mitigation of operational
issues affecting the measurement infrastructure, and terms-of-use notifications to panelists. Participation
in the MBA program is open and voluntary. Participants include members of academia, consumer
equipment vendors, telecommunications vendors, network service providers, and consumer policy groups,
as well as our contractor for this project, SamKnows. In 2018-2019, participants at these meetings
(collectively and informally referred to as “the broadband collaborative”), included all eleven participating
ISPs and the following additional organizations:
• Level 3 Communications (“Level 3”), now part of CenturyLink
• Massachusetts Institute of Technology (“MIT”)
• Measurement Lab (M-Lab)
• NCTA – The Internet & Television Association (“NCTA”)
• New America Foundation
• Princeton University
• United States Telecom Association (“US Telecom”)
• University of California - Santa Cruz
17
Both AT&T and Hughes Network Systems left the program as participating ISPs this year, bringing the total number
of participating ISPs to eleven. We continued to evaluate the AT&T tiers that had sufficient numbers of panelists
even though AT&T did not participate this year, so the total number of ISPs evaluated in this report is twelve. As
reported in the Eighth Report (the previous year's report), Viasat, operating under the brand name Exede Internet,
left the program as a participating ISP and consequently no longer provides panelists with an increased data
allowance to offset the data used by the MBA measurements. We nevertheless continue to report raw data results
for Viasat/Exede and Hughes Network Systems tiers by using lightweight tests aimed at reducing the data burden
on these panelists. These tests are described in greater detail in the accompanying Technical Appendix to this Ninth
MBA Report.
Participants have contributed in important ways to the integrity of this program and have provided
valuable input to FCC decisions for this program. Initial proposals for test metrics and testing platforms
were discussed and critiqued within the broadband collaborative. M-Lab and Level 3 contributed their
core network testing infrastructure, and both parties continue to provide invaluable assistance in helping
to define and implement the FCC testing platform. We thank all the participants for their continued
contributions to the MBA program.
B. MEASUREMENT PROCESS
The measurements that provided the underlying data for this report were conducted between MBA
measurement clients and MBA measurement servers. The measurement clients (i.e., whiteboxes) were
situated in the homes of 5,855 panelists, each of whom received service from one of the 12 evaluated ISPs.
The evaluated ISPs collectively accounted for over 80% of U.S. residential broadband Internet
connections. After the measurement data was processed (as described in greater detail in the Technical
Appendix), test results from 3,192 panelists were used in this report.
The measurement servers used by the MBA program were hosted by M-Lab and Level 3 Communications,
and were located in eleven cities (often with multiple locations within each city) across the United States
near a point of interconnection between the ISP’s network and the network on which the measurement
server resided.
The measurement clients collected data throughout the year, and this data is available as described
below. However, only data collected from September 25 through October 25, 2018, referred to
throughout this report as the “September-October 2018” reporting period, were used to generate the
charts in this Report.18
Broadband performance varies with the time of day. At peak hours, more people tend to use their
broadband Internet connections, giving rise to a greater potential for network congestion and degraded
user performance. Unless otherwise stated, this Report focuses on performance during peak usage
period, which is defined as weeknights between 7:00 p.m. and 11:00 p.m. local time at the subscriber’s
location. Focusing on peak usage period provides the most useful information because it demonstrates
what performance users can expect when the Internet in their local area experiences the highest demand
from users.
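Filtering measurement samples to that window reduces to a timestamp test. A sketch, assuming "weeknights" means Monday through Friday evenings and that the 11:00 p.m. endpoint is exclusive (neither detail is stated in the report):

```python
from datetime import datetime

def in_peak_period(ts):
    """True if a local timestamp falls in the peak usage period:
    weeknights (assumed Monday-Friday), 7:00 p.m. up to but not
    including 11:00 p.m. local time."""
    return ts.weekday() < 5 and 19 <= ts.hour < 23
```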
Our methodology focuses on the network performance of each of the participating ISPs. The metrics
discussed in this Report are derived from active measurements, i.e., test-generated traffic flowing
between a measurement client, located within the modem/router within a panelist’s home, and a
measurement server, located outside the ISP’s network. For each panelist, the tests automatically choose
the measurement server that has the lowest latency to the measurement client. Thus, the metrics
measure performance along the path followed by the measurement traffic within each ISP’s network,
through a point of interconnection between the ISP’s network and the network on which the chosen
18
This time period avoids the dates in early September when parts of North Carolina and Florida were
affected by Hurricanes Florence and Michael. It also avoids the increased traffic resulting from the latest
iOS release, which took place in early September. Omitting dates during these periods is consistent with
the FCC’s data collection policy for fixed MBA data. See FCC, Measuring Fixed Broadband, Data Collection Policy,
[Link] (explaining that the FCC
has developed policies to deal with impairments in the data collection process with potential impact for the
validity of the data collected).
measurement server is located. However, the service performance that a consumer experiences could
differ from our measured values for several reasons.
First, as noted, in the course of each test instance we measure performance only to a single measurement
server rather than to multiple servers. This is consistent with the approach chosen by most network
measurement tools. As a point of comparison, the average web page may load its content from multiple
endpoints.
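The lowest-latency server selection described above reduces to taking a minimum over candidate servers; the dictionary shape here is an illustrative assumption.

```python
def choose_server(latency_by_server):
    """Pick the measurement server with the lowest measured round-trip
    latency to the client, as the MBA tests do. `latency_by_server`
    maps a server identifier to its latency in milliseconds."""
    return min(latency_by_server, key=latency_by_server.get)
```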
In addition, bottlenecks or congestion points in the full path traversed by consumer application traffic
might also impact a consumer’s perception of Internet service performance. These bottlenecks may exist
at various points: within the ISP’s network, beyond its network (depending on the network topology
encountered en route to the traffic destination), in the consumer’s home, on the Wi-Fi used to access the
in-home access router, or from a shortfall of capacity at the far end point being accessed by the
application. The MBA tests explore how a service performs from the point at which a fixed ISP’s Internet
service is delivered to the home on fixed infrastructure (deliberately excluding Wi-Fi, due to the many
confounding factors associated with it) to the point at which the test servers are located. As MBA tests
are designed to focus on the access to the ISP’s network, they will not include phenomena at most
interconnection points or transit networks that consumer traffic may traverse.
To the extent possible,19 the MBA program focuses on performance within an ISP’s network. It should be
noted that the overall performance a consumer experiences can also be affected by congestion arising at
other points in the path taken by consumer traffic (e.g., in-home Wi-Fi, peering points, transit networks),
but such congestion is not reflected in MBA measurements.
A consumer’s home network, rather than the ISP’s network, may be the bottleneck with respect to
network congestion. We measure the performance of the ISP’s service delivered to the consumer’s home
network, but this service is often shared simultaneously among multiple users and applications within the
home. In-home networks, which typically include Wi-Fi, may not have sufficient capacities to support
peak loads.20
In addition, consumers’ experience of ISP performance is manifested through the set of applications they
utilize. The overall performance of an application depends not only on the network performance (i.e.,
raw speed, latency or packet loss) but also on the application’s architecture and implementation and on
the operating system and hardware on which it runs. While network performance is considered in this
Report, application performance is generally not.
19
The MBA program uses test servers that are both neutral (i.e., operated by third parties that are not ISP-operated
or owned) and located as close as practical, in terms of network topology, to the boundaries of the ISP networks
under study. As described earlier in this section, a maximum of two interconnection points and one transit network
may be on the test path. If there is congestion on such paths to the test server, it may impact the measurement,
but the cases where it does so are detectable by the test approach followed by the MBA program, which uses
consistent longitudinal measurements and comparisons with averaged results. Details of the methodology used in
the MBA program are given in the Technical Appendix to this report.
20
Independent research, drawing on the FCC’s MBA test platform (numerous instances of research supported by the
fixed MBA test platform are described at [Link]), suggests that
home networks are a significant source of end-to-end service congestion. See Srikanth Sundaresan et al., Home
Network or Access Link? Locating Last-Mile Downstream Throughput Bottlenecks, PAM 2016 - Passive and Active
Measurement Conference, at 111-123, March 2016.
21
The September-October 2018 data set was validated to remove anomalies that would have produced errors in the
Report. This data validation process is described in the Technical Appendix.
Both the Commission and SamKnows, the Commission’s contractor for this program, recognize that, while
the methodology descriptions included in this document provide an overview of the project, interested
parties may wish to contribute to the project by reviewing the software used in the testing. SamKnows
welcomes review of its software and technical platform, consistent with the Commission’s goals of
openness and transparency for this program.22
22
The software that was used for the MBA program will be made available for noncommercial purposes. To apply
for noncommercial review of the code, interested parties may contact SamKnows directly at team@[Link],
with the subject heading “Academic Code Review.”
4. Test Results
A. MOST POPULAR ADVERTISED SERVICE TIERS
Chart 1 above summarizes the weighted average of the advertised download speeds23 for each
participating ISP for September-October 2018 and September 2017, where the weighting is based upon
the number of subscribers to each tier, grouped by the access technology used to offer the broadband
Internet access service (DSL, cable, or fiber). Only the tiers constituting the top 80% of each ISP’s
subscriber base were included. Chart 10 below shows the corresponding weighted average of the
advertised upload speeds among the measured ISPs. The computed weighted average of the advertised
upload speed of all the ISPs is 27 Mbps, representing a 141% increase over the previous year’s value of 11 Mbps.
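The weighting described above (by subscriber counts, over the tiers making up the top 80% of each ISP's subscriber base) is a weighted mean. A sketch, with the tier selection assumed to have been done already; the pair layout is illustrative:

```python
def weighted_avg_speed(tiers):
    """Subscriber-weighted average advertised speed. `tiers` is a
    list of (advertised_mbps, subscriber_count) pairs, assumed to be
    pre-filtered to the tiers covering the top 80% of subscribers."""
    total_subs = sum(subs for _, subs in tiers)
    return sum(mbps * subs for mbps, subs in tiers) / total_subs
```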
Chart 10: Weighted average advertised upload speed among the top 80% service tiers offered by each ISP.
Chart 11 compares the weighted average of the advertised upload speeds by technology for both
September 2017 and September-October 2018. As can be seen in this chart, all technologies showed
increased rates in 2018 as compared to 2017, but the rates of increase differed: the weighted average for
fiber technology increased by 308%, compared to 22% for DSL and 31% for cable.
Observing both the download and upload speeds, it is clear that fiber service tiers are generally symmetric
in their actual upload and download speeds. This results from the fact that fiber technology has
significantly more capacity than other technologies and can be engineered to have symmetric upload
and download speeds. For other technologies with more limited capacity, higher capacity is usually
allocated to download speeds than to upload speeds, typically in ratios ranging from 5:1 to 10:1. This
resulting asymmetry in download/upload speeds is reflective of actual usage because consumers typically
download significantly more data than they upload.
23
Measured service tiers were tiers which constituted the top 80% of an ISP’s broadband subscriber base.
Chart 11: Weighted average advertised upload speed among the top 80% service tiers based on
technology.
24
In these charts, we show Verizon’s median speed as a percentage of the mid-point between their lower and upper
advertised speed range.
Chart 12.2 shows the median upload speed as a percentage of the advertised speed. As was the case with
download speeds, most ISPs met or exceeded their advertised rates, with the exception of several DSL
providers: CenturyLink, Cincinnati Bell DSL, Frontier DSL, Verizon DSL, and Windstream, which had
respective ratios of 88%, 85%, 96%, 91%, and 78%.
Chart 12.2: The ratio of median upload speed to advertised upload speed.
C. VARIATIONS IN SPEEDS
Median speeds experienced by consumers may vary based on location and time of day. Chart 5 above
showed, for each ISP, the percentage of consumers (across the ISP’s service territory) who experienced a
median download speed over the peak usage period that was either greater than 95%, between 80% and
95%, or less than 80% of the advertised download speed. Chart 13 below shows the corresponding
percentage of consumers whose median upload speed fell in each of these ranges. With the exception of
AT&T IPBB, ISPs using DSL technology had between 20% and 49% of their subscribers getting greater than
or equal to 95% of their advertised upload speeds during peak hours. ISPs using cable or fiber technology
had between 90% and 99% of their subscribers getting equal to or better than 95% of their advertised
upload speeds.
Chart 13: The percentage of consumers whose median upload speed was (a) greater than 95%, (b) between
80% and 95%, or (c) less than 80% of the advertised upload speed.
Though the median upload speeds experienced by most subscribers were close to or exceeded the
advertised upload speeds, there were, for each ISP, some subscribers whose median upload speed fell
significantly short of the advertised upload speed. This issue was most prevalent for ISPs using DSL
technology. On the other hand, ISPs using cable and fiber technology generally showed very good
consistency based on this metric.
We can learn more about the variation in network performance by separately examining variations across
geography and across time. We start by examining the variation across geography within each
participating ISP’s service territory. For each ISP, we first calculate the ratio of the median download
speed (over the peak usage period) to the advertised download speed for each panelist subscribing to
that ISP. We then examine the distribution of this ratio across the ISP’s service territory.
Charts 14.1 and 14.2 show the complementary cumulative distribution of the ratio of median download
speed (over the peak usage period) to advertised download speed for each participating ISP. For each
ratio of actual to advertised download speed on the horizontal axis, the curves show the percentage of
panelists subscribing to each ISP that experienced at least this ratio.25 For example, the Cincinnati Bell
fiber curve in Chart 14.1 shows that 90% of its subscribers experienced a median download speed
exceeding 83% of the advertised download speed, while 70% experienced a median download speed
exceeding 95% of the advertised download speed, and 50% experienced a median download speed
exceeding 102% of the advertised download speed.
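Each point on these curves answers "what percentage of panelists attained at least this ratio," which reduces to a one-line computation. A sketch over a list of per-panelist ratios:

```python
def ccdf_percent(ratios, threshold):
    """One point on a complementary cumulative distribution: the
    percentage of panelists whose actual-to-advertised speed ratio
    is at least `threshold`."""
    return 100.0 * sum(1 for r in ratios if r >= threshold) / len(ratios)
```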
25
In Reports prior to the 2015 MBA Report, for each ratio of actual to advertised download speed on the horizontal
axis, the cumulative distribution function curves showed the percentage of measurements, rather than panelists
subscribing to each ISP, that experienced at least this ratio. The methodology used since then, i.e., using panelists
subscribing to each ISP, more accurately illustrates ISP performance from a consumer’s point of view.
Chart 14.1: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed.
Chart 14.2: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed (continued).
The curves for cable-based broadband and fiber-based broadband are steeper than those for DSL-based
broadband. This can be seen more clearly in Chart 14.3, which plots aggregate curves for each technology.
Approximately 80% of subscribers to cable and 60% of subscribers to fiber-based technologies experience
median download speeds exceeding the advertised download speed. In contrast, only 30% of subscribers
to DSL-based services experience median download speeds exceeding the advertised download speed.26
Chart 14.3: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed, by technology.
Charts 14.4 to 14.6 show the complementary cumulative distribution of the ratio of median upload speed
(over the peak usage period) to advertised upload speed for each participating ISP (Charts 14.4 and 14.5)
and by access technology (Chart 14.6).
26
The speed achievable by DSL depends on the distance between the subscriber and the central office. Thus, the
complementary cumulative distribution function will fall slowly unless the broadband ISP adjusts its advertised rate
based on the subscriber’s location. (Chart 16 illustrates that the performance during non-busy hours is similar to
the busy hour, making congestion less likely as an explanation.)
Chart 14.4: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed.
Chart 14.5: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed (continued).
Chart 14.6: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed, by technology.
All actual speeds discussed above were measured during peak usage periods. In contrast, Charts 15.1 and
15.2 below compare the ratio of actual download and upload speeds to advertised download and upload
speeds during peak and off-peak times.27 Charts 15.1 and 15.2 show that most ISP subscribers experience
only a slight degradation from off-peak to peak hour performance.
Chart 15.1: The ratio of weighted median download speed to advertised download speed, peak hours
versus off-peak hours.
27
As described earlier, Verizon DSL download and upload results are shown as a range since Verizon advertises its
DSL speed as a range rather than as a specific speed.
Chart 15.2: The ratio of weighted median upload speed to advertised upload speed, peak versus off-peak.
Chart 1628 below shows the actual download speed to advertised speed ratio in each two-hour time block
during weekdays for each ISP. The ratio is lowest during the busiest four-hour time block (7:00 p.m. to
11:00 p.m.).
28
In this chart, we have shown the median download speed of Verizon-DSL as a percentage of the midpoint of the
advertised speed range for its tier.
Chart 16: The ratio of median download speed to advertised download speed, Monday-to-Friday, two-
hour time blocks, terrestrial ISPs.
For each ISP, Chart 6 (in section 2.C) showed the ratio of the 80/80 consistent median download speed to
advertised download speed, and for comparison, Chart 4 showed the ratio of median download speed to
advertised download speed.
Chart 17.1 presents the 80/80 consistent upload speeds. While all the 80/80 upload speeds were slightly
lower than the corresponding median speeds, the differences were more marked for DSL. Charts 6
and 17.1 make it clear that cable and fiber technologies behaved more consistently than DSL technology
for both download and upload speeds.
Chart 17.1: The ratio of 80/80 consistent upload speed to advertised upload speed.
Charts 17.2 and 17.3 below illustrate similar consistency metrics for 70/70 consistent download and
upload speeds, i.e., the minimum download or upload speed (as a percentage of the advertised download
or upload speed) experienced by at least 70% of panelists during at least 70% of the peak usage period.
The ratios for 70/70 consistent speeds as a percentage of the advertised speed are higher than the
corresponding ratios for 80/80 consistent speeds. In fact, for many ISPs, the 70/70 consistent download
or upload speed is close to the median download or upload speed. Once again, ISPs using DSL technology
showed a considerably smaller value for the 70/70 download and upload speeds as compared to the
download and upload median speeds, respectively.
Chart 17.2: The ratio of 70/70 consistent download speed to advertised download speed.
Chart 17.3: The ratio of 70/70 consistent upload speed to advertised upload speed.
D. LATENCY
Chart 18 below shows the weighted median latencies, by technology and by advertised download speed,
for terrestrial technologies. For all terrestrial technologies, latency varied little with advertised download
speed. DSL service typically had higher latencies than either cable or fiber, and its latency was more
closely correlated with advertised download speed. Cable latencies ranged from 18 ms to 24 ms, fiber
latencies from 5 ms to 12 ms, and DSL latencies from 27 ms to 55 ms.
Chart 18: Latency for Terrestrial ISPs, by technology, and by advertised download speed.
Chart 19.1: The ratio of median download speed to advertised download speed, by ISP (1-5 Mbps).
Chart 19.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps).
Chart 19.3: The ratio of median download speed to advertised download speed, by ISP (12-20 Mbps).
Chart 19.4: The ratio of median download speed to advertised download speed, by ISP (25-30 Mbps).
Chart 19.5: The ratio of median download speed to advertised download speed, by ISP (40-50 Mbps).
Chart 19.6: The ratio of median download speed to advertised download speed, by ISP (60-75 Mbps).
Chart 19.7: The ratio of median download speed to advertised download speed, by ISP (100-150 Mbps).
Chart 19.8: The ratio of median download speed to advertised download speed, by ISP (200-300 Mbps).
Charts 20.1 – 20.6 depict the ratio of median upload speeds to advertised upload speeds for each ISP by
service tier.
Chart 20.1: The ratio of median upload speed to advertised upload speed, by ISP (0.384 - 0.768 Mbps).
Chart 20.2: The ratio of median upload speed to advertised upload speed, by ISP (0.896 – 1.5 Mbps).
Chart 20.3: The ratio of median upload speed to advertised upload speed, by ISP (2-5 Mbps).
Chart 20.4: The ratio of median upload speed to advertised upload speed, by ISP (10 - 20 Mbps).
Chart 20.5: The ratio of median upload speed to advertised upload speed, by ISP (30 - 75 Mbps).
Chart 20.6: The ratio of median upload speed to advertised upload speed, by ISP (100-150 Mbps).
Table 2 lists the advertised download service tiers included in this study. For each tier, an ISP’s advertised
download speed is compared with the median of the measured download speed results. As we have noted
in past reports, the download speeds listed here are based on national averages and may not represent
the performance experienced by any particular consumer at any given time or place.
Table 2: Peak period median download speed, sorted by actual download speed
ISP | Advertised Download Speed (Mbps) | Median Download Speed (Mbps) | Actual Speed / Advertised Speed (%)
B. VARIATIONS IN SPEED
In Section 3.C above, we present speed consistency metrics for each ISP based on test results averaged
across all service tiers. In this section, we provide detailed speed consistency results for each ISP’s
individual service tiers. Consistency of speed is important for services such as video streaming. A
significant reduction in speed for more than a few seconds can force a reduction in video resolution or an
intermittent loss of service.
Charts 21.1 – 21.3 below show the percentage of consumers that achieved greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed for each ISP speed tier. Consistent with
past performance, ISPs using DSL technology frequently fail to deliver advertised service rates. DSL ISPs
quote a single ‘up-to’ speed, but the actual speed of a DSL line depends on the distance between the
subscriber and the serving central office.
Cable companies and fiber-based systems, in general, showed a high consistency of speed.
Chart 21.1: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed, by service tier (DSL).
Chart 21.2: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (cable).
Chart 21.3: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (fiber).
Similarly, Charts 22.1 to 22.3 show the percentage of consumers that achieved greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed for each ISP speed tier.
Chart 22.1: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (DSL).
Chart 22.2: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (cable).
Chart 22.3: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (fiber).
In Section 3.C above, we present complementary cumulative distributions for each ISP based on test
results across all service tiers. Below, we provide tables showing selected points on these distributions
by each individual ISP. In general, ISPs using DSL technology delivered between 26% and 55% of the
advertised download speed to at least 95% of their subscribers. Cable-based companies delivered
between 69% and 92% of advertised rates to at least 95% of their subscribers, while fiber-based services
delivered between 73% and 98% of advertised download speeds to at least 95% of subscribers.
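The "at least 95% of subscribers" figures quoted above are single points read off such a distribution. A minimal sketch of that read-off, using the nearest-rank convention and illustrative data rather than MBA results:

```python
import math

def ratio_at_coverage(ratios, coverage=0.95):
    """Return the speed ratio achieved by at least `coverage` of subscribers.

    On the complementary cumulative distribution this is the value at which
    the survival fraction first drops below `coverage`; with the nearest-rank
    convention it is the ceil((1 - coverage) * n)-th smallest ratio.
    """
    ordered = sorted(ratios)
    rank = max(1, math.ceil((1 - coverage) * len(ordered)))
    return ordered[rank - 1]
```

For instance, with per-subscriber ratios of [0.5, 0.7, 0.8, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0], at least 95% of subscribers achieve a ratio of 0.5 or better.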
Table 3: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed by ISP
Table 4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload
speed by ISP
Measuring Broadband America
Technical Appendix to the Ninth MBA Report
FCC’s Office of Engineering and Technology
This Appendix to the Ninth Measuring Broadband America Report,1 a report on consumer
wireline broadband performance in the United States, provides detailed technical background
information on the methodology that produced the Report. It covers the process by which the
panel of consumer participants was originally recruited and selected for the August 2011 MBA
Report, and maintained and evolved over the last nine years. This Appendix also discusses the
testing methodology used for the Report and describes how the test data was analyzed.
2 - PANEL CONSTRUCTION
This section describes the background of the study, as well as the methods employed to design
the target panel, select volunteers for participation, and manage the panel to maintain the
operational goals of the program.
The study aims to measure fixed broadband service performance in the United States as
delivered by an Internet Service Provider (ISP) to the consumer’s broadband modem. Many
factors contribute to end-to-end broadband performance, only some of which are under the
control of the consumer’s ISP. The methodology outlined here is focused on the measurement
of broadband performance within the scope of an ISP’s network, and specifically focuses on
measuring performance from the consumer Internet access point, or consumer gateway, to a
close major Internet gateway point. The actual quality of experience seen by consumers depends
on many other factors beyond the consumer’s ISP, including the performance of the consumer’s
in-home network, transit providers, interconnection points, content distribution networks (CDN)
and the infrastructure deployed by the providers of content and services. The design of the study
methodology allows it to be integrated with other technical measurement approaches that focus
on specific aspects of broadband performance (e.g., download speed, upload speed, latency,
packet loss), and in the future, could focus on other aspects of broadband performance.
1
The First Report (2011) was based on measurements taken in March 2011, the Second Report (2012) on
measurements taken in April 2012, and the Third (2013) through this, the Ninth (2018) Reports on measurements
taken in September of the year prior to the reports’ release dates.
2
See [Link] (last accessed June 21, 2016).
3
SamKnows is a company that specializes in broadband availability measurement and was retained under contract
by the FCC to assist in this study. See [Link]
4
The Whiteboxes are named after the appearance of the first hardware implementation of the measurement agent.
The Whiteboxes remain in consumer homes and continue to run the tests described in this report. Participants may
remain in the measurement project as long as it continues, and may retain their Whitebox when they end their
participation.
5
[Link]
6
At the request of, and with the cooperation of the Department of Commerce and Consumer Affairs, Hawaii, we
have begun to collect data from the state of Hawaii. Data from Hawaii has been included in this year’s report.
7
Although the Commission’s volunteer recruitment was guided by Census Region to ensure the widest possible
distribution of panelists throughout the United States, as discussed below, a sufficient number of testing devices
were not deployed to enable, in every case, the evaluation of regional differences in broadband performance. The
States associated with each Census Region are described in Table 4.
8
The FCC Form 477 data collects information about broadband connections to end user locations, wired and wireless
local telephone services, and interconnected Voice over Internet Protocol (VoIP) services. See
[Link] for further information.
9
Subscriber data in the Ninth MBA Report is based on the FCC’s Internet Access Services Report with data current
to June 30, 2017. See Internet Access Services: Status as of June 30, 2017, Wireline Competition Bureau, Industry
Analysis and Technology Division (rel. Nov. 2018), available at [Link]
[Link].
State | Total Boxes | % of Total Boxes | % of Total US Broadband
Alabama 26 0.8% 1.4%
Arizona 112 3.5% 2.1%
Arkansas 26 0.8% 0.8%
California 246 7.7% 11.6%
Colorado 94 2.9% 1.9%
Connecticut 73 2.3% 1.2%
Delaware 10 0.3% 0.3%
District of Columbia 5 0.2% 0.2%
Florida 146 4.6% 7.1%
Georgia 119 3.7% 3.1%
Hawaii 23 0.7% 0.4%
Idaho 24 0.8% 0.5%
Illinois 57 1.8% 3.9%
Indiana 42 1.3% 2.0%
Iowa 128 4.0% 1.0%
Kansas 21 0.7% 0.9%
Kentucky 130 4.1% 1.3%
Louisiana 21 0.7% 1.3%
Maine 2 0.1% 0.5%
Maryland 53 1.7% 2.0%
Massachusetts 51 1.6% 2.4%
Michigan 45 1.4% 3.1%
Minnesota 85 2.7% 1.8%
Mississippi 8 0.3% 0.7%
Missouri 63 2.0% 1.8%
Montana 5 0.2% 0.3%
The distribution of Whiteboxes by Census Region is found in the table on the next page.
Census Region Total Boxes % Total Boxes % Total U.S. Broadband Subscribers
The distribution of states associated with the four Census Regions used to define the panel strata
are included in the table below.
Northeast: CT MA ME NH NJ NY PA RI VT
Midwest: IA IL IN KS MI MN MO ND NE OH SD WI
South: AL AR DC DE FL GA KY LA MD MS NC OK SC TN TX VA WV
West: AK AZ CA CO HI ID MT NM NV OR UT WA WY
• Recruitment has evolved since the start of the program. In 2011, several thousand volunteers
were recruited through a public relations and social media campaign led by the FCC, which
included discussion on the FCC website and on technology blogs, as well as articles in the
press. Currently, volunteers are recruited with the help of a recruitment website10 which keeps
them informed about the MBA program and allows them to view MBA data on a dashboard.
The composition of the panel is reviewed each year to identify any deficiencies with regard to
the sample plan described above. Target demographic goals are set for volunteers based on
ISP, speed tier, technology type, and region. Where the pool of volunteers falls short of the
desired goal, ISPs send out email messages to their customers asking them to participate in
the MBA program. The messages direct interested volunteers to contact SamKnows to
request participation in the trial. The ISPs do not know which of the email recipients
volunteer. In almost all cases, this ISP outreach allows the program to meet its desired
demographic targets.
The mix of panelists recruited using the above methodologies varies by ISP.
A multi-mode strategy was used to qualify volunteers for the 2018 testing period. The key stages
of this process were as follows:
1. Volunteers were directed to complete an online form which provided information on the
study and required volunteers to submit a small amount of information.
2. Volunteers were selected from respondents to this follow-up email based on the target
requirements of the panel. Selected volunteers were then asked to agree to the User
Terms and Conditions that outlined the permissions to be granted by the volunteer in key
areas such as privacy.11
3. From among the volunteers who agreed to the User Terms and Conditions, SamKnows
selected the panel of participants,12 each of whom received a Whitebox for self-
installation. SamKnows provided full support during the Whitebox installation phase.
10
The Measuring Broadband America recruitment website is: [Link]
11
The User Terms and Conditions is found in the Reference Documents at the end of this Appendix.
12
Over 23,000 Whiteboxes have been shipped to targeted volunteers since 2011, of which 5,855 were online and
reporting data used in the Ninth Report from the months of September/October 2018.
SamKnows manually completed the following four steps for each panelist:
• Verified that the IP address was in a valid range for those served by the ISP.
• Reviewed data for each panelist and removed data where speed changes such as tier
upgrade or downgrade appeared to have occurred, either due to a service change on the
part of the consumer or a network change on the part of the ISP.
• Identified panelists whose throughput appeared inconsistent with the provisioned service
tier. Such anomalies were re-certified with the consumer’s ISP.14
• Verified that the resulting downstream-upstream test results corresponded to the ISP-
provided speed tiers, and updated accordingly if required.
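The first of these validation steps can be sketched with the standard library's `ipaddress` module. The ISP name and network ranges below are hypothetical placeholders (drawn from the reserved documentation blocks); real validation would use each ISP's actual address allocations:

```python
import ipaddress

# Hypothetical example ranges; an actual check would use the ISP's
# real allocations rather than these reserved documentation blocks.
ISP_RANGES = {
    "ExampleISP": [ipaddress.ip_network("203.0.113.0/24"),
                   ipaddress.ip_network("198.51.100.0/24")],
}

def ip_in_isp_range(ip: str, isp: str) -> bool:
    """Return True if the panelist's IP falls inside any range for the ISP."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ISP_RANGES.get(isp, []))
```

A panelist whose reported address falls outside every range for their stated ISP would be flagged for manual review rather than dropped automatically.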
13
Past FCC studies found that a high rate of consumers could not reliably report information about their broadband
service, and the validation of subscriber information ensured the accuracy of expected speed and other subscription
details against which observed performance was measured. See John Horrigan and Ellen Satterwhite, Americans’
Perspectives on Online Connection Speeds for Home and Mobile Devices, 1 (FCC 2010), available at
[Link] (finding that 80 percent of broadband
consumers did not know what speed they had purchased).
14
For example, when a panelist’s upload or download speed was observed to be significantly higher than that of
the rest of the tier, it could be inferred that a mischaracterization of the panelist’s service tier had occurred. Such
anomalies, when not resolved in cooperation with the service provider, were excluded from the Ninth Report, but
will be included in the raw bulk data set.
15
This figure represents the total number of boxes reporting during September/October 2018, the period chosen
for the Ninth Report. Shipment of boxes continued in succeeding months and these results will be included in the
raw bulk data set.
This section describes the system architecture and network programming features of the tests,
and other technical aspects of the methods employed to measure broadband performance
during this study.
• If software tests are performed manually, panelists might only run tests when they
experience problems and thus bias the results.
In contrast, the hardware approach used in the MBA program requires the placement of the
previously described Whitebox inside the user’s home, directly connected to the consumer’s
service interconnection device (router), via Ethernet cable. The measurement device therefore
directly accesses fixed Internet service to the home over this dedicated interface and periodically
runs tests to remote targets over the Internet. The use of hardware devices avoids the
disadvantages listed earlier with the software approach. However, hardware approaches are
much more expensive than the software alternative, are thus more constrained in the achievable
panel size, and require correct installation of the device by the consumer or a third party. Installation is
still subject to unintentional errors, such as connecting the Whitebox incorrectly, but these can often be
detected in the validation process that follows installation. The
FCC chose the hardware approach since its advantages far outweigh these disadvantages.
No. | Requirement | How the requirement is met
1 | The Whitebox measurement process must not change during the monitoring period. | The Whitebox measurement process is designed to provide automated and consistent monitoring throughout the measurement period.
8 | Must be compatible with a wide range of DSL, cable, satellite and fiber-to-the-home modems. | Whiteboxes can be connected to all modem types commonly used to support broadband services in the U.S., either in a routing or bridging mode, depending on the model.
11 | Must be upgradeable remotely if it contains any software or firmware components. | The Whitebox can be completely controlled remotely for updates without involvement of the consumer, provided the Whitebox is switched on and connected.
16
Signatories to the Code of Conduct are: AT&T, CenturyLink, Charter, Cincinnati Bell, Comcast, Cox, Frontier,
Hughes, Level3, Measurement Lab, Mediacom, NCTA, Optimum, Time Warner Cable, Verizon, ViaSat, and
Windstream. A copy of the Code of Conduct is included as a Reference Document attached to this Appendix.
17
Each reporting interface included a data dashboard for the consumer volunteers, which provided performance
metrics associated with their Whitebox.
18
The use of legacy equipment has the potential to impede some panelists from receiving the provisioned speed
from their ISP, and this impact is captured by the survey.
AT&T 6
Comcast 36
Cox 2
Frontier 5
Hawaiian Telecom 1
Level 3 (off-net) 11
M-Lab (off-net) 45
Mediacom 1
Optimum 1
Uhnet (Hawaii) 1
Verizon 2
Windstream 4
19
QWest was reported separately from Centurylink in reports prior to 2016. The entities completed merging their
test infrastructure in 2016.
20
Time Warner Cable was reported separately from Charter in reports prior to the Eighth report. The entities
completed merging their test infrastructure in early 2018.
21
Specific questions on test procedures may be addressed to team@[Link].
22
Other tests may be run on the MBA panel; this list outlines the published tests in the report.
23
An “A record” maps a domain name such as [Link] to its numeric IP address.
Web Browsing
The test records the averaged time taken to sequentially download the HTML and referenced
resources for the home page of each of the target websites, the number of bytes transferred,
and the calculated rate per second. The primary measure for this test is the total time taken to
download the HTML front page for each web site and all associated images, JavaScript, and
stylesheet resources. This test does not measure against the centralized testing nodes; instead,
it measures against the home pages of the following websites:
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
The results include the time needed for DNS resolution. The test uses up to eight concurrent TCP
connections to fetch resources from targets. The test pools TCP connections and utilizes
persistent connections where the remote HTTP server supports them.
The client advertises the user agent as Microsoft Internet Explorer 10. Each website is tested in
sequence and the results summed and reported across all sites.
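As a rough illustration of this kind of measurement, the sketch below is not the SamKnows implementation: it extracts the resources referenced by a page's HTML (the parsing is deliberately simplified) and times a single fetch, which a harness could repeat over pooled connections and sum across sites:

```python
import time
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class ResourceExtractor(HTMLParser):
    """Collect URLs of images, scripts, and stylesheets referenced by a page."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.resources.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(attrs["href"])

def extract_resources(html: str) -> list:
    """Return referenced resource URLs in document order."""
    parser = ResourceExtractor()
    parser.feed(html)
    return parser.resources

def timed_fetch(url: str, user_agent: str = "Mozilla/5.0") -> float:
    """Fetch one URL and return elapsed wall-clock seconds (network required)."""
    start = time.monotonic()
    with urlopen(Request(url, headers={"User-Agent": user_agent})) as resp:
        resp.read()
    return time.monotonic() - start
```

A real harness would resolve relative URLs against the page URL, reuse persistent connections, and cap concurrency at eight, as described above.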
24
These websites were chosen based on a list by Alexa, [Link] of the top twenty websites in
October 2010.
Voice over IP
The Voice over IP (VoIP) test operates over UDP and utilizes bidirectional traffic, as is typical for
voice calls.
The Whitebox handshakes with the server, and each initiates a UDP stream with the other. The
test uses a 64 kbps stream with the same characteristics and properties (i.e., packet sizes, delays,
bitrate) as the G.711 codec. 160 byte packets are used. The test measures jitter, delay, and loss.
Jitter is calculated using the Packet Delay Variation (PDV) approach described in section 4.2 of
RFC 5481. The 99th percentile is recorded and used in all calculations when deriving the PDV.
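A minimal sketch of that jitter computation, using the nearest-rank method for the 99th percentile (the delay samples in the usage note are illustrative, not measured values):

```python
import math

def pdv_99th(delays_ms):
    """Compute the 99th-percentile Packet Delay Variation (RFC 5481, sec. 4.2).

    PDV measures each packet's one-way delay relative to the minimum
    observed delay; the nearest-rank method selects the percentile.
    """
    base = min(delays_ms)
    pdv = sorted(d - base for d in delays_ms)
    rank = max(1, math.ceil(0.99 * len(pdv)))  # nearest-rank, 1-based
    return pdv[rank - 1]
```

For example, delay samples of [20.0, 21.5, 20.2, 25.0, 20.1] ms give a PDV of [0.0, 0.1, 0.2, 1.5, 5.0] ms relative to the 20.0 ms minimum, and the 99th-percentile PDV is 5.0 ms.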
Traceroute
A traceroute client is used to send UDP probes to each hop in the path between client and
destination. Three probes are sent to each hop. The round-trip times, the standard deviation of
the round-trip times of the responses from each hop and the packet loss are recorded. The open
source traceroute client "mtr" ([Link] is used for carrying out the
traceroute measurements.
Test | Test target(s) | Frequency | Duration | Est. daily volume**
Upload Speed (Single TCP connection) | 1 off-net test node; 1 on-net test node | Once in peak hours, once in off-peak hours | Fixed 10 seconds | 6 MB at 1 Mbps
UDP Latency | 2 off-net test nodes (Level3/MLab) | Hourly, 24x7 | Permanent | 5.8 MB
UDP Latency | 1 on-net test node | Hourly, 24x7 | Permanent | 2.9 MB
UDP Packet Loss | 2 off-net test nodes | Hourly, 24x7 | Permanent | N/A (uses above)
UDP Packet Loss | 1 on-net test node | Hourly, 24x7 | Permanent | N/A (uses above)
Consumption | N/A | 24x7 | N/A | N/A
DNS Resolution | 10 popular US websites | Hourly, 24x7 | Est. 3 seconds | 0.3 MB
ICMP Latency | 1 off-net test node; 1 on-net test node | Hourly, 24x7 | Est. 5 seconds | 0.3 MB
ICMP Packet Loss | 1 off-net test node; 1 on-net test node | Hourly, 24x7 | N/A (as ICMP latency) | N/A (uses above)
Traceroute | 1 off-net test node; 1 on-net test node | Three times a day, 24x7 | N/A | N/A
Download Speed IPv6^^ | 1 off-net test node | Three times a day | Fixed 10 seconds | 180 MB at 50 Mbps; 72 MB at 20 Mbps; 11 MB at 3 Mbps; 5.4 MB at 1.5 Mbps
Upload Speed IPv6^^ | 1 off-net test node | Three times a day | Fixed 10 seconds | 7.2 MB at 2 Mbps; 3.6 MB at 1 Mbps; 1.8 MB at 0.5 Mbps
UDP Latency / Loss IPv6^^ | 2 off-net test nodes (Level3/MLab) | Hourly, 24x7 | Permanent | 5.8 MB
Lightweight Capacity Test – Download (UDP)^ | 1 off-net test node | Once 12am-6am, once 6am-12pm, once 12pm-6pm, hourly thereafter | Fixed 1000 packets | 9 MB
Lightweight Capacity Test – Upload (UDP)^ | 1 off-net test node | Once 12am-6am, once 6am-12pm, once 12pm-6pm, hourly thereafter | Fixed 1000 packets | 9 MB
**Download/upload daily volumes are estimates based upon likely line speeds. All tests will operate at maximum line rate so actual consumption may vary.
^Currently in beta testing.
^^Only carried out on broadband connections that support IPv6.
Tests to the off-net destinations alternate randomly between Level3 and M-Lab, except that
latency and loss tests operate continuously to both Level3 and M-Lab off-net servers. All tests
are also performed to the closest on-net server, where available.
Consumption
This test was replaced by the new data usage test. A technical description for this test is
outlined here: [Link]
08-24_Final-[Link]
This section describes the background for the categorization of data gathered for the Ninth
Report, and the methods employed to collect and analyze the test results.
4.1 - BACKGROUND
Time of Day
Most of the metrics reported in the Ninth Report draw on data gathered during the so-called
peak usage period of 7:00 p.m. to 11:00 p.m. local time.25 This time period is generally considered
to experience the highest amount of Internet usage under normal circumstances.
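Filtering measurements to that window can be sketched as follows. The exact boundary handling, such as whether a measurement at exactly 11:00 p.m. counts as peak, is an assumption of this sketch:

```python
from datetime import datetime

PEAK_START, PEAK_END = 19, 23  # 7:00 p.m. to 11:00 p.m. local time

def in_peak_period(ts: datetime) -> bool:
    """True if a measurement timestamp falls in the peak usage window.

    Assumes `ts` is already expressed in the panelist's local time; the
    end boundary (11:00 p.m. itself) is treated as outside the window.
    """
    return PEAK_START <= ts.hour < PEAK_END

def peak_only(measurements):
    """Keep only (timestamp, value) pairs taken during peak hours."""
    return [(ts, v) for ts, v in measurements if in_peak_period(ts)]
```

Because the window is defined in local time, timestamps stored in UTC would first need to be converted using each panelist's time zone.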
A speed tier was included in the Report only when it satisfied the following conditions:
(a) The speed tier must make up the top 80% of the ISP’s subscriber base;
(b) There must be a minimum of 45 panelists that are recruited for that tier who have
provided valid data for the tier within the validation period; and
(c) Each panelist must have a minimum of five days of valid data within the validation period.
The study achieved target sample sizes for the following download and upload speeds26 (listed in
alphabetical order by ISP):
Download Speeds:
AT&T IP-BB: 6 and 18 Mbps tiers;
25
This period of time was agreed to by ISP participants in open meetings conducted at the beginning of the program.
26
Due to the large number of different combinations of upload/download speed tiers supported by ISPs where, for
example, a single download speed might be offered paired with multiple upload speeds or vice versa, upload and
download test results were analyzed separately.
Upload Speeds:
AT&T IP-BB: 1 and 1.5 Mbps tiers;
CenturyLink: 0.768, 0.896, 2, and 5 Mbps tiers;
Charter: 5, 10, and 20 Mbps tiers;
Cincinnati Bell DSL: 0.768 and 3 Mbps tiers;
Cincinnati Bell Fiber: 10 and 100 Mbps tiers;
Comcast: 5 and 10 Mbps tiers;
Cox: 3, 10, and 30 Mbps tiers;
Frontier DSL: 0.768 and 1 Mbps tiers;
Frontier Fiber: 50, 75, 100, and 150 Mbps tiers;
Hughes: 1 and 3 Mbps tiers;
Mediacom: 5 and 10 Mbps tiers;
Optimum: 35 Mbps tier;
Verizon DSL: [0.384 – 0.768] Mbps tier;
Verizon Fiber: 50, 75, 100, and 1 Gbps tiers;28
Windstream: 0.768 and 1.5 Mbps tiers.
27
Verizon’s 1 Gbps tier was not included in the final report. 1Gbps tiers may be included in a separate/subsequent
report focusing on faster speeds.
28
Verizon’s 1 Gbps tier was not included in the final report. Id at n. 27.
29
See: [Link]
Legacy Equipment
In previous reports, we discussed the challenges ISPs face in improving network performance
where equipment under the control of the subscriber limits the end-to-end performance
achievable by the subscriber.30 Simply put, some consumer-controlled equipment may not be
capable of operating fully at new, higher service tiers. Working in open collaboration with all
service providers, we developed a policy, subject to several conditions on participating ISPs,
permitting changes to an ISP’s panelists when their installed modems were not capable of
supporting the delivered service speed. First, proposed changes in consumer panelists would
only be considered where an ISP was offering free upgrades for modems it owned and leased to
the consumer. Second, each ISP needed to disclose its policy regarding the treatment of legacy
modems and its efforts to inform consumers of the impact such modems may have on their
broadband performance.
30
See pgs. 8-9, 2014 Report, pg. 8 of the 2013 Report, as well as endnote 14. [Link]
broadband-america/2012/july.
While the issue of DOCSIS 3 modems and network upgrades affects the cable industry today, we
may see other cases in the future where customer premises equipment affects the achievable
network performance.
In accordance with the above stated policy, 95 Whiteboxes connected to legacy modems were
identified and removed from the final data set in order to ensure that the study would only
include equipment that would be able to meet its advertised speed. The 95 excluded Whiteboxes
were connected to Charter, Comcast, and Cox.
31
These methods were reviewed with statistical experts by the participating ISPs.
Limiting Factors
A total of 8,417,695,058 measurements were taken across 144,636,223 unique tests.
All scheduled tests were run, aside from when monitoring units detected concurrent use of
bandwidth.
Schedules were adjusted when required for specific tests to avoid triggering data usage limits
applied by some ISPs.
Raw Data: Raw data for the chosen period is collected from the measurement database. The ISPs
and products that panelists were on are exported to a “unit profile” file, and those that changed
during the period are flagged. 2018 Raw Data Links

Validated Data Cleansing: Data is cleaned. This includes removing measurements when a user
changed ISP or tier during the period. Anomalies and significant outliers are also removed at this
point. A data cleansing document describes the process in detail. 2018 Data Cleansing Document
Link

SQL Processing: Per-unit results are generated for each metric. Time-of-day averages are
computed and a trimmed median is calculated for each metric. The SQL scripts used here are
contained in SQL processing scripts available with the release of each report. 2018 SQL
Processing Links

Unit Profile: This document identifies the various details of each test unit, including ISP,
technology, service tier, and general location. Each unit represents one volunteer panelist. The
unit IDs were randomly generated, which served to protect the anonymity of the volunteer
panelists. 2018 Unit Profile Link

Excluded Units: A listing of units excluded from the analysis due to insufficient sample size for
that particular ISP’s speed tier. 2018 Excluded Units Link

Unit Census Block: This step identifies the census block (for blocks containing more than 1,000
people) in which each unit running tests is located. Census blocks are from the 2010 census and
are given in FIPS code format. We have used block FIPS codes for blocks that contain more than
1,000 people. For blocks with fewer than 1,000 people, we have aggregated to the next highest
level, i.e., tract, and used the tract FIPS code, provided there are more than 1,000 people in the
tract. In cases where there are fewer than 1,000 people in a tract, we have aggregated to the
regional level. 2018 Unit Census Block Link

Excel Tables & Charts: Summary data tables and charts in Excel are produced from the averages.
These are used directly in the report. 2018 Statistical Averages Links
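The block/tract/region fallback used for geographic aggregation can be sketched as follows. This is a simplified illustration; the function signature and the way population counts are supplied are assumptions, and the state prefix stands in for the coarser regional level:

```python
def location_code(block_fips: str, block_pop: int, tract_pop: int) -> str:
    """Pick the least-aggregated FIPS code that still covers > 1,000 people.

    A census-block FIPS code is 15 digits; its first 11 digits identify the
    containing tract, and its first 2 digits the state (used here as a
    stand-in for the coarser regional level).
    """
    if block_pop > 1000:
        return block_fips            # block is large enough on its own
    if tract_pop > 1000:
        return block_fips[:11]       # aggregate to the containing tract
    return block_fips[:2]            # aggregate further, toward regional level
```

The prefix structure of FIPS codes is what makes this truncation approach work: each coarser geography's code is a prefix of the finer one's.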
The raw data collected for each active metric is made available by month in tarred gzipped files.
The files in the archive containing active metrics are described in table 9.
curr_dlping.csv
unit_id Unique identifier for an individual unit
dtime Time test finished
target Target hostname or IP address
32
This data dictionary is also available on the FCC Measuring Broadband America website, located with the other
validated data files available for download.
curr_lct_dl.csv
unit_id Unique identifier for an individual unit
dtime Time test finished in UTC
curr_lct_ul.csv
unit_id Unique identifier for an individual unit
dtime Time test finished in UTC
target Target hostname
address Target IP address
packets_received Total number of packets received
packets_sent Total number of packets sent
packet_size Packet size
bytes_total Total number of bytes
duration Duration of the test in microseconds
bytes_sec Throughput in bytes/sec
error_code An internal error code from the test.
successes Number of successes (always 1 or 0 for this test)
failures Number of failures (always 1 or 0 for this test)
location_id Please ignore (this is an internal key mapping to
unit profile data)
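As an illustration of how these fields fit together, the sketch below parses rows of a curr_lct_ul.csv-style file and cross-checks the reported throughput against bytes_total and duration. The sample row is fabricated for illustration, and only a subset of the columns listed above is included:

```python
import csv
import io

SAMPLE = """unit_id,dtime,target,address,packets_received,packets_sent,packet_size,bytes_total,duration,bytes_sec
386,2018-09-01 00:12:09,example-host,192.0.2.10,1000,1000,9000,9000000,1000000,9000000
"""

def check_throughput(csv_text: str, tolerance: float = 0.01):
    """Yield (unit_id, reported, derived) for rows whose throughput disagrees.

    `duration` is in microseconds, so the derived bytes/sec is
    bytes_total / (duration / 1e6).
    """
    for row in csv.DictReader(io.StringIO(csv_text)):
        derived = int(row["bytes_total"]) / (int(row["duration"]) / 1e6)
        reported = float(row["bytes_sec"])
        if abs(derived - reported) > tolerance * max(derived, 1.0):
            yield row["unit_id"], reported, derived
```

In the sample row, 9,000,000 bytes over a 1,000,000-microsecond test yields 9,000,000 bytes/sec, matching the reported bytes_sec field.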
5 - REFERENCE DOCUMENTS
PLEASE READ THESE TERMS AND CONDITIONS CAREFULLY. BY APPLYING TO BECOME A PARTICIPANT
IN THE BROADBAND COMMUNITY PANEL AND/OR INSTALLING THE WHITEBOX, YOU ARE AGREEING TO
THESE TERMS AND CONDITIONS.
1. Interpretation
1.1. The following definitions and rules of interpretation apply to these terms & conditions.
Connection: the Participant's own broadband internet connection, provided by an Internet Service
Provider ("ISP").
Connection Equipment: the Participant's broadband router or cable modem, used to provide the
Participant's Connection.
Intellectual Property Rights: all patents, rights to inventions, utility models, copyright and related rights,
trademarks, service marks, trade, business and domain names, rights in trade dress or get-up, rights in
goodwill or to sue for passing off, unfair competition rights, rights in designs, rights in computer software,
database right, moral rights, rights in confidential information (including know-how and trade secrets)
and any other intellectual property rights, in each case whether registered or unregistered and including
all applications for and renewals or extensions of such rights, and all similar or equivalent rights or forms
of protection in any part of the world.
ISP: the company providing broadband internet connection to the Participant during the term of this
Program.
Participant/You/Your: the person who volunteers to participate in the Program, under these terms and
conditions. The Participant must be the named account holder on the Internet service account with the
ISP.
Participant's Equipment: any equipment, systems, cabling or facilities provided by the Participant and
used directly or indirectly in support of the Services, excluding the Connection Equipment.
Requirements: the requirements specified by SamKnows as part of the sign-up process that the
Participant must fulfil in order to be selected to receive the Services.
SamKnows/We/Our: the organization providing the Services and conducting the Program, namely:
SamKnows Limited (Co. No. 6510477) of 25 Harley Street, London W1G 9BR
Services / Program: the performance and measurement of certain broadband and Internet services and
research program (Broadband Community Panel), as sponsored by the Federal Communications
Commission (FCC), in respect of measuring broadband Internet Connections.
Software: the software that has been installed and/or remotely uploaded onto the Whitebox, by
SamKnows as updated by SamKnows, from time to time, but not including any Open Source Software.
Whitebox: the hardware supplied to the Participant by SamKnows with the Software.
1.2. Headings in these terms and conditions shall not affect their interpretation.
1.3. A person includes a natural person, corporate or unincorporated body (whether or not having
separate legal personality).
1.4. An obligation in these terms and conditions on a person not to do something includes, without
limitation, an obligation not to agree, allow, permit or acquiesce in that thing being done.
2.1 Subject to the Participant complying fully with these terms and conditions, SamKnows shall use
reasonable care to:
(a) provide the Participant with the Measurement Services under these terms and conditions;
(c) if requested, SamKnows will provide a pre-paid postage label for the Whitebox to be returned.
(d) comply with all applicable United States, European Union, and United Kingdom privacy laws and
directives, and will access, collect, process and distribute the information according to the following
principles:
Specific purpose: We will access, collect, process, store and distribute data for the purposes and reasons
specified in this agreement and not in ways incompatible with those purposes;
Restricted: We will restrict our data collection and use practices to those adequate and relevant, and not
excessive in relation to the purposes for which we collect the information;
Accurate: We will work to ensure that the data we collect is accurate and up-to-date, working with
Participant and his/her ISP;
Destroyed when obsolete: We will not maintain personal data longer than is necessary for the purposes
for which we collect and process the information;
Security: We will collect and process the information associated with this trial with adequate security
through technical and organizational measures to protect personal data against destruction or loss,
alteration, unauthorized disclosure or access, in particular where the processing involves the transmission
of data over a network.
(a) provide Participant with access to a Program-specific customer services email address, which the
Participant may use for questions and to give feedback and comments;
(b) provide Participant with a unique login and password in order to access to an online reporting system
for access to Participant's broadband performance statistics.
(c) provide Participant with a monthly email with their specific data from the Program or notifying
Participant that their individual data is ready for viewing;
(d) provide Participant with support and troubleshooting services in case of problems or issues with their
Whitebox;
(e) notify Participant of the end of the FCC-sponsored Program and provide a mechanism for Participant
to opt out of any further performance/measuring services and research before collecting any data after
termination of the Program;
(f) use only data generated by SamKnows through the Whitebox, and not use any Participant data for
measuring performance without Participant's prior written consent; and
Federal Communications Commission 55 Measuring Broadband America
Technical Appendix to the Ninth MBA Report
(g) not monitor/track Participant's Internet activity without Participant's prior written consent.
2.3 While SamKnows will make all reasonable efforts to ensure that the Services cause no disruption to
the performance of the Participant's broadband Connection, including by running tests only when there is
no concurrent network activity generated by users at the Participant's location, the Participant
acknowledges that the Services may occasionally impact the performance of the Connection and agrees
to hold SamKnows and their ISP harmless for any impact the Services may have on the performance of
their Connection.
3. Participant's Obligations
3.1 The Participant is not required to pay any fee for the provision of the Services by SamKnows or to
participate in the Program.
(a) connect the Whitebox to their Connection Equipment within 14 days of receiving it;
(b) not to unplug or disconnect the Whitebox unless (i) they will be absent from the property in which it
is connected for more than 3 days and/or (ii) it is reasonably necessary for maintenance of the
Participant's Equipment and the Participant agrees that they shall use reasonable endeavors to minimize
the length of time the Whitebox is unplugged or disconnected;
(c) in no way reverse engineer, tamper with, dispose of or damage the Whitebox, or attempt to do so;
(d) notify SamKnows within 7 days in the event that they change their ISP or their Connection tier or
package (for example, downgrading/upgrading to a different broadband package), to the email address
provided by SamKnows;
(e) inform SamKnows of a change of postal or email address by email, within 7 days of the change, to the
email address provided by SamKnows;
(f) agrees that the Whitebox may be upgraded to incorporate changes to the Software and/or additional
tests at the discretion of SamKnows, whether by remote uploads or otherwise;
(g) on completion or termination of the Services, return the Whitebox to SamKnows by mail, if requested
by SamKnows. SamKnows will provide a pre-paid postage label for the Whitebox to be returned;
(h) be an active part of the Program and as such will use all reasonable endeavors to complete the market
research surveys received within a reasonable period of time;
(i) not publish data, give press or other interviews regarding the Program without the prior written
permission of SamKnows; and
(k) contact SamKnows directly, and not your ISP, in the event of any issues or problems with the Whitebox,
by using the email address provided by SamKnows.
3.4 The Participant acknowledges that he/she is not an employee or agent of, or relative of, an employee
or agent of an ISP or any affiliate of any ISP. In the event that they become one, they will inform
SamKnows, who at its complete discretion may ask for the immediate return of the Whitebox.
3.5 THE PARTICIPANT'S ATTENTION IS PARTICULARLY DRAWN TO THIS CONDITION. The Participant
expressly consents to having their ISP provide to SamKnows and the Federal Communications Commission (FCC)
information about the Participant's broadband service, for example: service address, speed tier, local loop
length (for DSL customers), equipment identifiers and other similar information, and hereby waives any
claim that its ISP's disclosure of such information to SamKnows or the FCC constitutes a violation of any
right or privilege that the Participant may have under any federal, state or local statute,
law, ordinance, court order, administrative rule, order or regulation, or other applicable law, including,
without limitation, under 47 U.S.C. §§ 222 and 631 (each a "Privacy Law"). If notwithstanding Participant's
consent under this Section 3.5, Participant, the FCC or any other party brings any claim or action against
any ISP under a Privacy Law, upon the applicable ISP's request SamKnows promptly shall cease collecting
data from such Participant and remove from its records all data collected with respect to such Participant
prior to the date of such request, and shall not provide such data in any form to the FCC. The Participant
further consents to transmission of information from this Program internationally, including the
information provided by the Participant's ISP, specifically the transfer of this information to SamKnows in
the United Kingdom, SamKnows' processing of it there, and its return to the United States.
4.1 All Intellectual Property Rights relating to the Whitebox are the property of its manufacturer. The
Participant shall use the Whitebox only to allow SamKnows to provide the Services.
4.2 As between SamKnows and the Participant, SamKnows owns all Intellectual Property Rights in the
Software. The Participant shall not translate, copy, adapt, vary or alter the Software. The Participant shall
use the Software only for the purposes of SamKnows providing the Services and shall not disclose or
otherwise use the Software.
4.3 Participation in the Broadband Community Panel gives the participant no Intellectual Property Rights
in the Test Results. Ownership of all such rights is governed by Federal Acquisition Regulation Section
52.227-17, which has been incorporated by reference in the relevant contract between SamKnows and
the FCC. The Participant hereby acknowledges and agrees that SamKnows may make such use of the Test
Results as is required for the Program.
4.4 Certain core testing technology and aspects of the architectures, products and services are developed
and maintained directly by SamKnows. SamKnows also implements various technical features of the
measurement services using particular technical components from a variety of vendor partners including:
NetGear, Measurement Lab, TP-Link.
5. SamKnows' Property
6.1 This condition 6 sets out the entire financial liability of SamKnows (including any liability for the acts
or omissions of its employees, agents, consultants, and subcontractors) to the Participant, including,
without limitation, in respect of:
(a) any use made by the Participant of the Services, the Whitebox and the Software or any part of them;
and
(b) any representation, statement or tortious act or omission (including negligence) arising under or in
connection with these terms and conditions.
6.2 All implied warranties, conditions and other terms implied by statute or other law are, to the fullest
extent permitted by law, waived and excluded from these terms and conditions.
6.3 Notwithstanding the foregoing, nothing in these terms and conditions limits or excludes the liability
of SamKnows:
(a) for death or personal injury resulting from its negligence or willful misconduct;
(b) for any damage or liability incurred by the Participant as a result of fraud or fraudulent
misrepresentation by SamKnows;
(d) in relation to any other liabilities which may not be excluded or limited by applicable law.
6.4 Subject to condition 6.2 and condition 6.3, SamKnows' total liability in contract, tort (including
negligence or breach of statutory duty), misrepresentation, restitution or otherwise arising in connection
with the performance, or contemplated performance, of these terms and conditions shall be limited to
$100.
6.5 In the event of any defect or modification in the Whitebox, the Participant's sole remedy shall be the
repair or replacement of the Whitebox at SamKnows' reasonable cost, provided that the defective
Whitebox is safely returned to SamKnows, in which case SamKnows shall pay the Participant's reasonable
postage costs.
6.6 The Participant acknowledges and agrees that these limitations of liability are reasonable in all the
circumstances, particularly given that no fee is being charged by SamKnows for the Services or
participation in the Program.
7.1 The Participant acknowledges and agrees that his/her personal data, such as service tier, address and
line performance, will be processed by SamKnows in connection with the program.
7.2 Except as required by law or regulation, SamKnows will not provide the Participant's personal data to
any third party without obtaining Participant's prior consent. However, for the avoidance of doubt, the
Participant acknowledges and agrees that subject to the privacy policies discussed below, the specific
technical characteristics of tests and other technical features associated with the Internet Protocol
environment of the architecture, including the client's IP address, may be shared with third parties as
necessary to conduct the Program and all aggregate statistical data produced as a result of the Services
(including the Test Results) may be provided to third parties.
7.3 You acknowledge and agree that SamKnows may share some of Your information with Your ISP, and
request information about You from Your ISP so that they may confirm Your service tiers and other
information relevant to the Program. Accordingly, You hereby expressly waive any claim that any disclosure by
Your ISP to SamKnows constitutes a violation of any right or privilege that You may have under any law,
wherever it might apply.
8.1 This Agreement shall continue until terminated in accordance with this clause.
8.2 Each party may terminate the Services immediately by written notice to the other party at any
time. Notice of termination may be given by email. Notices sent by email shall be deemed to be served
on the day of transmission if transmitted before 5.00 pm Eastern Time on a working day, but otherwise
on the next following working day.
(a) SamKnows shall have no further obligation to provide the Services; and
(b) the Participant shall safely return the Whitebox to SamKnows, if requested by SamKnows, in which
case SamKnows shall pay the Participant's reasonable postage costs.
8.4 Notwithstanding termination of the Services and/or these terms and conditions, clauses 1, 3.3 and 4
to 14 (inclusive) shall continue to apply.
9. Severance
10.1 These terms and conditions constitute the whole agreement between the parties and replace and
supersede any previous agreements or undertakings between the parties.
10.2 Each party acknowledges that, in entering into these terms and conditions, it has not relied on, and
shall have no right or remedy in respect of, any statement, representation, assurance or warranty.
11. Assignment
11.1 The Participant shall not, without the prior written consent of SamKnows, assign, transfer, charge,
mortgage, subcontract all or any of its rights or obligations under these terms and conditions.
11.2 Each party that has rights under these terms and conditions acknowledges that they are acting on
their own behalf and not for the benefit of another person.
Nothing in these terms and conditions is intended to, or shall be deemed to, constitute a partnership or
joint venture of any kind between any of the parties, nor make any party the agent of another party for
any purpose. No party shall have authority to act as agent for, or to bind, the other party in any way.
Except for the rights and protections conferred on ISPs under these Terms and Conditions which they may
defend, a person who is not a party to these terms and conditions shall not have any rights under or in
connection with these Terms and Conditions.
14.1 For the avoidance of doubt, the release of IP protocol addresses of clients' Whiteboxes is not PII
for the purposes of this program, and the client expressly consents to the release of IP address and other
technical IP protocol characteristics that may be gathered within the context of the testing architecture.
SamKnows, on behalf of the FCC, is collecting and storing broadband performance information, including
various personally identifiable information (PII) such as the street addresses, email addresses, sum of data
transferred, and broadband performance information, from those individuals who are participating
voluntarily in this test. PII not necessary to conduct this study will not be collected. Certain information
provided by or collected from you will be confirmed with a third party, including your ISP, to ensure a
representative study and otherwise shared with third parties as necessary to conduct the
program. SamKnows will not release, disclose to the public, or share any PII with any outside entities,
including the FCC, except as is consistent with the SamKnows privacy policy or these Terms and
Conditions. See [Link] The broadband performance
14.2 The FCC is soliciting and collecting this information as authorized by OMB Control No. 3060-1139 in
accordance with the requirements and authority of the Paperwork Reduction Act, Pub. L. No. 96-511, 94
Stat. 2812 (Dec. 11, 1980); the Broadband Data Improvement Act of 2008, Pub. L. No. 110-385, Stat 4096
§ 103(c)(1); American Reinvestment and Recovery Act of 2009 (ARRA), Pub. L. No. 111-5, 123 Stat 115
(2009); and Section 154(i) of the Communications Act of 1934, as amended.
14.3 Paperwork Reduction Act of 1995 Notice. We have estimated that each Participant of this study will
assume a one hour time burden over the course of the Program. Our estimate includes the time to sign-
up online, connect the Whitebox in the home, and periodic validation of the hardware. If you have any
comments on this estimate, or on how we can improve the collection and reduce the burden it causes
you, please write the Federal Communications Commission, Office of Managing Director, AMD-PERM,
Washington, DC 20554, Paperwork Reduction Act Project (3060-1139). We will also accept your comments
via the Internet if you send an e-mail to PRA@[Link]. Please DO NOT SEND COMPLETED APPLICATION
FORMS TO THIS ADDRESS. You are not required to respond to a collection of information sponsored by
the Federal government, and the government may not conduct or sponsor this collection, unless it
displays a currently valid OMB control number and provides you with this notice. This collection has been
assigned an OMB control number of 3060-1139. THIS NOTICE IS REQUIRED BY THE PAPERWORK
REDUCTION ACT OF 1995, PUBLIC LAW 104-13, OCTOBER 1, 1995, 44 U.S.C. SECTION 3507. This notice
may also be found at [Link]
15. Jurisdiction
These terms and conditions shall be governed by the laws of the state of New York.
SCHEDULE
THE SERVICES
Subject to the Participant complying with its obligations under these terms and conditions, SamKnows
shall use reasonable endeavors to test the Connection so that the following information is recorded:
1. Web browsing
2. Video streaming
3. Voice over IP
4. Download speed
5. Upload speed
6. UDP latency
7. UDP packet loss
8. Consumption
1. SamKnows will perform tests on the Participant's Connection by using SamKnows' own data and will
not monitor the Participant's content or internet activity. The purpose of this study is to measure the
Connection and compare this data with other consumers to create a representative index of US
broadband performance.
WHEREAS the Federal Communications Commission of the United States of America (FCC) is
conducting a Broadband Testing and Measurement Program, with support from its contractor
SamKnows, the purpose of which is to establish a technical platform for the Measuring
Broadband America Program Fixed Broadband Testing and Measurement and further to use
that platform to collect data;
WHEREAS volunteer panelists have been recruited, and in so doing have agreed to provide
broadband performance information measured on their Whiteboxes to support the collection
of broadband performance data; and steps have been taken to protect the privacy of panelists
contributing to the program's effort to measure broadband performance. WE, THE UNDERSIGNED, as
participants and stakeholders in that Fixed Broadband Testing and Measurement, do hereby
agree to be bound by and conduct ourselves in accordance with the following principles and
shall:
Signatories: _____________________
Printed: ______________________
Date: _______________________
August 2013
Important Notice
Limitation of Liability
The information contained in this document is provided for general information purposes only.
While care has been taken in compiling the information herein, SamKnows does not warrant or
represent that this information is free from errors or omissions. To the maximum extent
permitted by law, SamKnows accepts no responsibility in respect of this document and any loss
or damage suffered or incurred by a person for any reason relying on any of the information
provided in this document and for acting, or failing to act, on any information contained on or
referred to in this document.
Copyright
The material in this document is protected by Copyright.
33 Note that Measurement-Lab runs sidestream measurements for all TCP connections against their test nodes and
publishes these data in accordance with their data embargo policy.
SamKnows test nodes reside in major peering locations around the world. Test nodes are
carefully sited to ensure optimal connectivity on a market-by-market basis. SamKnows' test
infrastructure utilizes nodes made available by Level3, Measurement-Lab and various network
operators, as well as under contract with select hosting providers.
Atlanta, Georgia ✓
Chicago, Illinois ✓ ✓
Dallas, Texas ✓ ✓
Miami, Florida ✓
Mountain View, California ✓
Seattle, Washington ✓
Washington, D.C. ✓ ✓
Washington, Virginia ✓
Denver, Colorado ✓
34 In addition to the test nodes used to support the Measuring Broadband America Program, SamKnows utilizes a
diverse fleet of nodes in locations around the globe for other international programs.
SamKnows also has access to many test nodes donated by ISPs around the world. These particular
test nodes reside within individual ISP networks and are therefore considered on-net test nodes.
ISPs have the advantage of measuring to both on-net and off-net test nodes, which allows them
to segment end-to-end network performance and determine the performance of their own
network versus third party networks. For example, an ISP can see what impact third party
networks have on their end-users' Quality of Experience ('QoE') by placing test nodes within their
own network and at major national and international peering locations.
Diagram 1 below shows this set-up.
Both the on-net and off-net test nodes are monitored by SamKnows as part of the global test
node fleet. Test node management is explained in more detail within the next section of this
document.
3 - Test Node Management
SamKnows' test node infrastructure is a critical element of the SamKnows global measurement
platform and is subject to extensive monitoring. SamKnows uses a management tool to
control and configure the test nodes, while the platform is closely scrutinized using the Nagios
monitoring application. System alerts are also in place to ensure the test node infrastructure is
always available and operating well within expected threshold bounds.
The SamKnows Operations team continuously checks all test nodes to monitor capacity and
overall health. Also included is data analysis to safeguard data accuracy and integrity. This level
of oversight not only helps to maintain a healthy, robust platform but also allows us to spot and
flag actual network issues and events as they happen. Diagnostic information also supports the
Program managers’ decision-making process for managing the impact of data accuracy and
integrity incidents. This monitoring and administration is fully separate from any monitoring and
administration of operating systems and platforms that may be necessary by hosting entities with
which SamKnows may be engaged.
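The threshold-based alerting described above can be illustrated with a minimal sketch. The node name, metric names, and threshold values below are invented for illustration; the report does not publish SamKnows' actual monitoring configuration.

```python
# Assumed thresholds for illustration only; SamKnows' real values are not published.
THRESHOLDS = {"cpu_pct": 80.0, "loss_pct": 1.0, "rtt_ms": 100.0}

def check_node(name, metrics, thresholds=THRESHOLDS):
    """Return an alert string for each metric that exceeds its threshold."""
    return [
        f"{name}: {metric}={value} exceeds {thresholds[metric]}"
        for metric, value in metrics.items()
        if metric in thresholds and value > thresholds[metric]
    ]

# A hypothetical node reporting healthy CPU and latency but elevated packet loss.
alerts = check_node("chicago-1", {"cpu_pct": 35.0, "loss_pct": 2.5, "rtt_ms": 40.0})
print(alerts)
```

In a real deployment such checks would feed a monitoring system like Nagios, which layers scheduling, escalation, and notification on top of this kind of pass/fail evaluation.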
SamKnows maintains a standard specification for all test nodes to ensure consistency and
accuracy across the fleet.
SamKnows also has a policy of accepting test nodes provided by network operators, provided that:
• The test node meets the specifications outlined earlier
• A minimum of 1 Gbps of upstream and downstream connectivity to national peering
locations is provided
Please note that donated test nodes may also be subject to additional local requirements.
Tenth
Measuring Broadband America
Fixed Broadband Report
A Report on Consumer Fixed Broadband Performance
in the United States
TABLE OF CONTENTS
2
Tenth Measuring Broadband America Fixed Broadband Report Federal Communications Commission
List of Charts
Chart 1.1: Weighted average advertised download speed among the top 80% service tiers offered by each
ISP .......................................................................................................................................... 11
Chart 1.2: Weighted average advertised download speed among the DSL ISPs ........................................ 11
Chart 2: Weighted average advertised download speed among the top 80% service tiers based on
technology. ............................................................................................................................ 12
Chart 3: Consumer migration to higher advertised download speeds ....................................................... 13
Chart 4: The ratio of weighted median speed (download and upload) to advertised speed for each ISP. 14
Chart 5: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed ..................................... 15
Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed. ............ 16
Chart 7: Latency by ISP................................................................................................................................ 17
Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% to
1%, and greater than 1%. ...................................................................................................... 18
Chart 9: Average webpage download time, by advertised download speed. ............................................ 19
Chart 10.1: Weighted average advertised upload speed among the top 80% service tiers offered by each
ISP. ......................................................................................................................................... 25
Chart 10.2: Weighted average advertised upload speed offered by ISPs using DSL technology. .............. 25
Chart 10.3: Weighted average advertised upload speed offered by ISPs using Cable technology. ........... 26
Chart 11: Weighted average advertised upload speed among the top 80% service tiers based on
technology. ............................................................................................................................ 27
Chart 12.1: The ratio of median download speed to advertised download speed. ................................... 28
Chart 12.2: The ratio of median upload speed to advertised upload speed. ............................................. 28
Chart 13: The percentage of consumers whose median upload speed was (a) greater than 95%, (b)
between 80% and 95%, or (c) less than 80% of the advertised upload speed. ..................... 29
Chart 14.1: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed. ................................................................................................................... 30
Chart 14.2: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed (continued). ................................................................................................ 30
Chart 14.3: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed, by technology. ........................................................................................... 31
Chart 14.4: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed. ........................................................................................................................ 32
3
Tenth Measuring Broadband America Fixed Broadband Report Federal Communications Commission
Chart 14.5: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed (continued). ..................................................................................................... 32
Chart 14.6: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed, by technology. ................................................................................................ 33
Chart 15.1: The ratio of weighted median download speed to advertised download speed, peak hours
versus off-peak hours. ........................................................................................................... 33
Chart 15.2: The ratio of weighted median upload speed to advertised upload speed, peak versus off-peak.
............................................................................................................................................... 34
Chart 16: The ratio of median download speed to advertised download speed, Monday-to-Friday, two-
hour time blocks, terrestrial ISPs. .......................................................................................... 35
Chart 17.1: The ratio of 80/80 consistent upload speed to advertised upload speed. .............................. 36
Chart 17.2: The ratio of 70/70 consistent download speed to advertised download speed. .................... 37
Chart 17.3: The ratio of 70/70 consistent upload speed to advertised upload speed. .............................. 37
Chart 18: Latency for Terrestrial ISPs, by technology, and by advertised download speed....................... 38
Chart 19.1: The ratio of median download speed to advertised download speed, by ISP (1-5 Mbps). ..... 39
Chart 19.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps). ... 40
Chart 19.3: The ratio of median download speed to advertised download speed, by ISP (12-25 Mbps). . 41
Chart 19.4: The ratio of median download speed to advertised download speed, by ISP (30-60 Mbps). . 42
Chart 19.5: The ratio of median download speed to advertised download speed, by ISP (75-100 Mbps). 43
Chart 19.6: The ratio of median download speed to advertised download speed, by ISP (150-200 Mbps).
............................................................................................................................................... 44
Chart 19.7: The ratio of median download speed to advertised download speed, by ISP (250-500 Mbps).
............................................................................................................................................... 45
Chart 20.1: The ratio of median upload speed to advertised upload speed, by ISP (0.768 - 1 Mbps). ...... 46
Chart 20.2: The ratio of median upload speed to advertised upload speed, by ISP (1.5-5 Mbps). ............ 47
Chart 20.3: The ratio of median upload speed to advertised upload speed, by ISP (10 -20 Mbps). .......... 48
Chart 20.4: The ratio of median upload speed to advertised upload speed, by ISP (30-75 Mbps). ........... 49
Chart 20.5: The ratio of median upload speed to advertised upload speed, by ISP (100–200 Mbps). ...... 50
Chart 21.1: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed, by service tier (DSL). .. 53
Chart 21.2: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (cable). ........................ 54
Chart 21.3: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (fiber). ......................... 55
4
Tenth Measuring Broadband America Fixed Broadband Report Federal Communications Commission
Chart 22.1: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (DSL)................................. 55
Chart 22.2: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (cable). ............................. 56
Chart 22.3: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (fiber). .............................. 57
Chart 23.1: Average webpage download time, by ISP (1.5-5 Mbps). ......................................................... 59
Chart 23.2: Average webpage download time, by ISP (6-10 Mbps). .......................................................... 59
Chart 23.3: Average webpage download time, by ISP (12-25 Mbps). ........................................................ 60
Chart 23.4: Average webpage download time, by ISP (30-60 Mbps). ......................................................... 60
Chart 23.5: Average webpage download time, by ISP (75 - 100 Mbps). .................................................... 61
Chart 23.6: Average webpage download time, by ISP (150 - 200 Mbps). .................................................. 62
Chart 23.7: Average webpage download time, by ISP (250 - 500 Mbps). .................................................. 63
List of Tables
Table 1: The most popular advertised service tiers .................................................................................... 10
Table 2: Peak Period Median download speed, by ISP ............................................................................... 50
Table 3: Complementary cumulative distribution of the ratio of median download speed to
advertised download speed by ISP .............................................................................................. 57
Table 4: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed by ISP ..................................................................................................................... 58
5
Tenth Measuring Broadband America Fixed Broadband Report Federal Communications Commission
1. Executive Summary
The Tenth Measuring Broadband America Fixed Broadband Report (“Tenth Report” or “Report”) presents
perspectives on empirical performance for data collected in September and October 20191 from fixed
Internet Service Providers (ISPs), as part of the Federal Communications Commission's (FCC) Measuring
Broadband America (MBA) program. This program is an ongoing, rigorous, nationwide study of consumer
broadband performance in the United States. The goal of this program is to measure the network
performance realized on a representative sample of service offerings and the residential broadband
consumer demographic across the country.2 This representative sample is referred to as the MBA ‘panel’.
Thousands of volunteer panelists are drawn from the subscriber bases of ISPs which collectively serve a
large percentage of the residential marketplace.3
The initial Measuring Broadband America Fixed Broadband Report was published in August 2011,4 and
presented the first broad-scale study of directly measured consumer broadband performance throughout
the United States. As part of an open data program, all methodologies used in the program are fully
documented, and all data collected is published for public use without any restrictions. Including this
current Report, ten reports have now been issued.5 These reports provide a snapshot of fixed broadband
Internet access service performance in the United States utilizing a comprehensive set of performance
metrics. The resulting performance data is analyzed in ways that have evolved over time to make the
information more understandable and useful.
A. MAJOR FINDINGS OF THE TENTH REPORT
The key findings of this report are:
• The maximum advertised download speeds amongst the service tiers offered by ISPs and measured
by the FCC ranged from 24 Mbps to 940 Mbps for the period covered by this report.
1
The actual dates used for measurements for this Tenth Report were September 6 – October 3, 2019 (inclusive) plus
October 8 – 9, 2019 (inclusive). An isolated server outage forced the exclusion of data from October 4 to 7 to avoid
anomalous results.
2
The sample is representative in that it aims to include those tiers that constitute the top 80% of the subscriber base
per ISP. Some tiers accordingly are not included. As with any sample, budget and sample constitution constraints
limit completeness of coverage.
3
At the request of, and with the assistance of, the State of Hawaii Department of Commerce and Consumer Affairs
(DCCA), the state of Hawaii was added to the MBA program in 2017. The ISPs whose performance was measured
in the State of Hawaii were Hawaiian Telcom and Oceanic Time Warner Cable (which is now a part of Charter
Spectrum).
4
All reports can be found at [Link]
5
The First Report (2011) was based on measurements taken in March 2011, the Second Report (2012) on
measurements taken in April 2012, and the Third (2013) through Ninth (2019) Reports on measurements taken in
September of the year prior to the reports’ release dates. In order to avoid confusion between the date of release
of the report and the measurement dates, last year we shifted to numbering the reports. Thus, this year’s
report is termed the Tenth MBA Report instead of the 2020 MBA Report. Going forward, we will continue with the
numbered approach, and the next report will be termed the Eleventh Report.
• The weighted average advertised speed of the participating ISPs was 146.1 Mbps, representing an 8%
increase from the previous year (Ninth Report) and a more than 100% increase from two years prior (Eighth
Report).
• For most of the major broadband providers that were tested, measured download speeds were 100% or
more of advertised speeds during the peak hours (7 p.m. to 11 p.m. local time).
• Ten ISPs were evaluated in this report. Of these, Cincinnati Bell and Frontier employed multiple
broadband technologies across the USA. Overall, 12 different ISP/technology configurations
were evaluated in this report, and eight performed at or better than their advertised speed during the
peak hours. Only one had an actual-to-advertised download speed ratio below 90% during the peak
hours.
• In addition to providing download and upload speed measurements of ISPs, this report also provides
a measure of consistency of measured to advertised speeds of ISPs with the use of our “80/80” metric.
The 80/80 metric measures the percentage of the advertised speed that at least 80% of subscribers
experience at least 80% of the time over peak periods. Ten of the 12 ISP/technology configurations
provide better than 75% of advertised speed to at least 80% of panelists for at least 80% of the time.
These and other findings are described in greater detail within this report.
B. SPEED PERFORMANCE METRICS
Speed (both download and upload) performance continues to be one of the key metrics reported by the
MBA. The data presented includes ISP broadband performance as a median6 of speeds experienced by
panelists within a specific service tier. These reports mainly focus on common service tiers used by an
ISP’s subscribers.7
Additionally, consistent with previous Reports, we also compute average per-ISP performance by
weighting the median speed for each service tier by the number of subscribers in that tier. Similarly, in
calculating the composite average speed taking into account all ISPs in a specific year, the median speed
of each ISP is used and weighted by the number of subscribers of that ISP as a fraction of the total number
of subscribers across all ISPs.
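As a rough illustration of the weighting described above, the following sketch computes a per-ISP weighted average of tier medians in Python. The tier labels, panelist speeds, and subscriber counts are all hypothetical; the program's authoritative procedure is described in the Technical Appendix.

```python
import statistics

def tier_median(panelist_means):
    """Median of the per-panelist mean speeds within one service tier."""
    return statistics.median(panelist_means)

def weighted_isp_speed(tiers):
    """Average of tier medians, weighted by subscriber counts.

    `tiers` maps a tier label to (per-panelist mean speeds in Mbps,
    subscriber count for that tier).
    """
    total_subs = sum(subs for _, subs in tiers.values())
    return sum(tier_median(means) * subs / total_subs
               for means, subs in tiers.values())

# Hypothetical ISP with two tiers (all numbers are illustrative).
tiers = {
    "100 Mbps": ([98.0, 102.0, 101.0], 700_000),
    "200 Mbps": ([195.0, 205.0, 199.0], 300_000),
}
print(round(weighted_isp_speed(tiers), 1))  # -> 130.4
```

The same weighting pattern extends to the cross-ISP composite: each ISP's weighted median is weighted by that ISP's share of total subscribers.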
In calculating these weighted medians, we draw on two sources for determining the number of
subscribers per service tier. ISPs may voluntarily contribute subscription demographics per surveyed
service tier as the most recent and authoritative data. Many ISPs have chosen to do so.8 When such
6
We first determine the mean value over all the measurements for each individual panelist’s “whitebox.” (Panelists
are sent “whiteboxes” that run pre-installed software on off-the-shelf routers that measure thirteen broadband
performance metrics, including download speed, upload speed, and latency.) Then for each ISP’s speed tiers, we
choose the median of the set of mean values for all the panelists/whiteboxes. The median is that value separating
the top half of values in a sample set from the lower half of values in that set; it can be thought of as the middle (i.e.,
most typical) value in an ordered list of values. For calculations involving multiple speed tiers, we compute the
weighted average of the medians for each tier. The weightings are based on the relative subscriber numbers for the
individual tiers.
7
Only tiers that contribute to the top 80% of an ISP’s total subscribership are included in this report.
8
The ISPs that provided SamKnows, the FCC’s contractor supporting the MBA program, with weights for each of
their tiers were: Cincinnati Bell, CenturyLink, Charter, Comcast, Cox, Frontier, Optimum, and Windstream.
information has not been provided by an ISP, we instead rely on the FCC’s Form 477 data.9 All facilities-
based broadband providers are required to file data with the FCC twice a year (Form 477) regarding
deployment of broadband services, including subscriber counts. For this report, we used the June 2019
Form 477 data. It should be noted that the Form 477 subscriber data values generally lag the reporting
month, and therefore, there are likely to be small inaccuracies in the tier ratios. It is for this reason that
we encourage ISPs to provide us with subscriber numbers for the measurement month.
As in our previous reports, we found that for most ISPs the actual speeds experienced by subscribers
either nearly met or exceeded advertised service tier speeds. However, since we started our MBA
program, consumers have changed their Internet usage habits. In 2011, consumers mainly browsed the
web and downloaded files; thus, we reported mean broadband speeds since these statistics were likely
to closely mirror user experience. By contrast, in September-October 2019 (the measurement period for
this report) consumer internet usage had become dominated by video consumption, with consumers
regularly streaming video for entertainment and education.10 We have therefore expanded our network
performance analytics with consistency-of-service metrics to better capture this shift in usage
patterns. Together, the median measured speed metric and the consistency-of-service metrics better
reflect consumers’ perception of the usefulness of their Internet access service.
Specifically, we use two kinds of metrics to reflect the consistency of service delivered to the consumer:
First, we report the percentage of advertised speed experienced by at least 80% of panelists during at
least 80% of the daily peak usage period (“80/80 consistent speed” measure). Second, we show the
fraction of consumers who obtain median speeds greater than 95%, between 80% and 95%, and less than
80% of advertised speeds.
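The second consistency metric can be sketched as a simple binning of per-panelist ratios. The sample ratios below are hypothetical, and the handling of values falling exactly on the 80% and 95% boundaries is an assumption.

```python
def consistency_bins(median_ratios):
    """Count panelists whose (median measured speed / advertised speed)
    falls above 95%, between 80% and 95%, or below 80%."""
    hi = sum(r > 0.95 for r in median_ratios)
    lo = sum(r < 0.80 for r in median_ratios)
    mid = len(median_ratios) - hi - lo
    return hi, mid, lo

# Hypothetical ratios for ten panelists of one ISP.
ratios = [1.02, 0.99, 0.97, 0.96, 0.93, 0.90, 0.85, 0.82, 0.75, 0.60]
hi, mid, lo = consistency_bins(ratios)
print(f"{hi/10:.0%} above 95%, {mid/10:.0%} in 80-95%, {lo/10:.0%} below 80%")
```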
C. USE OF OTHER PERFORMANCE METRICS
Although download and upload speeds remain the network performance metrics of greatest interest to
consumers, we also spotlight two other key network performance metrics in this report: latency and
packet loss. These metrics can significantly affect the overall quality of Internet applications.
Latency is the time it takes for a data packet to travel across a network from one point on the network to
another. High latencies may affect the perceived quality of some interactive services such as phone calls
over the Internet, video chat and video conferencing, or online multiplayer games. All network access
technologies have a minimum latency that is largely determined by the technology itself. Many other
factors also affect latency, including the location of the server being communicated with, the
route taken to the server, and whether or not there is any congestion on that route. Technology-
9
For an explanation of Form 477 filing requirements and required data see:
[Link] (Last accessed 8/10/2020).
10
“It is important to track the changing mix of devices and connections and growth in multidevice ownership as it
affects traffic patterns. Video devices, in particular, can have a multiplier effect on traffic. An Internet-enabled HD
television that draws a couple to three hours of content per day from the Internet would generate as much Internet
traffic as an entire household today, on average. The video effect of these devices on traffic is more pronounced
because of the introduction of Ultra-High-Definition (UHD), or 4K, video streaming. This technology has such an
effect because the bit rate for 4K video, at about 15 to 18 Mbps, is more than double the HD video bit rate and nine
times the Standard-Definition (SD) video bit rate. We estimate that by 2023, two-thirds (66 percent) of the
installed flat-panel TV sets will be UHD, up from 33 percent in 2018.” See Cisco Annual Internet Report (2018-2023)
White Paper, [Link]
vni/[Link] (Last accessed Aug. 8, 2020).
dependent latencies are typically small for terrestrial broadband services and are thus unlikely to affect
the perceived quality of applications. Additionally, for certain applications the user experience is not
necessarily affected by high latencies. For example, entertainment video streaming applications can
buffer data before display, so the user experience is likely to be unaffected by relatively high
latencies.
Packet loss measures the fraction of data packets sent that fail to be delivered to the intended destination.
Packet loss may affect the perceived quality of applications that do not incorporate retransmission of lost
packets, such as phone calls over the Internet, video chat, some online multiplayer games, and some video
streaming. High packet loss also degrades the achievable throughput of download and streaming
applications. However, packet loss of a few tenths of a percent is common and is unlikely to significantly
affect the perceived quality of most Internet applications. During network congestion, both
latency and packet loss typically increase.
The Internet continually evolves in its architecture, performance, and services. Accordingly, we will
continue to adapt our measurement and analysis methodologies to further improve the collective
understanding of performance characteristics of broadband Internet access. By doing so we aim to help
the community of interest across the board, from consumers to technologists, service providers and
regulators.
Table 1: The most popular advertised service tiers

Technology  Company                Speed Tiers (Download, Mbps)   Speed Tiers (Upload, Mbps)
DSL         CenturyLink            1.5 3 7 8* 10 12 20 25* 40     0.512* 0.768 0.896 2 5 10*
DSL         Cincinnati Bell DSL    5 30*                          0.768 3*
DSL         Frontier DSL           3 6 12 24*                     0.768 1 1.5*
DSL         Windstream             3 6 10 12* 15* 25 50* 100*     0.768* 1 1.5 4*
Cable       Altice Optimum         100 200 300*                   35
Cable       Charter                100 200 400                    10 20
Cable       Comcast                60 150 250                     5 10
Cable       Cox                    30 100* 150* 300               3 10 30
Cable       Mediacom               60 100 200                     5 10 20
Fiber       Cincinnati Bell Fiber  50 250 500                     10 100 125
Fiber       Frontier Fiber         50 75 100 150 200              50 75 100 150 200
Fiber       Verizon Fiber          50* 75 100 940**               50* 75 100 880**
*Tiers that lack sufficient panelists to meet the program’s target sample size.
** Although Verizon Fiber’s 940/880 Mbps service tier was amongst the top 80% of Verizon’s offered
tiers by subscription numbers, it is not included in the report charts because technical methodologies for
measuring high speed rates near Gigabit and above have not yet been established for the MBA program.
Chart 1.1 (below) displays the weighted (by subscriber numbers) mean of the top 80% advertised
download speed tiers for each participating ISP for the last three years (September 2017 to September-
October 2019) grouped by the access technology used to offer the broadband Internet access service (DSL,
cable, or fiber). It should be noted that this chart does not reflect the actual performance of the ISPs and
only provides the weighted average of the ISPs’ advertised speeds. In September-October 2019, the
weighted average advertised download speed was 146.1 Mbps among the measured ISPs, which
represents a 100% increase from 2017 and an 8% increase compared to the average in September-October
2018, which was 135.7 Mbps.11
11
Please note that this average for September-October 2018 and September 2017 represents the average advertised
download speed with AT&T tiers removed. We did this to have a fairer comparison between the years since AT&T
is no longer an active participant in the MBA program. The actual weighted average advertised download speed
(with AT&T included) for September-October 2018, as reported in the Ninth MBA Report is 123.3 Mbps.
Chart 1.1: Weighted average advertised download speed among the top 80% service tiers offered by each
ISP
All of the ISPs, except Verizon, showed higher weighted averages of advertised speeds in September-
October 2019 as compared to September 2018. Verizon Fiber showed a slight decrease in 2019 compared
to 2018; this was not due to any reduction in its service speed offerings but arose from changes in
weighting caused by relative shifts in subscriber numbers across the advertised tiers from 2018 to 2019.
It can be seen from Chart 1.1 that DSL speeds lag far behind the speeds of other technologies. To better
compare the DSL speed offerings of the various ISPs, we have added a separate Chart 1.2 drawn
to a scale that makes their relative speeds more discernible.
Chart 1.2: Weighted average advertised download speed among the DSL ISPs
Among participating broadband ISPs, only Cincinnati Bell, Frontier, and Verizon use fiber as the access
technology for a substantial number of their customers and their maximum speed offerings range from
200 Mbps to 940 Mbps. A key difference between the fiber providers and the other technology providers is that,
with the exception of Cincinnati Bell, fiber providers advertise generally symmetric upload and
download speeds. This is in sharp contrast to the asymmetric offerings of all the other technologies,
where the advertised upload speeds are typically 5 to 10 times lower than the advertised download speeds.
As can be seen in Chart 1.1, there is considerable difference among the weighted average advertised speed
tiers offered by each technology. Chart 2 plots the weighted average of the top 80% ISP tiers by technology for the last
three years.12 As can be seen in this chart, most technologies showed increases in the set of advertised
download speeds by ISPs. For the September-October 2019 period, the weighted mean advertised speeds
for DSL technology was 13 Mbps which lagged considerably behind the weighted mean advertised
download speeds for cable and fiber technologies, which were 155 Mbps and 208 Mbps, respectively.
Fiber technology showed the greatest increase in speed offerings in 2019 compared to 2017, with the
weighted mean going up from 70 Mbps to 208 Mbps, a nearly 200% increase. This year’s
(2019) average advertised speed for fiber, however, decreased by 17% from last year’s
(2018) figure. DSL technology speed increased from 11 Mbps to 13 Mbps from 2017 to 2019, a 16%
increase overall (though it showed a small 1% decrease this year compared to last year). In
comparison, cable technology showed a 12% increase from 2018 to 2019 and an overall 83% increase
from 2017 to 2019.
Chart 2: Weighted average advertised download speed among the top 80% service tiers based on
technology.
Chart 3 plots the migration of panelists to a higher service tier based on their access technology.13
Specifically, the horizontal axis of Chart 3 partitions the September 2018 panelists by the advertised
download speed of the service tier to which they were subscribed. For each such set of panelists who
12
Since AT&T is no longer actively participating in the Measuring Broadband America program, we have removed it
from previous years’ results in Charts 1 and 2. This allows a proper comparison to be made between the results for
this year as compared to previous years. It should also be noted that although AT&T IPBB had been characterized
in previous reports as a DSL technology it actually included a mix of ADSL2+, VDSL2, [Link] and Ethernet technologies
delivered over a hybrid of fiber optic and copper facilities.
13
Where several technologies are plotted at the same point in the chart, this is identified as “Multiple Technologies.”
also participated in the September-October 2019 collection of data,14 the vertical axis of Chart 3 displays
the percentage of panelists that migrated by September-October 2019 to a service tier with a higher
advertised download speed. There are two ways that such a migration could occur: (1) if a panelist
changed their broadband plan during the intervening year to a service tier with a higher advertised
download speed, or (2) if a panelist did not change their broadband plan but the panelist’s ISP increased
the advertised download speed of the panelist’s subscribed plan.15
Chart 3 shows that the percentage of panelists subscribed in September-October 2018 who moved to
higher tiers in September-October 2019 ranged from 3% to 26% for DSL subscribers, 4% to 100% for cable
subscribers, and 16% to 50% for fiber subscribers. In addition, 1% to 8% of subscribers migrated to a higher
speed tier using a different technology from the one they had in September 2018.
14
Of the 5,855 panelists who participated in the September 2018 collection of data, 4,246 panelists continued to
participate in the September-October 2019 collection of data.
15
We do not attempt here to distinguish between these two cases.
advertised speeds. However, DSL broadband ISPs continue to advertise “up-to” speeds that on average
exceed the actual speeds experienced by their subscribers. Out of the 12 ISP/technology configurations
shown, eight met or exceeded their advertised download speed, and three more reached at least 90% of their
advertised download speed. Only Cincinnati Bell DSL (at 79%) performed below 90% of its advertised
download speed.
Chart 4: The ratio of weighted median speed (download and upload) to advertised speed for each ISP.
C. VARIATIONS IN SPEEDS
As discussed earlier, actual speeds experienced by individual consumers may vary by location and time of
day. Chart 5 shows, for each ISP, the percentage of panelists who experienced a median download speed
(averaged over the peak usage period during our measurement period) that was greater than 95%,
between 80% and 95%, or less than 80% of the advertised download speed.
Chart 5: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed
ISPs using DSL technology had between 2% and 69% of their subscribers receiving at least 95% of their
advertised download speeds during peak hours. ISPs using cable technology and fiber
technology had between 93% and 99%, and between 65% and 97%, respectively, of their subscribers receiving
at least 95% of their advertised download speeds.
Though the median download speeds experienced by most ISPs’ subscribers nearly met or exceeded the
advertised download speeds, there are some customers of each ISP for whom the median download
speed fell significantly short of the advertised download speed. Relatively few subscribers of cable service
experienced this. The best performing ISPs, when measured by this metric, are Charter, Comcast, Cox,
Mediacom, Optimum, Frontier-Fiber and Verizon-Fiber; more than 80% of their panelists were able to
attain an actual median download speed of at least 95% of the advertised download speed.
In addition to variations based on a subscriber’s location, speeds experienced by a consumer may
fluctuate during the day. This is typically caused by increased traffic demand and the resulting stress on
different parts of the network infrastructure. To examine this aspect of performance, we use the term
“80/80 consistent speed.” This metric is designed to assess temporal and spatial variations in measured
values of a user’s download speed.16 While consistency of speed is in itself an intrinsically valuable service
characteristic, its impact on consumers will hinge on variations in usage patterns and needs. As an
example, a good consistency of speed measure is likely to indicate a higher quality of service experience
for internet users consuming video content.
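One plausible way to compute the 80/80 consistent-speed ratio is via nested 20th percentiles: the speed a panelist attains at least 80% of the time is the 20th percentile of that panelist's peak-period tests, and the speed attained by at least 80% of panelists is the 20th percentile across panelists. The nearest-rank percentile method and the sample data below are assumptions; the Technical Appendix gives the program's exact definition.

```python
import math

def pct20(values):
    """20th percentile via the nearest-rank method (an assumption;
    the program's exact percentile definition is in the Technical
    Appendix)."""
    s = sorted(values)
    return s[math.ceil(0.2 * len(s)) - 1]

def consistent_8080_ratio(speeds_by_panelist, advertised):
    """Speed achieved at least 80% of the time by at least 80% of
    panelists, expressed as a fraction of the advertised speed."""
    per_panelist = [pct20(tests) for tests in speeds_by_panelist]
    return pct20(per_panelist) / advertised

# Hypothetical peak-period download tests (Mbps) for five panelists
# on an advertised 100 Mbps tier.
panel = [
    [101.0, 99.0, 98.0, 100.0, 97.0],
    [95.0, 93.0, 96.0, 94.0, 92.0],
    [90.0, 88.0, 91.0, 89.0, 87.0],
    [85.0, 83.0, 86.0, 84.0, 82.0],
    [70.0, 65.0, 72.0, 68.0, 66.0],
]
print(f"{consistent_8080_ratio(panel, 100.0):.2f}")  # -> 0.65
```

Because the metric discards the best 20% of measurements and the best 20% of panelists, it is always at or below the simple median ratio, which is why the 80/80 bars in Chart 6 never exceed the median bars.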
Chart 6 summarizes, for each ISP, the ratio of 80/80 consistent median download speed to advertised
download speed, and, for comparison, the ratio of median download speed to advertised download speed
shown previously in Chart 4. The ratio of 80/80 consistent median download speed to advertised
download speed is less than the ratio of median download speed to advertised download speed for all
participating ISPs due to congestion periods when median download speeds are lower than the overall
average. When the difference between the two ratios is small, the median download speed is fairly
insensitive to both geography and time. When the difference between the two ratios is large, there is a
16
For a detailed definition and discussion of this metric, please refer to the Technical Appendix.
greater variability in median download speed, either across a set of different locations or across different
times during the peak usage period at the same location.
Chart 6: The ratio of 80/80 consistent median download speed to advertised download speed.
Customers of Charter, Comcast, Cox, Mediacom, and Optimum experienced median download speeds that
were very consistent; i.e., they provided greater than 100% of the advertised speed during peak usage
period to more than 80% of panelists for more than 80% of the time. As can be seen in Chart 6, cable and
fiber ISPs performed better than DSL ISPs with respect to their 80/80 consistent speeds. For example, for
September-October 2019, the 80/80 consistent download speed for Cincinnati Bell DSL was 46% of the
advertised speed.
D. LATENCY
The latency between any two points in the network is the time it takes for a packet to travel from one
point to the other. It has a fixed component that depends on the distance, the transmission speed, and
transmission technology between the source and destination, and a variable component due to queuing
delay that increases as the network path congests with traffic. The MBA program measures latency by
measuring the round-trip time between the consumer’s home and the closest measurement server.
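A minimal sketch of a round-trip-time measurement over UDP is shown below. A loopback echo thread stands in for a real measurement server, and the probe format is invented for illustration; the MBA latency methodology itself is documented in the Technical Appendix.

```python
import socket
import threading
import time

def echo_once(sock):
    """Single-shot echo server standing in for a measurement server."""
    data, addr = sock.recvfrom(64)
    sock.sendto(data, addr)

def measure_rtt_ms(server_addr, timeout=3.0):
    """Time one UDP probe's round trip, in milliseconds."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        start = time.perf_counter()
        s.sendto(b"probe", server_addr)
        s.recvfrom(64)
        return (time.perf_counter() - start) * 1000.0

srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("127.0.0.1", 0))
threading.Thread(target=echo_once, args=(srv,), daemon=True).start()
rtt = measure_rtt_ms(srv.getsockname())
print(f"round-trip time: {rtt:.2f} ms")
```

In practice many probes would be sent and summarized by their median, which is the statistic reported in Chart 7.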
Chart 7 shows the median latency for each participating ISP. In general, higher-speed service tiers have
lower latency, as it takes less time to transmit each packet. The median latencies ranged from 10 ms to
27 ms in our measurements (with the exception of CenturyLink DSL and Cincinnati Bell DSL, which had
median latencies of 40 ms and 34 ms, respectively).
DSL latencies (between 11 ms and 40 ms) were slightly higher than those for cable (13 ms to 27 ms). Fiber
ISPs showed the lowest latencies (10 ms to 12 ms). The differences in median latencies among terrestrial-
based broadband services are relatively small and are unlikely to affect the perceived quality of highly
interactive applications.
E. PACKET LOSS
Packet loss is the percentage of packets that are sent by a source but not received at the intended
destination. The most common causes of packet loss are congestion leading to buffer overflows or active
queue management along the network path. Alternatively, high latency might lead to a packet being
counted as lost if it does not arrive within a specified window. A small amount of packet loss is expected,
and indeed packet loss is commonly used by some Internet protocols such as TCP to infer Internet
congestion and to adjust the sending rate to mitigate the offered load, thus lessening the contribution to
congestion and the risk of lost packets. The MBA program uses an active UDP-based packet loss
measurement method and considers a packet lost if it is not returned within 3 seconds.
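The probe-and-timeout approach can be sketched as follows. A loopback echo server with deterministic, simulated loss stands in for the real network path; the probe format and loss pattern are invented for illustration, while the 3-second default timeout mirrors the window described above.

```python
import socket
import threading

def lossy_echo_server(sock, n_probes, drop_every=5):
    """Echo server that silently drops every `drop_every`-th probe,
    simulating packet loss on the path."""
    for i in range(n_probes):
        data, addr = sock.recvfrom(64)
        if (i + 1) % drop_every != 0:
            sock.sendto(data, addr)

def measure_loss(server_addr, n_probes, timeout=3.0):
    """Send sequenced UDP probes; count a probe as lost if no reply
    arrives within the timeout window."""
    lost = 0
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        for seq in range(n_probes):
            s.sendto(seq.to_bytes(4, "big"), server_addr)
            try:
                s.recvfrom(64)
            except socket.timeout:
                lost += 1
    return lost / n_probes

srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("127.0.0.1", 0))
threading.Thread(target=lossy_echo_server, args=(srv, 10), daemon=True).start()
# A short timeout keeps this demonstration fast; the program uses 3 s.
loss = measure_loss(srv.getsockname(), n_probes=10, timeout=0.5)
print(f"packet loss: {loss:.0%}")  # 2 of 10 probes dropped -> 20%
```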
Chart 8 shows the average peak-period packet loss for each participating ISP, grouped into bins. We have
broken the packet loss performance into three bands, allowing a more granular view of the packet loss
performance of the ISP network. The breakpoints for the three bins used to classify packet loss have been
chosen with an eye towards balancing commonly accepted packet loss thresholds for specific services and
provider packet loss Service Level Agreements (SLAs) for enterprise services, as consumer offerings are
not typically accompanied by SLAs. Specifically, the 1% standard for packet loss is commonly accepted as
the point at which highly interactive applications such as VoIP experience significant degradation in quality
according to industry publications and international (ITU) standards.17 The 0.4% breakpoint was chosen
as a middle ground between the highly desirable performance of 0% packet loss described in many
documents (for Voice over Internet Protocol (VoIP)) and the 1% unacceptable limit on the high side. The
specific value of 0.4% is also generally supported by major ISP SLAs for network performance. Indeed,
17
See: [Link]
most SLAs support 0.1% to 0.3% packet loss guarantees,18 but these are generally for enterprise level
services which entail business-critical applications that require some service guarantees.
Chart 8: Percentage of consumers whose peak-period packet loss was less than 0.4%, between 0.4% to
1%, and greater than 1%.
Chart 8 shows that ISPs using fiber technology have the lowest packet loss, and that ISPs using DSL
technology tend to have the highest packet loss. As shown in this chart, 6% to 21% of DSL subscribers
experience 1% or greater packet loss. The corresponding numbers for cable and fiber are 0% to 5% and
0% to 1.5%, respectively. Within a given technology class, packet loss also varies among ISPs.
F. WEB BROWSING PERFORMANCE
The MBA program also conducts a specific test to gauge web browsing performance. The web browsing
test accesses nine popular websites that include text and images, but not streaming video. The time
required to download a webpage depends on many factors, including the consumer’s in-home network,
the download speed within an ISP’s network, the web server’s speed, congestion in other networks
outside the consumer’s ISP’s network (if any), and the time required to look up the network address of
the webserver. Only some of these factors are under the control of the consumer’s ISP. Chart 9 displays the
average webpage download time as a function of the advertised download speed. As shown by this chart,
webpage download time decreases as download speed increases, from about 9.8 seconds at 1.5 Mbps
download speed to about 1.5 seconds for 25 Mbps download speed. Subscribers to service tiers exceeding
25 Mbps experience slightly smaller webpage download times decreasing to 1 – 1.25 seconds at 150 Mbps.
Beyond 150 Mbps, webpage download times decrease only by minor amounts. These download times
assume that only a single user is using the Internet connection when the webpage is downloaded, and
do not account for the more common scenario in which multiple users within a household
simultaneously use the Internet connection for viewing web pages as well as other applications such as
real-time gaming or video streaming.
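A toy version of such a timing test is sketched below against a local HTTP server. The real MBA test fetches nine popular websites, including text and images, so this single-page fetch over loopback is only illustrative of the timing mechanics.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A small stand-in page; real pages include text and image resources.
PAGE = b"<html><body>" + b"x" * 10_000 + b"</body></html>"

class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demonstration quiet

def page_download_time(url):
    """Seconds to fetch and fully read one page."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

server = HTTPServer(("127.0.0.1", 0), PageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
elapsed = page_download_time(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(f"page download time: {elapsed * 1000:.1f} ms")
```

A real test would average such timings across the nine target sites and many test runs, which is how Chart 9's per-tier averages are produced.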
18
See: [Link] and [Link]
3. Methodology
A. PARTICIPANTS
Ten ISPs actively participated in the Fixed MBA program in September-October 2019.19 They were:
• CenturyLink
• Charter Communications
• Cincinnati Bell
• Comcast
• Cox Communications
• Frontier Communications Company
• Mediacom Communications Corporation
• Optimum
• Verizon
• Windstream Communications
The methodologies and assumptions underlying the measurements described in this Report are reviewed
at meetings that are open to all interested parties and documented in public ex parte letters filed in
GN Docket No. 12-264. Policy decisions regarding the MBA program were discussed at these meetings
prior to adoption, and involved issues such as inclusion of tiers, test periods, mitigation of operational
issues affecting the measurement infrastructure, and terms-of-use notifications to panelists. Participation
in the MBA program is open and voluntary. Participants include members of academia, consumer
equipment vendors, telecommunications vendors, network service providers, consumer policy groups, as
well as our contractor for this project, SamKnows. In 2019-2020, participants at these meetings
(collectively and informally referred to as “the broadband collaborative”) included all eleven participating
ISPs and the following additional organizations:
• Level 3 Communications (“Level 3”), now part of CenturyLink
• Massachusetts Institute of Technology (“MIT”)
• Measurement Lab (M-Lab)
• StackPath
• NCTA – The Internet & Television Association (“NCTA”)
• New America Foundation
• Princeton University
• United States Telecom Association (“US Telecom”)
• University of California - Santa Cruz
Participants have contributed in important ways to the integrity of this program and have provided
valuable input to FCC decisions for this program. Initial proposals for test metrics and testing platforms
were discussed and critiqued within the broadband collaborative. M-Lab and Level 3 contributed their
core network testing infrastructure, and both parties continue to provide invaluable assistance in helping
to define and implement the FCC testing platform. We thank all the participants for their continued
contributions to the MBA program.
19
While Hawaiian Telcom participated in the Fixed MBA program, we did not report on it because we did not have
a sufficient number of panelists on Hawaiian Telcom tiers to produce a statistically valid dataset.
B. MEASUREMENT PROCESS
The measurements that provided the underlying data for this report were conducted between MBA
measurement clients and MBA measurement servers. The measurement clients (i.e., whiteboxes) were
situated in the homes of 6,006 panelists, each of whom received service from one of the 11 evaluated ISPs.
The evaluated ISPs collectively accounted for over 80% of U.S. residential broadband Internet
connections. After the measurement data was processed (as described in greater detail in the Technical
Appendix), test results from 3,075 panelists were used in this report.
The measurement servers used by the MBA program were hosted by StackPath, M-Lab, and Level 3
Communications, and were located in thirteen cities (often with multiple locations within each city) across
the United States near a point of interconnection between the ISP’s network and the network on which
the measurement server resided.
The measurement clients collected data throughout the year, and this data is available as described
below. However, only data collected from September 6 – October 3, 2019 (inclusive) plus October 8 – 9,
2019 (inclusive), referred to throughout this report as the “September-October 2019” reporting period,
were used to generate the charts in this Report.20
Broadband performance varies with the time of day. At peak hours, more people tend to use their
broadband Internet connections, giving rise to a greater potential for network congestion and degraded
user performance. Unless otherwise stated, this Report focuses on performance during peak usage
period, which is defined as weeknights between 7:00 p.m. and 11:00 p.m. local time at the subscriber’s
location. Focusing on peak usage period provides the most useful information because it demonstrates
what performance users can expect when the Internet in their local area experiences the highest demand
from users.
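The peak-period definition above can be expressed as a simple filter. The sketch below is illustrative only; the record layout and the sample values are hypothetical and are not the MBA data schema:

```python
from datetime import datetime

PEAK_START, PEAK_END = 19, 23  # 7:00 p.m. to 11:00 p.m. local time

def is_peak(local_time: datetime) -> bool:
    """True if a measurement falls in the weeknight peak usage period."""
    is_weeknight = local_time.weekday() < 5  # Monday=0 ... Friday=4
    return is_weeknight and PEAK_START <= local_time.hour < PEAK_END

# Hypothetical records: (local timestamp, measured download speed in Mbps)
records = [
    (datetime(2019, 9, 10, 20, 15), 94.2),  # Tuesday 8:15 p.m. -> peak
    (datetime(2019, 9, 14, 20, 15), 95.0),  # Saturday -> excluded (weekend)
    (datetime(2019, 9, 10, 14, 0), 98.1),   # Tuesday 2:00 p.m. -> off-peak
]
peak_speeds = [speed for ts, speed in records if is_peak(ts)]
```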
Our methodology focuses on the network performance of each of the participating ISPs. The metrics
discussed in this Report are derived from active measurements, i.e., test-generated traffic flowing
between a measurement client, located within the modem/router within a panelist’s home, and a
measurement server, located outside the ISP’s network. For each panelist, the tests automatically choose
the measurement server that has the lowest latency to the measurement client. Thus, the metrics
measure performance along the path followed by the measurement traffic within each ISP’s network,
through a point of interconnection between the ISP’s network and the network on which the chosen
measurement server is located. However, the service performance that a consumer experiences could
differ from our measured values for several reasons.
First, as noted, in the course of each test instance we measure performance only to a single measurement
server rather than to multiple servers. This is consistent with the approach chosen by most network
measurement tools. As a point of comparison, a typical web page may load its content from many
different endpoints.
20 This time period avoids the dates in early September when parts of North Carolina and Florida were affected
by Hurricanes Florence and Michael. It also avoids the increased traffic resulting from the latest iOS release,
which also took place in early September. Omitting dates during these periods is consistent with the FCC’s
data collection policy for fixed MBA data. See FCC, Measuring Fixed Broadband, Data Collection Policy,
[Link] (explaining that the FCC has developed policies to deal with impairments in the data collection process
with potential impact on the validity of the data collected).
In addition, bottlenecks or congestion points in the full path traversed by consumer application traffic
might also impact a consumer’s perception of Internet service performance. These bottlenecks may exist
at various points: within the ISP’s network, beyond its network (depending on the network topology
encountered en route to the traffic destination), in the consumer’s home, on the Wi-Fi used to access the
in-home access router, or from a shortfall of capacity at the far end point being accessed by the
application. The MBA tests explore how a service performs from the point at which a fixed ISP’s Internet
service is delivered to the home on fixed infrastructure (deliberately excluding Wi-Fi, due to the many
confounding factors associated with it) to the point at which the test servers are located. As MBA tests
are designed to focus on the access to the ISP’s network, they will not include phenomena at most
interconnection points or transit networks that consumer traffic may traverse.
To the extent possible,21 the MBA focuses on performance within an ISP’s network. The overall
performance a consumer experiences can also be affected by congestion arising at other points in the
path taken by consumer traffic (e.g., in-home Wi-Fi, peering points, or transit networks), but such
congestion is not reflected in MBA measurements.
A consumer’s home network, rather than the ISP’s network, may be the bottleneck with respect to
network congestion. We measure the performance of the ISP’s service delivered to the consumer’s home
network, but this service is often shared simultaneously among multiple users and applications within the
home. In-home networks, which typically include Wi-Fi, may not have sufficient capacity to support
peak loads.22
In addition, consumers’ experience of ISP performance is manifested through the set of applications they
utilize. The overall performance of an application depends not only on the network performance (i.e.,
raw speed, latency, or packet loss), but also on the application’s architecture and implementation and on
the operating system and hardware on which it runs. While network performance is considered in this
Report, application performance is generally not.
C. MEASUREMENT TESTS AND PERFORMANCE METRICS
This Report is based on the following measurement tests:
• Download speed: This test measures the download speed of each whitebox over a 10-second
period, once per hour during peak hours (7 p.m. to 11 p.m.) and once during each of the following
periods: midnight to 6 a.m., 6 a.m. to noon, and noon to 6 p.m. The download speed
measurement results from each whitebox are then averaged across the measurement month;
21 The MBA program uses test servers that are neutral (i.e., operated by third parties rather than owned or
operated by the ISPs) and located as close as practical, in terms of network topology, to the boundaries of the ISP networks
under study. As described earlier in this section, a maximum of two interconnection points and one transit network
may be on the test path. If there is congestion on such paths to the test server, it may impact the measurement,
but the cases where it does so are detectable by the test approach followed by the MBA program, which uses
consistent longitudinal measurements, comparisons with control servers located on-net and trend analyses of
averaged results. Details of the methodology used in the MBA program are given in the Technical Appendix to this
report.
22 Independent research, drawing on the FCC’s MBA test platform, suggests that home networks are a significant
source of end-to-end service congestion. See Srikanth Sundaresan et al., Home Network or Access Link? Locating
Last-Mile Downstream Throughput Bottlenecks, PAM 2016 - Passive and Active Measurement Conference, at 111-
123 (Mar. 2016). Numerous instances of research supported by the fixed MBA test platform are described at
[Link]
and the median value for these average speeds across the entire set of whiteboxes on a given tier
is used to determine the median measured download speed for that tier. The overall ISP
download speed is computed as the weighted median for each service tier, using the subscriber
counts for the tiers as weights.
• Upload speed: This test measures the upload speed of each whitebox over a 10-second period,
which is the same measurement interval as the download speed. The upload speed measured in
the last five seconds of the 10-second interval is retained, the results of each whitebox are then
averaged over the measurement period, and the median value for the average speed taken over
the entire set of whiteboxes is used to determine the median upload speed for a service tier. The
ISP upload speed is computed in the same manner as the download speed.
• Latency and packet loss: These tests measure the round-trip times for approximately 2,000
packets per hour sent at randomly distributed intervals. Response times less than three seconds
are used to determine the mean latency. If the whitebox does not receive a response within three
seconds, the packet is counted as lost.
• Web browsing: The web browsing test measures the total time it takes to request and receive
webpages, including the text and images, from nine popular websites and is performed once every
hour. The measurement includes the time required to resolve the web server’s name (URL) into
its network (IP) address.
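The tier-weighting step described in the download and upload speed tests above amounts to a weighted median, with subscriber counts as weights. A minimal sketch follows, using hypothetical tier medians and subscriber counts; tie-breaking conventions in the actual MBA processing may differ:

```python
def weighted_median(values, weights):
    """Weighted median: the value at which cumulative weight first
    reaches half of the total weight."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for value, weight in pairs:
        cum += weight
        if cum >= total / 2:
            return value
    return pairs[-1][0]

# Hypothetical per-tier median speeds (Mbps) and subscriber counts for one ISP
tier_medians = [24.8, 52.1, 101.3]
subscribers = [500, 1500, 1000]
isp_speed = weighted_median(tier_medians, subscribers)
```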
This Report focuses on three key performance metrics of interest to consumers of broadband Internet
access service, as they are likely to influence how well a wide range of consumer applications work:
download and upload speed, latency, and packet loss. Download and upload speeds are also the primary
network performance characteristics advertised by ISPs. However, as discussed above, the performance
observed by a user in any given circumstance depends not only on the actual speed of the ISP’s network,
but also on the performance of other parts of the Internet and on that of the application itself.
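The latency and packet-loss computation described above (mean round-trip time over responsive packets, with a three-second timeout marking a packet as lost) can be sketched as follows; the RTT samples are hypothetical:

```python
TIMEOUT_S = 3.0  # responses slower than this count as lost packets

def latency_and_loss(rtts_s):
    """Mean latency (seconds) over responsive packets, and packet loss
    rate, from a list of round-trip times; None marks no response."""
    received = [r for r in rtts_s if r is not None and r < TIMEOUT_S]
    lost = len(rtts_s) - len(received)
    mean_latency = sum(received) / len(received) if received else None
    return mean_latency, lost / len(rtts_s)

# Hypothetical RTT samples: three responses and one timeout
samples = [0.020, 0.025, 0.030, None]
latency, loss = latency_and_loss(samples)
```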
The standard speed tests use TCP with eight concurrent TCP sessions. In 2017, we also introduced a
lightweight, less data-intensive throughput test, which generates less traffic and runs less frequently,
thereby placing less strain on consumer accounts that are subject to data caps. The lightweight tests are
used exclusively to provide broadband performance results for satellite ISPs. The Technical Appendix to
this Report describes each test in more detail, including additional tests not covered in this Report.
D. AVAILABILITY OF DATA
The MBA panel sample used in the reporting period is validated (i.e., upload and download tiers of the
whiteboxes are verified with providers) and the measurement results are carefully inspected to eliminate
misleading outliers. This leads to a ‘validated data set’ that accompanies each report. The Validated Data
Set23 on which this Report is based, as well as the full results of all tests, are available at
[Link] Because tests run 24x7x365, we also provide raw data for the reference month and other
months; this data is termed “raw” because cross-checks are performed only for the test period used in
the report, so subscriber tier changes outside that period may be missed. Previous reports of the MBA
program, as well as the data used to produce them, are also available there.
Both the Commission and SamKnows, the Commission’s contractor for this program, recognize that, while
the methodology descriptions included in this document provide an overview of the project, interested
23 The September-October 2019 data set was validated to remove anomalies that would have produced errors in the
Report. This data validation process is described in the Technical Appendix.
parties may wish to contribute to the project by reviewing the software used in the testing.
SamKnows welcomes review of its software and technical platform, consistent with the Commission’s
goals of openness and transparency for this program.24
24 The software that was used for the MBA program will be made available for noncommercial purposes. To apply
for noncommercial review of the code, interested parties may contact SamKnows directly at team@[Link],
with the subject heading “Academic Code Review.”
4. Test Results
A. MOST POPULAR ADVERTISED SERVICE TIERS
Chart 1 above summarizes the weighted average of the advertised download speeds25 for each
participating ISP for the last three years (September 2017 to September-October 2019), where the
weighting is based upon the number of subscribers to each tier, grouped by the access technology used
to offer the broadband Internet access service (DSL, cable, or fiber). Only the top 80% of tiers (by
subscriber number) of each ISP were included. Chart 10 below shows the corresponding weighted
average of the advertised upload speeds among the measured ISPs. The computed weighted average of
the advertised upload speed of all the ISPs is 30.5 Mbps, representing a 133% increase compared to 13.1
Mbps in 2017. However, the weighted average upload speed decreased slightly this year, down 4% from
the previous year’s value of 31.9 Mbps.26
Chart 10.1: Weighted average advertised upload speed among the top 80% service tiers offered by each
ISP.
Due to the relatively high upload speeds of optical technology, it is difficult to discern the variations in
speed for DSL and cable technologies when drawn to the same scale. Separate Charts 10.2 and 10.3
are therefore included, providing the weighted average upload speeds for ISPs using DSL and cable
technologies, respectively.
Chart 10.2: Weighted average advertised upload speed offered by ISPs using DSL technology.
25 Measured service tiers were tiers which constituted the top 80% of an ISP’s broadband subscriber base.
26 Please note that this average for Sept-Oct 2018 represents the average advertised upload speed with AT&T tiers
removed. We did this to have a fairer comparison between the years since AT&T is no longer an active participant
in the MBA program. The actual weighted average upload speed for September-October 2018, as reported in the
Ninth MBA Report, is 27.4 Mbps.
Chart 10.3: Weighted average advertised upload speed offered by ISPs using cable technology.
Chart 11 compares the weighted average of the advertised upload speeds by technology for the last three
years (September 2017 to September-October 2019). As can be seen in this chart, all technologies showed
increased rates in 2019 as compared to 2017. However, the rates of increase were not the same for all
technologies. The weighted average for fiber technology increased by 189%, compared to 11% for DSL
and 43% for cable. Comparing the 2019 results with the previous year’s (2018) results, we see offered
upload speeds increase by 6% to 1.5 Mbps for DSL and by 9% to 11 Mbps for cable. However, fiber upload
speed decreased by 29% in 2019 as compared with 2018. This drop in fiber upload speed is due to relative
shifts in the number of subscribers to the tiers rather than a lowering of offered upload tier speeds.
Despite this drop, advertised fiber upload speeds (194 Mbps) were still far higher than those of other
technologies.
Observing both the download and upload speeds, it is clear that fiber service tiers are generally symmetric
in their actual upload and download speeds. This results from the fact that fiber technology has
significantly more capacity than other technologies and it can be engineered to have symmetric upload
and download speeds. For other technologies with more limited capacity, higher capacity is usually
allocated to download speeds than to upload speeds, typically in ratios ranging from 5:1 to 10:1. This
resulting asymmetry in download/upload speeds is reflective of actual usage because consumers typically
download significantly more data than they upload.
Chart 11: Weighted average advertised upload speed among the top 80% service tiers based on
technology.
Chart 12.1: The ratio of median download speed to advertised download speed.
Chart 12.2 shows the median upload speed as a percentage of the advertised speed. As was the case with
download speeds, most ISPs met or exceeded the advertised rates, the exceptions being several DSL
providers: CenturyLink, Cincinnati Bell DSL, Frontier DSL, and Windstream, with ratios of 87%, 77%,
90%, and 91%, respectively.
Chart 12.2: The ratio of median upload speed to advertised upload speed.
C. VARIATIONS IN SPEEDS
Median speeds experienced by consumers may vary by location and time of day, as network
architectures and traffic patterns differ. Chart 5 in Section 2 above showed, for each ISP, the
percentage of consumers (across the ISP’s service territory) who experienced a median download speed
over the peak usage period that was either greater than 95%, between 80% and 95%, or less than 80% of
the advertised download speed. Chart 13 below shows the corresponding percentage of consumers
whose median upload speed fell in each of these ranges. ISPs using DSL technology had only between 0%
and 36% of their subscribers receiving at least 95% of their advertised upload speeds during peak hours.
In contrast, ISPs using cable or fiber technology had between 92% and 100% of their subscribers
receiving at least 95% of their advertised upload speeds.
Chart 13: The percentage of consumers whose median upload speed was (a) greater than 95%, (b) between
80% and 95%, or (c) less than 80% of the advertised upload speed.
Though the median upload speeds experienced by most subscribers were close to or exceeded the
advertised upload speeds, there were some subscribers, for each ISP, whose median upload speed fell
significantly short of the advertised upload speed. This issue was most prevalent for ISPs using DSL
technology. On the other hand, ISPs using cable and fiber technology generally showed very good
consistency based on this metric.
We can learn more about the variation in network performance by separately examining variations across
geography and across time. We start by examining the variation across geography within each
participating ISP’s service territory. For each ISP, we first calculate the ratio of the median download
speed (over the peak usage period) to the advertised download speed for each panelist subscribing to
that ISP. We then examine the distribution of this ratio across the ISP’s service territory.
Charts 14.1 and 14.2 show the complementary cumulative distribution of the ratio of median download
speed (over the peak usage period) to advertised download speed for each participating ISP. For each
ratio of actual to advertised download speed on the horizontal axis, the curves show the percentage of
panelists subscribing to each ISP that experienced at least this ratio.27 For example, the Cincinnati Bell
fiber curve in Chart 14.1 shows that 90% of its subscribers experienced a median download speed
exceeding 76% of the advertised download speed, while 70% experienced a median download speed
exceeding 92% of the advertised download speed, and 50% experienced a median download speed
exceeding 107% of the advertised download speed.
27 In Reports prior to the 2015 MBA Report, for each ratio of actual to advertised download speed on the horizontal
axis, the cumulative distribution function curves showed the percentage of measurements, rather than panelists
subscribing to each ISP, that experienced at least this ratio. The methodology used since then, i.e., using panelists
subscribing to each ISP, more accurately illustrates ISP performance from a consumer’s point of view.
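A complementary cumulative distribution of this kind can be sketched as follows; the per-panelist ratios below are hypothetical and serve only to illustrate the computation:

```python
def ccdf(ratios, thresholds):
    """For each threshold r, the fraction of panelists whose
    measured-to-advertised ratio is at least r."""
    n = len(ratios)
    return [sum(1 for x in ratios if x >= r) / n for r in thresholds]

# Hypothetical per-panelist ratios of median to advertised download speed
ratios = [0.70, 0.85, 0.95, 1.00, 1.05, 1.10, 1.10, 1.20]
curve = ccdf(ratios, [0.8, 0.9, 1.0, 1.1])
```

Reading the result: each entry of `curve` is the percentage of panelists (as a fraction) who experienced at least the corresponding ratio, which is exactly how Charts 14.1 to 14.6 are read.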
Chart 14.1: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed.
Chart 14.2: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed (continued).
The curves for cable-based broadband and fiber-based broadband are steeper than those for DSL-based
broadband. This can be seen more clearly in Chart 14.3, which plots aggregate curves for each technology.
Approximately 90% of subscribers to cable and 50% of subscribers to fiber-based technologies experience
median download speeds exceeding the advertised download speed. In contrast, less than 30% of
subscribers to DSL-based services experience median download speeds exceeding the advertised
download speed.28
Chart 14.3: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed, by technology.
Charts 14.4 to 14.6 show the complementary cumulative distribution of the ratio of median upload speed
(over the peak usage period) to advertised upload speed for each participating ISP (Charts 14.4 and 14.5)
and by access technology (Chart 14.6).
28 The speed achievable by DSL depends on the distance between the subscriber and the central office. Thus, the
complementary cumulative distribution function will fall slowly unless the broadband ISP adjusts its advertised rate
based on the subscriber’s location. (Chart 16 illustrates that the performance during non-busy hours is similar to
the busy hour, making congestion less likely as an explanation.)
Chart 14.4: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed.
Chart 14.5: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed (continued).
Chart 14.6: Complementary cumulative distribution of the ratio of median upload speed to advertised
upload speed, by technology.
All actual speeds discussed above were measured during peak usage periods. In contrast, Charts 15.1 and
15.2 below compare the ratio of actual download and upload speeds to advertised download and upload
speeds during peak and off-peak times. Charts 15.1 and 15.2 show that most ISP subscribers experience
only a slight degradation from off-peak to peak hour performance.
Chart 15.1: The ratio of weighted median download speed to advertised download speed, peak hours
versus off-peak hours.
Chart 15.2: The ratio of weighted median upload speed to advertised upload speed, peak versus off-peak.
Chart 16 below shows the actual download speed to advertised speed ratio in each two-hour time block
during weekdays for each ISP. The ratio is lowest during the busiest four-hour time block (7:00 p.m. to
11:00 p.m.).
Chart 16: The ratio of median download speed to advertised download speed, Monday-to-Friday, two-
hour time blocks, terrestrial ISPs.
For each ISP, Chart 6 (in Section 2.C) showed the ratio of the 80/80 consistent median download speed to
advertised download speed, and for comparison, Chart 4 showed the ratio of median download speed to
advertised download speed.
Chart 17.1 illustrates information concerning 80/80 consistent upload speeds. While all the 80/80 upload
speeds were slightly lower than the median speeds, the differences were more marked for DSL. Charts 6
and 17.1 make it clear that cable and fiber technologies behaved more consistently than DSL technology,
both for download and for upload speeds.
Chart 17.1: The ratio of 80/80 consistent upload speed to advertised upload speed.
Charts 17.2 and 17.3 below illustrate similar consistency metrics for 70/70 consistent download and
upload speeds, i.e., the minimum download or upload speed (as a percentage of the advertised download
or upload speed) experienced by at least 70% of panelists during at least 70% of the peak usage period.
The ratios for 70/70 consistent speeds as a percentage of the advertised speed are higher than the
corresponding ratios for 80/80 consistent speeds. In fact, for many ISPs, the 70/70 consistent download
or upload speed is close to the median download or upload speed. Once again, ISPs using DSL technology
showed a considerably smaller value for the 70/70 download and upload speeds as compared to the
download and upload median speeds, respectively.
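An 80/80 (or 70/70) consistent speed can be sketched as two nested percentile steps: for each panelist, take the speed exceeded in at least 80% (or 70%) of peak-period tests, then take the speed achieved by at least 80% (or 70%) of panelists. The nearest-rank percentile and the sample data below are simplifications, not the MBA program's exact statistical procedure:

```python
def percentile(values, p):
    """Simple nearest-rank percentile (0 < p <= 100) of a list of values."""
    s = sorted(values)
    k = max(0, int(len(s) * p / 100 + 0.5) - 1)
    return s[k]

def consistent_speed(panelist_measurements, pct):
    """pct/pct consistent speed: for each panelist, the speed exceeded
    in at least pct% of peak-period tests (the (100-pct)th percentile),
    then the speed achieved by at least pct% of panelists."""
    per_panelist = [percentile(m, 100 - pct) for m in panelist_measurements]
    return percentile(per_panelist, 100 - pct)

# Hypothetical peak-period speed samples (Mbps) for four panelists
samples = [[90, 95, 100, 105, 110],
           [70, 80, 85, 90, 95],
           [98, 99, 100, 101, 102],
           [60, 75, 88, 92, 94]]
speed_8080 = consistent_speed(samples, 80)
```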
Chart 17.2: The ratio of 70/70 consistent download speed to advertised download speed.
Chart 17.3: The ratio of 70/70 consistent upload speed to advertised upload speed.
D. LATENCY
Chart 18 below shows the weighted median latencies, by technology and by advertised download speed,
for terrestrial technologies. For all terrestrial technologies, latency varied little with advertised download
speed. DSL service typically had higher latencies than cable or fiber, and its latency was more closely
correlated with advertised download speed. Cable latencies ranged from 16 ms to 28 ms, fiber latencies
from 5 ms to 11 ms, and DSL latencies from 21 ms to 61 ms.
Chart 18: Latency for Terrestrial ISPs, by technology, and by advertised download speed.
Chart 19.1: The ratio of median download speed to advertised download speed, by ISP (1-5 Mbps).
Chart 19.2: The ratio of median download speed to advertised download speed, by ISP (6-10 Mbps).
Chart 19.3: The ratio of median download speed to advertised download speed, by ISP (12-25 Mbps).
Chart 19.4: The ratio of median download speed to advertised download speed, by ISP (30-60 Mbps).
Chart 19.5: The ratio of median download speed to advertised download speed, by ISP (75-100 Mbps).
Chart 19.6: The ratio of median download speed to advertised download speed, by ISP (150-200 Mbps).
Chart 19.7: The ratio of median download speed to advertised download speed, by ISP (250-500 Mbps).
Charts 20.1 – 20.6 depict the ratio of median upload speeds to advertised upload speeds for each ISP by
service tier. Of the 30 upload speed tiers measured, a large majority (25) achieved at least 90% of the
advertised upload speed, and 21 of the 30 tiers met or exceeded the advertised upload speed.
Chart 20.1: The ratio of median upload speed to advertised upload speed, by ISP (0.768 - 1 Mbps).
Chart 20.2: The ratio of median upload speed to advertised upload speed, by ISP (1.5-5 Mbps).
Chart 20.3: The ratio of median upload speed to advertised upload speed, by ISP (10-20 Mbps).
Chart 20.4: The ratio of median upload speed to advertised upload speed, by ISP (30-75 Mbps).
Chart 20.5: The ratio of median upload speed to advertised upload speed, by ISP (100–200 Mbps).
Table 2 lists the advertised download service tiers included in this study. For each tier, an ISP’s advertised
download speed is compared with the median of the measured download speed results. As noted in
past reports, the download speeds listed here are based on national averages and may not represent
the performance experienced by any particular consumer at any given time or place.
Table 2: Peak period median download speed, sorted by actual download speed
E. VARIATIONS IN SPEED
In Section 3.C above, we present speed consistency metrics for each ISP based on test results averaged
across all service tiers. In this section, we provide detailed speed consistency results for each ISP’s
individual service tiers. Consistency of speed is important for services such as video streaming. A
significant reduction in speed for more than a few seconds can force a reduction in video resolution or an
intermittent loss of service.
Charts 21.1 – 21.3 below show the percentage of consumers that achieved greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed for each ISP speed tier. Consistent with
past performance, ISPs using DSL technology frequently fail to deliver advertised service rates. ISPs quote
a single ‘up-to’ speed, but the actual speed of DSL depends on the distance between the subscriber and
the serving central office.
Cable companies, in general, showed a high consistency of speed.
Chart 21.1: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed, by service tier (DSL).
Chart 21.2: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (cable).
Chart 21.3: The percentage of consumers whose median download speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised download speed (fiber).
Similarly, Charts 22.1 to 22.3 show the percentage of consumers that achieved greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed for each ISP speed tier.
Chart 22.1: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (DSL).
Chart 22.2: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (cable).
Chart 22.3: The percentage of consumers whose median upload speed was greater than 95%, between
80% and 95%, or less than 80% of the advertised upload speed (fiber).
In Section 3.C above, we present complementary cumulative distributions for each ISP based on test
results across all service tiers. Below, we provide tables showing selected points on these distributions
by each individual ISP. In general, DSL-based services delivered between 25% and 77% of the advertised
speed to at least 95% of their subscribers. Among cable-based companies, the average download speeds
that at least 95% of their subscribers received were between 92% and 100% of advertised rates.
Fiber-based services provided a range from 71% to 96% of advertised download speeds for at least 95%
of subscribers.
Table 3: Complementary cumulative distribution of the ratio of median download speed to advertised
download speed by ISP.
Table 4: Complementary cumulative distribution of the ratio of median upload speed to advertised upload
speed by ISP.
Chart 23.5: Average webpage download time, by ISP (75 - 100 Mbps).
Chart 23.6: Average webpage download time, by ISP (150 - 200 Mbps).
Chart 23.7: Average webpage download time, by ISP (250 - 500 Mbps).
Measuring Broadband America
Technical Appendix to the Tenth MBA Report
FCC’s Office of Engineering and Technology
Table of Contents
LIST OF TABLES
LIST OF FIGURES
This Appendix to the Tenth Measuring Broadband America Report,1 a report on consumer
wireline broadband performance in the United States, provides detailed technical background
information on the methodology that produced the Report. It covers the process by which the
panel of consumer participants was originally recruited and selected for the August 2011 MBA
Report, and maintained and evolved over the last ten years. This Appendix also discusses the
testing methodology used for the Report and describes how the test data was analyzed.
2 - PANEL CONSTRUCTION
This section describes the background of the study, as well as the methods employed to design
the target panel, select volunteers for participation, and manage the panel to maintain the
operational goals of the program.
The study aims to measure fixed broadband service performance in the United States as
delivered by an Internet Service Provider (ISP) to the consumer’s broadband modem. Many
factors contribute to end-to-end broadband performance, only some of which are under the
control of the consumer’s ISP. The methodology outlined here is focused on the measurement
of broadband performance within the scope of an ISP’s network, and specifically focuses on
measuring performance from the consumer Internet access point, or consumer gateway, to a
close major Internet gateway point. The actual quality of experience seen by consumers depends
on many other factors beyond the consumer’s ISP, including the performance of the consumer’s
in-home network, transit providers, interconnection points, content distribution networks (CDN)
and the infrastructure deployed by the providers of content and services. The design of the study
methodology allows it to be integrated with other technical measurement approaches that focus
on specific aspects of broadband performance (e.g., download speed, upload speed, latency,
packet loss), and in the future, could focus on other aspects of broadband performance.
1
The First Report (2011) was based on measurements taken in March 2011, the Second Report (2012) on
measurements taken in April 2012, and the Third (2013) through this, the Tenth (2020) Reports on measurements
taken in September of the year prior to the reports’ release dates.
The Whiteboxes remain in consumer homes and continue to run the tests described in this report. Participants may
remain in the measurement project as long as it continues and may retain their Whitebox when they end their
participation.
5
[Link]
• The volunteer sample was originally organized with a goal of covering major ISPs in the
48 contiguous states across five broadband technologies: DSL, cable, fiber-to-the-home,
fixed terrestrial wireless, and satellite.6
• Target numbers for volunteers were set across the four Census Regions—Northeast,
Midwest, South, and West—to help ensure geographic diversity in the volunteer panel
and compensate for differences in networks across the United States.7
• A target plan for allocation of Whiteboxes was developed based on the market share of
participating ISPs. Initial market share information was based principally on FCC Form
4778 data filed by participating ISPs for December 2018. This data is further enhanced by
the ISPs who brief SamKnows on new products and changes in subscribership numbers
which may have occurred after the submission of the 477 data. Speed tiers that comprise
the top 80% of a Participating ISP’s subscriber base are included. This threshold ensures
that we are measuring the ISP’s most popular speed tiers and that it is possible to recruit
sufficient panelists.
• An initial set of prospective participants was selected from volunteers who had responded
directly to SamKnows as a result of media solicitations, as described in detail in Section
2.3. Where gaps existed in the sample plan, SamKnows worked with participating ISPs via
email solicitations targeted at underrepresented tiers.
• Since the initial panel was created in 2011, participating ISPs have contacted random
subsets of their subscribers by email to replenish cells that were falling short of their
desired panel size. Additional recruitment via social media, press releases and blog posts
has also taken place.
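The top-80% tier threshold described above can be illustrated with a short sketch (tier names and subscriber counts are invented):

```python
def top_tiers(tier_subscribers, coverage=0.80):
    """Select the most popular tiers, in descending order of subscribers,
    until they cover `coverage` of the ISP's subscriber base.

    tier_subscribers: dict mapping tier name -> subscriber count.
    """
    total = sum(tier_subscribers.values())
    selected, covered = [], 0
    for tier, subs in sorted(tier_subscribers.items(), key=lambda kv: -kv[1]):
        if covered / total >= coverage:
            break
        selected.append(tier)
        covered += subs
    return selected
```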
The sample plan is designed prior to the reporting period and is sent to each ISP by SamKnows.
ISPs review this and respond directly to SamKnows with feedback on speed tiers that ought to be
included based on the threshold criteria stated above. SamKnows will include all relevant tiers in
the final report, assuming a target sample size is available. As this may not be known until after
the reporting period is over, a final sample description containing all included tiers is produced
and shared with the FCC and ISPs once the reporting period has finished and the data has been
processed. Test results from a total of 2,931 panelists were used in the Tenth MBA Report. This
6 At the request of, and with the cooperation of the Department of Commerce and Consumer Affairs, Hawaii, we
are also collecting data from the state of Hawaii.
7 Although the Commission’s volunteer recruitment was guided by Census Region to ensure the widest possible
distribution of panelists throughout the United States, as discussed below, a sufficient number of testing devices
were not deployed to enable, in every case, the evaluation of regional differences in broadband performance. The
States associated with each Census Region are described in Table 4.
8 The FCC Form 477 data collects information about broadband connections to end user locations, wired and wireless
local telephone services, and interconnected Voice over Internet Protocol (VoIP) services. See
[Link] for further information.
figure includes only panelists that are subscribed to the tiers that were tested as part of the
sample plan.
The recruitment campaign resulted in the coverage needed to ensure balanced representation
of users across the United States. Table 1 shows the number of volunteers with reporting
Whiteboxes for the months of September/October 2019 listed by ISP, as well as the percentage
of total volunteers subscribed to each ISP. Tables 2 and 3 show the distributions of the
Whiteboxes by State and by Region respectively. This can be compared with the percentage of
subscribers per state or region.9
Table 1: ISPs, Sample Sizes and Percentages of Total Volunteers
9 Subscriber data in the Tenth MBA Report is based on the FCC’s Internet Access Services Report with data current
to December 31, 2017. See Internet Access Services: Status as of December 31, 2017, Wireline Competition Bureau,
Industry Analysis and Technology Division (rel. Nov. 2018), available at
[Link]
State Total Boxes % of Total Boxes % of Total US Broadband
Alabama 25 0.9% 1.50%
Alaska 0 0.0% 0.23%
Arizona 121 4.1% 1.97%
Arkansas 15 0.5% 0.86%
California 179 6.1% 12.17%
Colorado 86 2.9% 1.71%
Connecticut 56 1.9% 1.13%
Delaware 9 0.3% 0.30%
District of Columbia 4 0.1% 0.27%
Florida 182 6.2% 6.56%
Georgia 85 2.9% 3.18%
Hawaii 17 0.6% 0.47%
Idaho 23 0.8% 0.48%
Illinois 47 1.6% 3.92%
Indiana 43 1.5% 1.92%
Iowa 146 5.0% 0.90%
Kansas 14 0.5% 1.21%
Kentucky 106 3.6% 1.35%
Louisiana 15 0.5% 1.41%
Maine 0 0.0% 0.42%
Maryland 37 1.3% 1.91%
Massachusetts 48 1.6% 2.27%
Michigan 33 1.1% 2.98%
Minnesota 97 3.3% 1.68%
Mississippi 2 0.1% 0.86%
Missouri 67 2.3% 1.78%
The distribution of Whiteboxes by Census Region is found in the table on the next page.
Census Region Total Boxes % Total Boxes % Total U.S. Broadband Subscribers
The distribution of states associated with the four Census Regions used to define the panel strata
are included in the table below.
Northeast CT MA ME NH NJ NY PA RI VT
Midwest IA IL IN KS MI MN MO ND NE OH SD WI
South AL AR DC DE FL GA KY LA MD MS NC OK SC TN TX VA WV
West AK AZ CA CO HI ID MT NM NV OR UT WA WY
• Recruitment has evolved since the start of the program in 2011, when several thousand
volunteers were initially recruited through a public relations and social media campaign led
by the FCC. This campaign included discussion on the FCC website and on technology blogs,
as well as articles in the press. Currently, volunteers are recruited with the help of a
recruitment website10 which keeps them informed about the MBA program and allows them to
view MBA data on a dashboard. The composition of the panel is reviewed each year to identify
any deficiencies with regard to the sample plan described above. Target demographic goals
are set for volunteers based on ISP, speed tier, technology type, and region. Where the pool
of volunteers falls short of the desired goal, ISPs send email messages to their customers
asking them to participate in the MBA program. The messages direct interested volunteers to
contact SamKnows to request participation in the trial. The ISPs do not know which of the
email recipients volunteer. In almost all cases, this ISP outreach allows the program to meet
its desired demographic targets.
The mix of panelists recruited using the above methodologies varies by ISP.
A multi-mode strategy was used to qualify volunteers for the 2019 testing period. The key stages
of this process were as follows:
1. Volunteers were directed to complete an online form which provided information on the
study and required volunteers to submit a small amount of information.
2. Volunteers were selected from these respondents based on the target requirements of the
panel. Selected volunteers were then asked to agree to the User Terms and Conditions that
outlined the permissions to be granted by the volunteer in key areas such as privacy.11
3. From among the volunteers who agreed to the User Terms and Conditions, SamKnows
selected the panel of participants,12 each of whom received a Whitebox for self-
installation. SamKnows provided full support during the Whitebox installation phase.
10
The Measuring Broadband America recruitment website is: [Link]
11 The User Terms and Conditions is found in the Reference Documents at the end of this Appendix.
12 Over 23,000 Whiteboxes have been shipped to targeted volunteers since 2011, of which 6,006 were online and
reporting data from the months of September/October 2019.
SamKnows manually completed the following four steps for each panelist:
• Verified that the IP address was in a valid range for those served by the ISP.
• Reviewed data for each panelist and removed data where speed changes such as tier
upgrade or downgrade appeared to have occurred, either due to a service change on the
part of the consumer or a network change on the part of the ISP.
• Identified panelists whose throughput appeared inconsistent with the provisioned service
tier. Such anomalies were re-certified with the consumer’s ISP.14
• Verified that the resulting downstream-upstream test results corresponded to the ISP-
provided speed tiers and updated accordingly if required.
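The anomaly check in the third step might be approximated as follows (the 1.5x threshold is an assumption chosen for illustration, not the program's actual rule):

```python
from statistics import median

def flag_tier_anomalies(panelist_speeds, advertised, factor=1.5):
    """Flag panelists whose median measured speed exceeds the advertised tier
    speed by more than `factor`, suggesting a mischaracterized service tier.

    panelist_speeds: dict of panelist id -> list of measured speeds (Mbps).
    """
    return [pid for pid, speeds in panelist_speeds.items()
            if median(speeds) > factor * advertised]
```

Flagged panelists would then be re-certified with the consumer's ISP, as the text describes.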
13 Past FCC studies found that a high rate of consumers could not reliably report information about their broadband
service, and the validation of subscriber information ensured the accuracy of expected speed and other subscription
details against which observed performance was measured. See John Horrigan and Ellen Satterwhite, Americans’
Perspectives on Online Connection Speeds for Home and Mobile Devices, 1 (FCC 2010), available at
[Link] (finding that 80 percent of broadband
consumers did not know what speed they had purchased).
14 For example, when a panelist’s upload or download speed was observed to be significantly higher than that of
the rest of the tier, it could be inferred that a mischaracterization of the panelist’s service tier had occurred. Such
anomalies, when not resolved in cooperation with the service provider, were excluded from the Tenth Report, but
will be included in the raw bulk data set.
Of the more than 23,000 Whiteboxes that were shipped to panelists since 2011, 6,00615 units
reported sufficient data in September/October 2019, with the participating ISPs validating 4,964
for the reporting period. Of the validated units, 17 percent were reallocated to a different tier
following the steps listed above. A total of 2,931 validated units were part of download or upload
tiers included in the sample plan and were ultimately included in this report.
A total of 3,075 boxes were excluded for the following reasons:
• 1,763 belonged to users subscribed to plans that were not included in this study
• 263 were excluded due to legacy equipment, such as a modem that could not fully support
the subscribed speeds
• 291 Whiteboxes were legacy models that could not fully support the plan speeds
• 293 belonged to users whose details or subscribed tier could not be successfully validated
by the ISP
• 142 Whiteboxes were excluded due to ethernet limitations
• 23 were connected to non-residential plans
• 1 Whitebox was a test unit not to be included in the program
• 7 belonged to employees of ISPs taking part in the MBA program
• 292 were excluded because the test speed profile did not match the product validated by
the ISP.
15 This figure represents the total number of boxes reporting during September/October 2019, the month chosen
for the Tenth Report. Shipment of boxes continued in succeeding months and these results will be included in the
raw bulk data set.
This section describes the system architecture and network programming features of the tests,
and other technical aspects of the methods employed to measure broadband performance
during this study.
• If software tests are performed manually, panelists might only run tests when they
experience problems and thus bias the results.
In contrast, the hardware approach used in the MBA program requires the placement of the
previously described Whitebox inside the user’s home, directly connected to the consumer’s
service interconnection device (router), via Ethernet cable. The measurement device therefore
directly accesses fixed Internet service to the home over this dedicated interface and periodically
runs tests to remote targets over the Internet. The use of hardware devices avoids the
disadvantages listed earlier with the software approach. However, hardware approaches are
much more expensive than the software alternative, are thus more constrained in the achievable
panel size, and require correct installation of the device by the consumer or a third party.
Installation is still subject to unintentional errors due to misconfiguration, e.g., connecting
the Whitebox incorrectly, but these can often be detected in the validation process that follows
installation. The FCC chose the hardware approach since its advantages far outweigh these
disadvantages.
1. The Whitebox measurement process must not change during the monitoring period.
The Whitebox measurement process is designed to provide automated and consistent monitoring
throughout the measurement period.
8. Must be compatible with a wide range of DSL, cable, satellite and fiber-to-the-home modems.
Whiteboxes can be connected to all modem types commonly used to support broadband services in
the U.S., either in a routing or bridging mode, depending on the model.
11. Must be upgradeable remotely if it contains any software or firmware components.
The Whitebox can be completely controlled remotely for updates without involvement of the
consumer, providing the Whitebox is switched on and connected.
16 Signatories to the Code of Conduct are: CenturyLink, Charter, Cincinnati Bell, Comcast, Cox, Frontier, Level3,
Measurement Lab, Mediacom, NCTA, Optimum, Time Warner Cable, Verizon and Windstream. A copy of the Code
of Conduct is included as a Reference Document attached to this Appendix.
17Each reporting interface included a data dashboard for the consumer volunteers, which provided performance
metrics associated with their Whitebox.
18 The use of legacy equipment has the potential to impede some panelists from receiving the provisioned speed
from their ISP, and this impact is captured by the survey.
their Internet service and that any such use does not interfere with testing or invalidate test
results.
Panelists were not asked to change their wireless network configurations. Since the TP-Link
Whiteboxes and Whitebox 8.0 attach to the panelist’s router that may contain a built-in wireless
(Wi-Fi) access point, these devices measure the strongest wireless signal. Since they only count
packets, they do not need access to the Wi-Fi encryption keys and do not inspect packet content.
AT&T 9
Comcast 37
Cox 2
Frontier 5
Hawaiian Telecom 1
Level 3 (off-net) 13
M-Lab (off-net) 51
Mediacom 1
Optimum 3
Uhnet (Hawaii) 1
Verizon 2
Windstream 4
Stackpath 10
19 Stackpath was added to the list of hosting providers for the MBA project to provide further resilience for the testing platform.
Stackpath servers have 10 Gbps to 200 Gbps transit/peering links and are located in major US cities, as are the servers of the
other hosting providers used for the program.
The Level 3 nodes were located in the following major U.S. Internet peering locations:
• Chicago, Illinois (two locations)
• Dallas, Texas (two locations)
• New York City, New York (two locations)
• San Jose, California (two locations)
• Washington D.C. (two locations)
• Los Angeles, California (three locations)
The Stackpath nodes were located in the following major U.S. Internet peering locations:
• Ashburn, Virginia (one location)
• Atlanta, Georgia (one location)
• Chicago, Illinois (one location)
• Dallas, Texas (one location)
• Los Angeles, California (one location)
• New York City, New York (one location)
• San Jose, California (one location)
• Seattle, Washington (one location)
• Denver, Colorado (one location)
• Miami, Florida (one location)
• CenturyLink20
• Charter21
• Cincinnati Bell
• Comcast
• Cox
• Frontier
• Mediacom
• Optimum
• Verizon
• Windstream
The same suite of tests was scheduled for these on-net nodes as for the off-net nodes and the
same server software developed by SamKnows was used regardless of whether the Whitebox
was interacting with on-net or off-net nodes. Off-net test nodes are continually monitored for
load and congestion.
While these on-net test nodes were included in the testing, the results from these tests were
used as a control set; the results presented in the Report are based only on tests performed using
off-net nodes. Results from both on-net and off-net nodes are included in the raw bulk data set
that will be released to the public.
20 QWest was reported separately from Centurylink in reports prior to 2016. The entities completed merging their
test infrastructure in 2016.
21
Time Warner Cable was reported separately from Charter in reports prior to the Eighth report. The entities
completed merging their test infrastructure in early 2018.
Web Browsing
The test records the averaged time taken to sequentially download the HTML and referenced
resources for the home page of each of the target websites, the number of bytes transferred,
and the calculated rate per second. The primary measure for this test is the total time taken to
download the HTML front page for each web site and all associated images, JavaScript, and
stylesheet resources. This test does not measure against the centralized testing nodes; instead
it tests against actual websites, ensuring that the effects of content distribution networks and
other performance enhancing factors can be taken into account.
Each Whitebox tests against the following nine websites:25
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
• [Link]
The results include the time needed for DNS resolution. The test uses up to eight concurrent TCP
connections to fetch resources from targets. The test pools TCP connections and utilizes
persistent connections where the remote HTTP server supports them.
The client advertises the user agent as Microsoft Internet Explorer 10. Each website is tested in
sequence and the results summed and reported across all sites.
25These websites were chosen based on a list by Alexa, [Link] of the top twenty websites in
October 2010.
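A much-simplified model of the per-site timing is sketched below (it ignores DNS resolution, resource dependencies, and connection reuse, and uses a greedy longest-first assignment; purely illustrative, not the actual test implementation):

```python
import heapq

def page_load_time(html_time, resource_times, max_conns=8):
    """Rough model: fetch the HTML first, then fetch the referenced resources
    over up to `max_conns` concurrent connections (greedy assignment)."""
    # Each heap entry is the time at which a connection becomes free.
    conns = [0.0] * min(max_conns, max(1, len(resource_times)))
    heapq.heapify(conns)
    for t in sorted(resource_times, reverse=True):  # schedule longest first
        free_at = heapq.heappop(conns)
        heapq.heappush(conns, free_at + t)
    return html_time + max(conns)
```

Summing this quantity over the target sites mirrors how the test reports one aggregate figure across all sites.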
Voice over IP
The Voice over IP (VoIP) test operates over UDP and utilizes bidirectional traffic, as is typical for
voice calls.
The Whitebox handshakes with the server, and each initiates a UDP stream with the other. The
test uses a 64 kbps stream with the same characteristics and properties (i.e., packet sizes, delays,
bitrate) as the G.711 codec; 160-byte packets are used. The test measures jitter, delay, and loss.
Jitter is calculated using the Packet Delay Variation (PDV) approach described in section 4.2 of
RFC 5481. The 99th percentile is recorded and used in all calculations when deriving the PDV.
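A minimal sketch of the PDV computation follows; a nearest-rank percentile is assumed, since RFC 5481 defines the delay-variation metric but not a specific percentile method:

```python
import math

def pdv_jitter(delays, percentile=0.99):
    """Packet Delay Variation per RFC 5481, section 4.2: each packet's delay
    relative to the minimum observed delay; a high percentile of the
    variations is reported."""
    base = min(delays)
    variations = sorted(d - base for d in delays)
    rank = max(1, math.ceil(percentile * len(variations)))  # nearest rank
    return variations[rank - 1]
```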
Traceroute
A traceroute client is used to send UDP probes to each hop in the path between client and
destination. Three probes are sent to each hop. The round-trip times, the standard deviation of
the round-trip times of the responses from each hop and the packet loss are recorded. The open
source traceroute client "mtr" ([Link] is used for carrying out the
traceroute measurements.
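The per-hop summary can be sketched as follows (illustrative; representing a lost probe as `None` is an assumption of this sketch):

```python
from statistics import mean, pstdev

def hop_stats(probe_rtts):
    """Summarize three probes to one hop: mean RTT, standard deviation of the
    RTTs that returned, and packet loss. None marks a probe with no response."""
    returned = [r for r in probe_rtts if r is not None]
    loss = 1 - len(returned) / len(probe_rtts)
    if not returned:
        return {"mean": None, "stdev": None, "loss": loss}
    return {"mean": mean(returned), "stdev": pstdev(returned), "loss": loss}
```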
Test | Test node(s) | Frequency | Duration | Est. daily volume
Upload Speed (Single TCP connection) | 1 off-net test node; 1 on-net test node | Once in peak hours, once in off-peak hours | Fixed 10 seconds | 6 MB at 1 Mbps
UDP Latency | 2 off-net test nodes (Level3/MLab) | Hourly, 24x7 | Permanent | 5.8 MB
UDP Latency | 1 on-net test node | Hourly, 24x7 | Permanent | 2.9 MB
UDP Packet Loss | 2 off-net test nodes | Hourly, 24x7 | Permanent | N/A (uses above)
UDP Packet Loss | 1 on-net test node | Hourly, 24x7 | Permanent | N/A (uses above)
Consumption | N/A | 24x7 | N/A | N/A
DNS Resolution | 10 popular US websites | Hourly, 24x7 | Est. 3 seconds | 0.3 MB
ICMP Latency | 1 off-net test node; 1 on-net test node | Hourly, 24x7 | Est. 5 seconds | 0.3 MB
ICMP Packet Loss | 1 off-net test node; 1 on-net test node | Hourly, 24x7 | N/A (as ICMP latency) | N/A (uses above)
Traceroute | 1 off-net test node; 1 on-net test node | Three times a day, 24x7 | N/A | N/A
Download Speed IPv6^^ | 1 off-net test node | Three times a day | Fixed 10 seconds | 180 MB at 50 Mbps; 72 MB at 20 Mbps; 11 MB at 3 Mbps; 5.4 MB at 1.5 Mbps
Upload Speed IPv6^^ | 1 off-net test node | Three times a day | Fixed 10 seconds | 7.2 MB at 2 Mbps; 3.6 MB at 1 Mbps; 1.8 MB at 0.5 Mbps
UDP Latency / Loss IPv6^^ | 2 off-net test nodes (Level3/MLab) | Hourly, 24x7 | Permanent | 5.8 MB
Lightweight Capacity Test – Download (UDP) | 1 off-net test node | Once 12am-6am, once 6am-12pm, once 12pm-6pm, hourly thereafter | Fixed 1000 packets | 9 MB
Lightweight Capacity Test – Upload (UDP) | 1 off-net test node | Once 12am-6am, once 6am-12pm, once 12pm-6pm, hourly thereafter | Fixed 1000 packets | 9 MB
** Download/upload daily volumes are estimates based upon likely line speeds. All tests will operate
at maximum line rate, so actual consumption may vary.
^ Currently in beta testing.
^^ Only carried out on broadband connections that support IPv6.
Tests to the off-net destinations use the nearest (in terms of latency) server from the Level3, M-
Lab and StackPath list of test servers. The one exception is the latency and packet loss tests,
which operate continuously to Level3, M-Lab and StackPath off-net servers. All tests are also
performed to the closest on-net server, where available.
Consumption
This test was replaced by the new data usage test. A technical description for this test is
outlined here: [Link]
The cross-traffic threshold for test traffic is set to 64 kbps downstream and 32 kbps upstream.
Metrics are sampled and computed every 10 seconds. If either of these thresholds is exceeded,
the test is delayed for a minute and the process repeated. If the connection is being actively
used for an extended period of time, this pause and retry process continues for up to five
times before the test is abandoned.
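The pause-and-retry logic can be sketched as follows (the function names and the injectable `pause` hook are illustrative, not the actual implementation):

```python
def run_when_idle(sample_traffic, run_test, dl_limit=64_000, ul_limit=32_000,
                  max_retries=5, pause=lambda: None):
    """Run a measurement only when cross-traffic is below the thresholds.

    sample_traffic() -> (downstream_bps, upstream_bps); if either value
    exceeds its limit, pause (standing in for the one-minute delay) and
    retry, up to max_retries times, then abandon the test.
    """
    for _ in range(max_retries + 1):
        dl, ul = sample_traffic()
        if dl <= dl_limit and ul <= ul_limit:
            return run_test()
        pause()
    return None  # test abandoned
```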
This section describes the background for the categorization of data gathered for the Tenth
Report, and the methods employed to collect and analyze the test results.
4.1 - BACKGROUND
Time of Day
Most of the metrics reported in the Tenth Report draw on data gathered during the so-called
peak usage period of 7:00 p.m. to 11:00 p.m. local time.26 This time period is generally considered
to experience the highest amount of Internet usage under normal circumstances.
(a) The speed tier must fall within the top 80% of the ISP’s subscriber base;
(b) There must be a minimum of 45 panelists that are recruited for that tier who have
provided valid data for the tier within the validation period; and
(c) Each panelist must have a minimum of five days of valid data within the validation period.
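Criteria (b) and (c) can be sketched as a filter (illustrative; criterion (a), the top-80% test, is assumed to be applied when the sample plan is built):

```python
def eligible_tiers(tiers, min_panelists=45, min_days=5):
    """A tier is reported only if at least `min_panelists` panelists each
    provided at least `min_days` days of valid data in the validation period.

    tiers: dict of tier name -> list of per-panelist valid-day counts.
    """
    result = []
    for tier, panelist_days in tiers.items():
        valid = [d for d in panelist_days if d >= min_days]
        if len(valid) >= min_panelists:
            result.append(tier)
    return result
```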
The study achieved target sample sizes for the following download and upload speeds27 (listed in
alphabetical order by ISP):
Download Speeds:
CenturyLink: 1.5, 3, 7, 10, 12, 20, and 40 Mbps tiers;
26
This period of time was agreed to by ISP participants in open meetings conducted at the beginning of the program.
27 Due to the large number of different combinations of upload/download speed tiers supported by ISPs where, for
example, a single download speed might be offered paired with multiple upload speeds or vice versa, upload and
download test results were analyzed separately.
Upload Speeds:
CenturyLink: 0.768, 0.896, 2, and 5 Mbps tiers;
Charter: 10, and 20 Mbps tiers;
Cincinnati Bell DSL: 0.768 and 3 Mbps tiers;
Cincinnati Bell Fiber: 10, 100 and 125 Mbps tiers;
Comcast: 5 and 10 Mbps tiers;
Cox: 3, 10, and 30 Mbps tiers;
Frontier DSL: 0.768 Mbps tier;
Frontier Fiber: 50, 75, 100, 150 and 200 Mbps tiers;
Mediacom: 5, 10 and 20 Mbps tiers;
Optimum: 35 Mbps tier;
Verizon Fiber: 75 and 100 Mbps tiers;29
Windstream: 1 and 1.5 Mbps tiers.
A file containing averages for each metric from the validated September/October 2019 data can
be found on FCC’s Measuring Broadband America website.30 Some charts and tables are divided
into speed bands, to group together products with similar levels of advertised performance. The
results within these bands are further broken out by ISP and service tier. Where an ISP does not
offer a service tier within a specific band or a representative sample could not be formed for
tier(s) in that band, the ISP will not appear in that speed band.
28
Verizon’s 1 Gbps tier was not included in the final report. 1Gbps tiers may be included in a separate/subsequent
report focusing on faster speeds.
29
Verizon’s 1 Gbps tier was not included in the final report. Id. at n. 28.
30 See: [Link]
Results from tests run on speed tiers of 1 Gbps were not included in the Tenth Report. This was
due to concerns from ISPs that the Whitebox 8.0 could not measure these speeds accurately. An
investigation was conducted to establish whether this was the case, or whether speeds of 1 Gbps
could be reliably reported.
Following investigation and testing with one of the ISPs that takes part in the program, the
following conclusion was reached:
The network of the ISP concerned was quite “bursty” in nature, with servers on a 1 Gbps network
sometimes bursting to 3 Gbps. This caused small amounts of packet loss, which negatively affected
overall speed test results. However, once new traffic shaping rules restricting traffic from the
server to 1 Gbps were implemented, consistently high speeds were recorded by the Whitebox. The
other solution to this specific problem was to use a very large number of parallel TCP
connections. This investigation established that there is no issue with the Whitebox 8.0
measuring speeds up to 1 Gbps consistently.
Legacy Equipment
In previous reports, we discussed the challenges ISPs face in improving network performance
where equipment under the control of the subscriber limits the end-to-end performance
achievable by the subscriber.31 Simply, some consumer-controlled equipment may not be
capable of operating fully at new, higher service tiers. Working in open collaboration with all
service providers we developed a policy permitting changes in ISP panelists when their installed
modems were not capable of meeting the delivered service speed that included several
conditions on participating ISPs. First, proposed changes in consumer panelists would only be
considered where an ISP was offering free upgrades for modems they owned and leased to the
consumer. Second, each ISP needed to disclose its policy regarding the treatment of legacy
modems and its efforts to inform consumers regarding the impact such modems may have on
their service.
31 See pgs. 8-9 of the 2014 Report and pg. 8 of the 2013 Report, as well as endnote 14.
[Link]
While the issue of DOCSIS 3 modems and network upgrades affects the cable industry today, we
may see other cases in the future where customer premises equipment affects the achievable
network performance.
In accordance with the above stated policy, 135 Whiteboxes connected to legacy modems were
identified and removed from the final data set in order to ensure that the study would only
include equipment able to meet its advertised speed. The excluded Whiteboxes were connected to
Charter, Comcast, and Cox networks.
32 These methods were reviewed with statistical experts by the participating ISPs.
advertised service tiers, may be most readily seen in those charts in the 2016 Report that show
performance over 24-hour periods, where tested rates for some ISPs and service tiers flatten for
periods at a time.
period and arrange it in increasing order. The speed that corresponds to the 20th percentile
represents the minimum speed that the panelist experienced at least 80% of the time. The 20th
percentile values of all the panelists on a specific tier are then arranged in increasing order.
The speed that corresponds to the 20th percentile now represents the minimum speed that at
least 80% of panelists experienced at least 80% of the time. This is the value reported as the 80/80
consistent speed for that ISP’s tier. We also report the 70/70 consistent speed for an ISP’s tier,
which is the minimum speed that at least 70% of the panelists experienced at least 70% of the
time. We typically report the 70/70 and 80/80 consistent speeds as a percentage of the
advertised speed.
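The 80/80 computation can be sketched as follows (a nearest-rank percentile is assumed; the Report does not specify the exact percentile method):

```python
import math

def low_percentile(values, q):
    """Nearest-rank value at the q-th fraction of the sorted sample."""
    s = sorted(values)
    rank = max(1, math.ceil(q * len(s)))
    return s[rank - 1]

def consistent_speed(panel_speeds, pct=0.80):
    """80/80 (or 70/70 with pct=0.70) consistent speed: the minimum speed that
    at least `pct` of panelists experienced at least `pct` of the time.

    panel_speeds: list of per-panelist lists of peak-period speed samples.
    """
    q = 1 - pct
    # 20th percentile per panelist = speed experienced at least 80% of the time.
    per_panelist = [low_percentile(v, q) for v in panel_speeds]
    # 20th percentile across panelists = speed that 80% of panelists reached.
    return low_percentile(per_panelist, q)
```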
When reporting these values for an ISP, we weight the 80/80 or 70/70 consistent speed results
(as a percentage of the advertised speed) of each of the ISP’s tiers by the number of subscribers
to that tier, so as to obtain a weighted average across all the tiers for that ISP.
Limiting Factors
A total of 8,417,695,058 measurements were taken across 144,636,223 unique tests.
All scheduled tests were run, aside from when monitoring units detected concurrent use of
bandwidth.
Schedules were adjusted when required for specific tests to avoid triggering data usage limits
applied by some ISPs.
The process flow below describes how the raw collected data was processed for the production
of the Measuring Broadband America Report. Researchers and developers interested in
replicating or extending the results of the Report are encouraged to review the process below
and supporting files that provide details.
Raw Data: Raw data for the chosen period is collected from the measurement database. The ISPs
and products that panelists were on are exported to a “unit profile” file, and those
that changed during the period are flagged. 2020 Raw Data Links
Data is cleaned. This includes removing measurements when a user changed ISP or
Validated Data tier during the period. Anomalies and significant outliers are also removed at this
Cleansing: point. A data cleansing document describes the process in detail. 2020 Data Cleansing
Document Link
SQL Processing: Per-unit results are generated for each metric. Time-of-day averages are
computed and a trimmed median is calculated for each metric. The SQL scripts used here are
contained in SQL processing scripts available with the release of each report. 2020 SQL
Processing Links
Unit Profile: This document identifies the various details of each test unit, including ISP,
technology, service tier, and general location. Each unit represents one volunteer panelist.
The unit IDs were randomly generated, which served to protect the anonymity of the volunteer
panelists. 2020 Unit Profile Link
Excluded Units: A listing of units excluded from the analysis due to insufficient sample size
for that particular ISP's speed tier. 2020 Excluded Units Link
Unit Census Block: This step identifies the census block (for blocks containing more than
1,000 people) in which each unit running tests is located. Census blocks are taken from the
2010 Census and are in the FIPS code format. We have used block FIPS codes for blocks that
contain more than 1,000 people. For blocks with fewer than 1,000 people we have aggregated to
the next highest level, i.e., tract, and used the tract FIPS code, provided there are more
than 1,000 people in the tract. In cases where there are fewer than 1,000 people in a tract
we have aggregated to the regional level. 2020 Unit Census Block Link
Excel Tables & Charts: Summary data tables and charts in Excel are produced from the
averages. These are used directly in the report. 2020 Statistical Averages Links
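As a sketch of the trimmed-median step in the SQL processing described above (the trim fraction shown here is an assumption for illustration; the actual thresholds are defined in the program's published SQL scripts):

```python
from statistics import median

def trimmed_median(values, trim=0.01):
    # Discard the lowest and highest `trim` fraction of samples, then
    # take the median of what remains.
    s = sorted(values)
    k = int(len(s) * trim)
    return median(s[k:len(s) - k] if k else s)

# A single extreme outlier no longer skews the result:
samples = [1] + [10] * 8 + [1000]
robust = trimmed_median(samples, trim=0.1)  # drops the 1 and the 1000
```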
The raw data collected for each active metric is made available by month in tarred gzipped files.
The files in the archive containing active metrics are described in Table 9.
curr_dlping.csv
unit_id Unique identifier for an individual unit
dtime Time test finished
target Target hostname or IP address
33 This data dictionary is also available on the FCC Measuring Broadband America website, located with the other
validated data files available for download.
curr_lct_dl.csv
unit_id Unique identifier for an individual unit
dtime Time test finished in UTC
curr_lct_ul.csv
unit_id Unique identifier for an individual unit
dtime Time test finished in UTC
target Target hostname
address Target IP address
packets_received Total number of packets received
packets_sent Total number of packets sent
packet_size Packet size
bytes_total Total number of bytes
duration Duration of the test in microseconds
bytes_sec Throughput in bytes/sec
error_code An internal error code from the test.
successes Number of successes (always 1 or 0 for this test)
failures Number of failures (always 1 or 0 for this test)
location_id Please ignore (this is an internal key mapping to
unit profile data)
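To illustrate how the throughput fields above relate to one another, the sketch below parses a file shaped like the curr_lct_ul.csv dictionary (the sample row is invented, and only a subset of the documented columns is included):

```python
import csv
import io

# Hypothetical one-row file using a subset of the columns documented above.
sample = io.StringIO(
    "unit_id,dtime,bytes_total,duration,bytes_sec\n"
    "386,2020-09-01 21:00:04,12500000,1000000,12500000\n"
)
row = next(csv.DictReader(sample))

# bytes_sec is throughput in bytes/sec; convert to megabits per second.
mbps = float(row["bytes_sec"]) * 8 / 1_000_000

# duration is in microseconds, so bytes_total divided by the duration in
# seconds should agree with the reported bytes_sec.
derived_bytes_sec = float(row["bytes_total"]) / (float(row["duration"]) / 1_000_000)
```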
5 - REFERENCE DOCUMENTS
PLEASE READ THESE TERMS AND CONDITIONS CAREFULLY. BY APPLYING TO BECOME A PARTICIPANT
IN THE BROADBAND COMMUNITY PANEL AND/OR INSTALLING THE WHITEBOX, YOU ARE AGREEING TO
THESE TERMS AND CONDITIONS.
1. Interpretation
1.1. The following definitions and rules of interpretation apply to these terms & conditions.
Connection: the Participant's own broadband internet connection, provided by an Internet Service
Provider ("ISP").
Connection Equipment: the Participant's broadband router or cable modem, used to provide the
Participant's Connection.
Intellectual Property Rights: all patents, rights to inventions, utility models, copyright and related rights,
trademarks, service marks, trade, business and domain names, rights in trade dress or get-up, rights in
goodwill or to sue for passing off, unfair competition rights, rights in designs, rights in computer software,
database right, moral rights, rights in confidential information (including know-how and trade secrets)
and any other intellectual property rights, in each case whether registered or unregistered and including
all applications for and renewals or extensions of such rights, and all similar or equivalent rights or forms
of protection in any part of the world.
ISP: the company providing broadband internet connection to the Participant during the term of this
Program.
Participant/You/Your: the person who volunteers to participate in the Program, under these terms and
conditions. The Participant must be the named account holder on the Internet service account with the
ISP.
Participant's Equipment: any equipment, systems, cabling or facilities provided by the Participant and
used directly or indirectly in support of the Services, excluding the Connection Equipment.
Requirements: the requirements specified by SamKnows as part of the sign-up process that the
Participant must fulfil in order to be selected to receive the Services.
SamKnows/We/Our: the organization providing the Services and conducting the Program, namely:
SamKnows Limited (Co. No. 6510477) of 25 Harley Street, London W1G 9BR
Services / Program: the performance and measurement of certain broadband and Internet services and
research program (Broadband Community Panel), as sponsored by the Federal Communications
Commission (FCC), in respect of measuring broadband Internet Connections.
Software: the software that has been installed and/or remotely uploaded onto the Whitebox, by
SamKnows as updated by SamKnows, from time to time, but not including any Open Source Software.
Whitebox: the hardware supplied to the Participant by SamKnows with the Software.
1.2. Headings in these terms and conditions shall not affect their interpretation.
1.3. A person includes a natural person, corporate or unincorporated body (whether or not having
separate legal personality).
1.4. Any obligation in these terms and conditions on a person not to do something includes, without
limitation, an obligation not to agree, allow, permit or acquiesce in that thing being done.
2.1 Subject to the Participant complying fully with these terms and conditions, SamKnows shall use
reasonable care to:
(a) provide the Participant with the Measurement Services under these terms and conditions;
(c) if requested, SamKnows will provide a pre-paid postage label for the Whitebox to be returned.
(d) comply with all applicable United States, European Union, and United Kingdom privacy laws and
directives, and will access, collect, process and distribute the information according to the following
principles:
Specific purpose: We will access, collect, process, store and distribute data for the purposes and reasons
specified in this agreement and not in ways incompatible with those purposes;
Restricted: We will restrict our data collection and use practices to those adequate and relevant, and not
excessive in relation to the purposes for which we collect the information;
Accurate: We will work to ensure that the data we collect is accurate and up-to-date, working with
Participant and his/her ISP;
Destroyed when obsolete: We will not maintain personal data longer than is necessary for the purposes
for which we collect and process the information;
Security: We will collect and process the information associated with this trial with adequate security
through technical and organizational measures to protect personal data against destruction or loss,
alteration, unauthorized disclosure or access, in particular where the processing involves the transmission
of data over a network.
(a) provide Participant with access to a Program-specific customer services email address, which the
Participant may use for questions and to give feedback and comments;
(b) provide Participant with a unique login and password in order to access to an online reporting system
for access to Participant's broadband performance statistics.
(c) provide Participant with a monthly email with their specific data from the Program or notifying
Participant that their individual data is ready for viewing;
(d) provide Participant with support and troubleshooting services in case of problems or issues with their
Whitebox;
(e) notify Participant of the end of the FCC-sponsored Program and provide a mechanism for Participant
to opt out of any further performance/measuring services and research before collecting any data after
termination of the Program;
(f) use only data generated by SamKnows through the Whitebox, and not use any Participant data for
measuring performance without Participant's prior written consent; and
(g) not monitor/track Participant's Internet activity without Participant's prior written consent.
2.3 While SamKnows will make all reasonable efforts to ensure that the Services cause no disruption to
the performance of the Participant's broadband Connection, including by running tests only when there
is no concurrent network activity generated by users at the Participant's location, the Participant
acknowledges that the Services may occasionally impact the performance of the Connection and agrees
to hold SamKnows and their ISP harmless for any impact the Services may have on the performance of
their Connection.
3. Participant's Obligations
3.1 The Participant is not required to pay any fee for the provision of the Services by SamKnows or to
participate in the Program.
(a) connect the Whitebox to their Connection Equipment within 14 days of receiving it;
(b) not to unplug or disconnect the Whitebox unless (i) they will be absent from the property in which it
is connected for more than 3 days and/or (ii) it is reasonably necessary for maintenance of the
Participant's Equipment and the Participant agrees that they shall use reasonable endeavors to minimize
the length of time the Whitebox is unplugged or disconnected;
(c) in no way reverse engineer, tamper with, dispose of or damage the Whitebox, or attempt to do so;
(d) notify SamKnows within 7 days in the event that they change their ISP or their Connection tier or
package (for example, downgrading/upgrading to a different broadband package), to the email address
provided by SamKnows;
(e) inform SamKnows of a change of postal or email address by email; within 7 days of the change, to the
email address provided by SamKnows;
(f) agrees that the Whitebox may be upgraded to incorporate changes to the Software and/or additional
tests at the discretion of SamKnows, whether by remote uploads or otherwise;
(g) on completion or termination of the Services, return the Whitebox to SamKnows by mail, if requested
by SamKnows. SamKnows will provide a pre-paid postage label for the Whitebox to be returned;
(h) be an active part of the Program and as such will use all reasonable endeavors to complete the market
research surveys received within a reasonable period of time;
(i) not publish data, give press or other interviews regarding the Program without the prior written
permission of SamKnows; and
(k) contact SamKnows directly, and not your ISP, in the event of any issues or problems with the Whitebox,
by using the email address provided by SamKnows.
3.4 The Participant acknowledges that he/she is not an employee or agent of, or relative of, an employee
or agent of an ISP or any affiliate of any ISP. In the event that they become one, they will inform
SamKnows, who at its complete discretion may ask for the immediate return of the Whitebox.
3.5 THE PARTICIPANT'S ATTENTION IS PARTICULARLY DRAWN TO THIS CONDITION. The Participant
expressly consents to having their ISP provide to SamKnows and the Federal Communications
Commission (FCC) information about the Participant's broadband service, for example: service address,
speed tier, local loop length (for DSL customers), equipment identifiers and other similar information,
and hereby waives any claim that its ISP's disclosure of such information to SamKnows or the FCC
constitutes a violation of any right or privilege that the Participant may have under any federal, state
or local statute, law, ordinance, court order, administrative rule, order or regulation, or other applicable
law, including, without limitation, under 47 U.S.C. §§ 222 and 631 (each a "Privacy Law"). If
notwithstanding Participant's consent under this Section 3.5, Participant, the FCC or any other party
brings any claim or action against any ISP under a Privacy Law, upon the applicable ISP's request
SamKnows promptly shall cease collecting
data from such Participant and remove from its records all data collected with respect to such Participant
prior to the date of such request, and shall not provide such data in any form to the FCC. The Participant
further consents to transmission of information from this Program Internationally, including the
information provided by the Participant's ISP, specifically the transfer of this information to SamKnows in
the United Kingdom, SamKnows' processing of it there and return to the United States.
4.1 All Intellectual Property Rights relating to the Whitebox are the property of its manufacturer. The
Participant shall use the Whitebox only to allow SamKnows to provide the Services.
4.2 As between SamKnows and the Participant, SamKnows owns all Intellectual Property Rights in the
Software. The Participant shall not translate, copy, adapt, vary or alter the Software. The Participant shall
use the Software only for the purposes of SamKnows providing the Services and shall not disclose or
otherwise use the Software.
4.3 Participation in the Broadband Community Panel gives the participant no Intellectual Property Rights
in the Test Results. Ownership of all such rights is governed by Federal Acquisition Regulation Section
52.227-17, which has been incorporated by reference in the relevant contract between SamKnows and
the FCC. The Participant hereby acknowledges and agrees that SamKnows may make such use of the Test
Results as is required for the Program.
4.4 Certain core testing technology and aspects of the architectures, products and services are developed
and maintained directly by SamKnows. SamKnows also implements various technical features of the
measurement services using particular technical components from a variety of vendor partners including:
NetGear, Measurement Lab, TP-Link.
5. SamKnows' Property
6.1 This condition 6 sets out the entire financial liability of SamKnows (including any liability for the acts
or omissions of its employees, agents, consultants, and subcontractors) to the Participant, including,
without limitation, in respect of:
(a) any use made by the Participant of the Services, the Whitebox and the Software or any part of them;
and
(b) any representation, statement or tortious act or omission (including negligence) arising under or in
connection with these terms and conditions.
6.2 All implied warranties, conditions and other terms implied by statute or other law are, to the fullest
extent permitted by law, waived and excluded from these terms and conditions.
6.3 Notwithstanding the foregoing, nothing in these terms and conditions limits or excludes the liability
of SamKnows:
(a) for death or personal injury resulting from its negligence or willful misconduct;
(b) for any damage or liability incurred by the Participant as a result of fraud or fraudulent
misrepresentation by SamKnows;
(d) in relation to any other liabilities which may not be excluded or limited by applicable law.
6.4 Subject to condition 6.2 and condition 6.3, SamKnows' total liability in contract, tort (including
negligence or breach of statutory duty), misrepresentation, restitution or otherwise arising in connection
with the performance, or contemplated performance, of these terms and conditions shall be limited to
$100.
6.5 In the event of any defect or modification in the Whitebox, the Participant's sole remedy shall be the
repair or replacement of the Whitebox at SamKnows' reasonable cost, provided that the defective
Whitebox is safely returned to SamKnows, in which case SamKnows shall pay the Participant's reasonable
postage costs.
6.6 The Participant acknowledges and agrees that these limitations of liability are reasonable in all the
circumstances, particularly given that no fee is being charged by SamKnows for the Services or
participation in the Program.
7.1 The Participant acknowledges and agrees that his/her personal data, such as service tier, address and
line performance, will be processed by SamKnows in connection with the program.
7.2 Except as required by law or regulation, SamKnows will not provide the Participant's personal data to
any third party without obtaining Participant's prior consent. However, for the avoidance of doubt, the
Participant acknowledges and agrees that, subject to the privacy policies discussed below, the specific
technical characteristics of tests and other technical features associated with the Internet Protocol
environment of the architecture, including the client's IP address, may be shared with third parties as
necessary to conduct the Program and all aggregate statistical data produced as a result of the Services
(including the Test Results) may be provided to third parties.
7.3 You acknowledge and agree that SamKnows may share some of Your information with Your ISP, and
request information about You from Your ISP so that they may confirm Your service tiers and other
information relevant to the Program. Accordingly, You hereby expressly waive any claim that any
disclosure by Your ISP to SamKnows constitutes a violation of any right or privilege that You may have
under any law, wherever it might apply.
8.1 This Agreement shall continue until terminated in accordance with this clause.
8.2 Each party may terminate the Services immediately by written notice to the other party at any
time. Notice of termination may be given by email. Notices sent by email shall be deemed to be served
on the day of transmission if transmitted before 5.00 pm Eastern Time on a working day, but otherwise
on the next following working day.
(a) SamKnows shall have no further obligation to provide the Services; and
(b) the Participant shall safely return the Whitebox to SamKnows, if requested by SamKnows, in which
case SamKnows shall pay the Participant's reasonable postage costs.
8.4 Notwithstanding termination of the Services and/or these terms and conditions, clauses 1, 3.3 and 4
to 14 (inclusive) shall continue to apply.
9. Severance
10.1 These terms and conditions constitute the whole agreement between the parties and replace and
supersede any previous agreements or undertakings between the parties.
10.2 Each party acknowledges that, in entering into these terms and conditions, it has not relied on, and
shall have no right or remedy in respect of, any statement, representation, assurance or warranty.
11. Assignment
11.1 The Participant shall not, without the prior written consent of SamKnows, assign, transfer, charge,
mortgage, subcontract all or any of its rights or obligations under these terms and conditions.
11.2 Each party that has rights under these terms and conditions acknowledges that they are acting on
their own behalf and not for the benefit of another person.
Nothing in these terms and conditions is intended to, or shall be deemed to, constitute a partnership or
joint venture of any kind between any of the parties, nor make any party the agent of another party for
any purpose. No party shall have authority to act as agent for, or to bind, the other party in any way.
Except for the rights and protections conferred on ISPs under these Terms and Conditions which they may
defend, a person who is not a party to these terms and conditions shall not have any rights under or in
connection with these Terms and Conditions.
14.1 For the avoidance of doubt, the release of IP protocol addresses of clients' Whiteboxes is not PII
for the purposes of this program, and the client expressly consents to the release of IP addresses and
other technical IP protocol characteristics that may be gathered within the context of the testing
architecture.
SamKnows, on behalf of the FCC, is collecting and storing broadband performance information, including
various personally identifiable information (PII) such as the street addresses, email addresses, sum of data
transferred, and broadband performance information, from those individuals who are participating
voluntarily in this test. PII not necessary to conduct this study will not be collected. Certain information
provided by or collected from you will be confirmed with a third party, including your ISP, to ensure a
representative study and otherwise shared with third parties as necessary to conduct the
program. SamKnows will not release, disclose to the public, or share any PII with any outside entities,
including the FCC, except as is consistent with the SamKnows privacy policy or these Terms and
Conditions. See [Link] The broadband performance
14.2 The FCC is soliciting and collecting this information authorized by OMB Control No. 3060-1139 in
accordance with the requirements and authority of the Paperwork Reduction Act, Pub. L. No. 96-511, 94
Stat. 2812 (Dec. 11, 1980); the Broadband Data Improvement Act of 2008, Pub. L. No. 110-385, Stat 4096
§ 103(c)(1); American Reinvestment and Recovery Act of 2009 (ARRA), Pub. L. No. 111-5, 123 Stat 115
(2009); and Section 154(i) of the Communications Act of 1934, as amended.
14.3 Paperwork Reduction Act of 1995 Notice. We have estimated that each Participant of this study will
assume a one-hour time burden over the course of the Program. Our estimate includes the time to sign
up online, connect the Whitebox in the home, and periodically validate the hardware. If you have any
comments on this estimate, or on how we can improve the collection and reduce the burden it causes
you, please write the Federal Communications Commission, Office of Managing Director, AMD-PERM,
Washington, DC 20554, Paperwork Reduction Act Project (3060-1139). We will also accept your comments
via the Internet if you send an e-mail to PRA@[Link]. Please DO NOT SEND COMPLETED APPLICATION
FORMS TO THIS ADDRESS. You are not required to respond to a collection of information sponsored by
the Federal government, and the government may not conduct or sponsor this collection, unless it
displays a currently valid OMB control number and provides you with this notice. This collection has been
assigned an OMB control number of 3060-1139. THIS NOTICE IS REQUIRED BY THE PAPERWORK
REDUCTION ACT OF 1995, PUBLIC LAW 104-13, OCTOBER 1, 1995, 44 U.S.C. SECTION 3507. This notice
may also be found at [Link]
15. Jurisdiction
These terms and conditions shall be governed by the laws of the state of New York.
SCHEDULE
THE SERVICES
Subject to the Participant complying with its obligations under these terms and conditions, SamKnows
shall use reasonable endeavors to test the Connection so that the following information is recorded:
1. Web browsing
2. Video streaming
3. Voice over IP
4. Download speed
5. Upload speed
6. UDP latency
7. UDP packet loss
8. Consumption
1. SamKnows will perform tests on the Participant's Connection by using SamKnows' own data and will
not monitor the Participant's content or internet activity. The purpose of this study is to measure the
Connection and compare this data with other consumers to create a representative index of US
broadband performance.
WHEREAS the Federal Communications Commission of the United States of America (FCC) is
conducting a Broadband Testing and Measurement Program, with support from its contractor
SamKnows, the purpose of which is to establish a technical platform for the Measuring
Broadband America Program Fixed Broadband Testing and Measurement and further to use
that platform to collect data;
WHEREAS volunteer panelists have been recruited, and in so doing have agreed to provide
broadband performance information measured on their Whiteboxes to support the collection
of broadband performance data; and steps have been taken to protect the privacy of panelists
in the program's effort to measure broadband performance. WE, THE UNDERSIGNED, as
participants and stakeholders in that Fixed Broadband Testing and Measurement, do hereby
agree to be bound by and conduct ourselves in accordance with the following principles and
shall:
3.1. It shall not be a violation of this principle for broadband providers to:
3.1.1. Operate and manage their business, including modifying or improving services
delivered to any class of subscribers that may or may not include panelists
among them, provided that such actions are consistent with normal business
practices, and
3.1.2. Address service issues for individual panelists at the request of the panelist or
based on information not derived from the trial;
3.2. It shall not be a violation of this principle for academic and research purposes to
simulate or observe tests and components of the testing architecture, provided that no
impact to MBA data or the Internet Service of the subscriber volunteer panelist occurs;
and
4. Not publish any data generated by the tests, nor make any public statement based on such
data, until such time as the FCC releases data, or except where expressly permitted by the
FCC; and
5. Not publish or make use of any test data or testing infrastructure in a manner that would
significantly reduce the anonymity of collected data, compromise panelists' privacy, or
compromise the MBA privacy policy governing collection and analysis of data, except that:
5.1. It shall not be a violation of this principle for stakeholder signatories under the
direction of the FCC to:
5.1.1. Make use of test data or testing infrastructure to support the writing of FCC
fixed Measuring Broadband America Reports;
5.1.2. Make use of test data or testing infrastructure to support various aspects of
the testing and architecture for the program including to facilitate data
processing or analysis;
5.1.3. Make use of test data or testing infrastructure to support the analysis of
collected data or testing infrastructure for privacy risks or concerns, and plan
for future measurement efforts;
6. Ensure that their employees, agents, and representatives, as appropriate, act in accordance
with this Code of Conduct.
Signatories: _____________________
Printed: ______________________
Date: _______________________
August 2013
Important Notice
Limitation of Liability
The information contained in this document is provided for general information purposes only.
While care has been taken in compiling the information herein, SamKnows does not warrant or
represent that this information is free from errors or omissions. To the maximum extent
permitted by law, SamKnows accepts no responsibility in respect of this document and any loss
or damage suffered or incurred by any person relying for any reason on any of the information
provided in this document and for acting, or failing to act, on any information contained in or
referred to in this document.
Copyright
The material in this document is protected by Copyright.
Alternatively, it is possible to override test node selection based on latency and implement a
static configuration so that the Whitebox will only test against the test node chosen by the
Administrator. This is so that the Administrator can choose to test any particular test node that
is of interest to the specific project and also to maintain configuration consistency. Similarly, test
node selection may be done on a scheduled basis, alternating between servers, to collect test
data from multiple test nodes for comparison purposes.
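The selection behaviors described above (lowest-latency default, static override, scheduled alternation) can be sketched as follows; the node names and structure are illustrative, not SamKnows' actual configuration:

```python
from itertools import cycle

def choose_node(latencies_ms, static_override=None):
    # A static override pins testing to an administrator-chosen node;
    # otherwise pick the node with the lowest measured latency.
    if static_override is not None:
        return static_override
    return min(latencies_ms, key=latencies_ms.get)

nodes = {"nyc1": 12.4, "chi1": 28.9, "dal1": 41.2}
best = choose_node(nodes)                            # latency-based selection
pinned = choose_node(nodes, static_override="dal1")  # administrator override

# Scheduled alternation between servers can be a simple rotation:
rotation = cycle(sorted(nodes))
schedule = [next(rotation) for _ in range(4)]
```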
SamKnows test nodes reside in major peering locations around the world. Test nodes are
carefully sited to ensure optimal connectivity on a market-by-market basis. SamKnows' test
infrastructure utilizes nodes made available by Level3, Measurement-Lab, Stackpath and various
network operators, as well as under contract with select hosting providers.
34 Note that Measurement-Lab runs sidestream measurements for all TCP connections against their test nodes and
publishes these data in accordance with their data embargo policy.
Furthermore, SamKnows maintains its own test nodes, which are separate from the test nodes
provided by Measurement-Lab and Level3 and Stackpath.
Table 1 below shows the locations of the SamKnows test node architecture supporting the
Measuring Broadband America Program.35 All of these listed test nodes reside outside individual
ISP networks and therefore are designated as off-net test nodes. Note that in many locations
there are multiple test nodes installed, which may be connected to different providers.
Table 1: Test node locations include Atlanta, Georgia; Chicago, Illinois; Dallas, Texas;
Mountain View, California; Seattle, Washington; Washington, D.C.; Washington, Virginia; and
Denver, Colorado.
35 In addition to the test nodes used to support the Measuring Broadband America Program, SamKnows utilizes a
diverse fleet of nodes in locations around the globe for other international programs.
SamKnows also has access to many test nodes donated by ISPs around the world. These particular
test nodes reside within individual ISP networks and are therefore considered on-net test nodes.
ISPs have the advantage of measuring to both on-net and off-net test nodes, which allows them
to segment end-to-end network performance and determine the performance of their own
network versus third party networks. For example, an ISP can see what impact third party
networks have on their end-users' Quality of Experience ('QoE') by placing test nodes within their
own network and at major national and international peering locations.
Diagram 1 below shows this set-up.
Both the on-net and off-net test nodes are monitored by SamKnows as part of the global test
node fleet. Test node management is explained in more detail within the next section of this
document.
3 - Test Node Management
SamKnows test node infrastructure is a critical element of the SamKnows global measurement
platform and includes extensive monitoring. SamKnows uses a management tool to
control and configure the test nodes, while the platform is closely scrutinized using the Nagios
monitoring application. System alerts are also in place to ensure the test node infrastructure is
always available and operating well within expected threshold bounds.
The SamKnows Operations team continuously checks all test nodes to monitor capacity and
overall health. Also included is data analysis to safeguard data accuracy and integrity. This level
of oversight not only helps to maintain a healthy, robust platform but also allows us to spot and
flag actual network issues and events as they happen. Diagnostic information also supports the
Program managers’ decision-making process for managing the impact of data accuracy and
integrity incidents. This monitoring and administration is fully separate from any monitoring and
administration of operating systems and platforms that may be necessary by hosting entities with
which SamKnows may be engaged.
SamKnows uses the Puppet configuration management tool to manage hundreds of test nodes and
ensure that each group of test nodes is configured properly per each project's requirements.
Puppet uses a low-overhead agent installed on each test node that regularly communicates with
the controlling SamKnows server to check for updates and ensure the integrity of the
configuration.
This method of managing our test nodes allows us to deal with the large number of test nodes
without affecting the user’s performance in any way. We are also able to quickly and safely make
changes to large parts of our test node fleet while ensuring that only the relevant test nodes are
updated. This also allows us to keep a record of changes and rapidly troubleshoot any potential
problems.
SamKnows maintains a standard specification for all test nodes to ensure consistency and
accuracy across the fleet.
SamKnows regularly checks its rulesets to ensure that there are no outdated rules and that the
access restriction is up to date.
SamKnows accounts on each test node are restricted to the systems administration team by
default. When required for further work, an authorized SamKnows employee will have an
account added.
5 - Test Node Provisioning
SamKnows also has a policy of accepting test nodes provided by network operators, provided
that:
• The test node meets the specifications outlined earlier
• A minimum of 1 Gbps of upstream and downstream connectivity to national peering
locations is provided
Please note that donated test nodes may also be subject to additional local requirements.
The MBA report's methodologies account for varied user performance by focusing on median speed metrics captured during peak usage periods, which are representative of the typical user experience in high-demand situations. The report includes the distribution of download and upload speed metrics across different ISPs and technologies, providing a comprehensive overview of consumer experiences. Such data allows the FCC to assess performance variability across geography and time, recognizing the distinct experiences among DSL, cable, and fiber users.
M-Lab provides a core network testing infrastructure crucial to the MBA's measurement process. By hosting measurement servers at various strategic locations in the U.S., M-Lab facilitates a standardized platform that collects unbiased performance data reflecting ISPs' true service capabilities, independent of ISP influence. This ensures that broadband performance data is collected under controlled conditions and supports consistent, reliable testing of the ISPs, providing valuable insight into network efficiency and consumer experience.
Network congestion during peak usage periods is primarily affected by the number of users simultaneously using their broadband Internet connections, resulting in higher demand and potential service degradation. The Measuring Broadband America (MBA) program addresses this by focusing on performance metrics during these peak usage times, specifically from 7:00 p.m. to 11:00 p.m. local time. This strategy ensures that the program captures performance data reflective of high-demand scenarios, providing a realistic picture of what consumers can expect during periods of maximum network congestion.
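A minimal sketch of this peak-period filtering, assuming timestamped speed samples in local time (the data layout, boundary handling, and function names here are illustrative assumptions, not the program's exact processing pipeline):

```python
from datetime import datetime
from statistics import median

PEAK_START, PEAK_END = 19, 23  # 7:00 p.m. to 11:00 p.m. local time

def is_peak(ts: datetime) -> bool:
    """True if the measurement falls within the peak usage window."""
    return PEAK_START <= ts.hour < PEAK_END

def peak_median_speed(samples):
    """Median download speed (Mbps) over peak-period samples only."""
    peak = [mbps for ts, mbps in samples if is_peak(ts)]
    return median(peak)

# Hypothetical (timestamp, measured Mbps) samples for one panelist
samples = [
    (datetime(2019, 9, 1, 14, 5), 120.0),   # off-peak, excluded
    (datetime(2019, 9, 1, 19, 30), 95.0),
    (datetime(2019, 9, 1, 20, 45), 88.0),
    (datetime(2019, 9, 1, 22, 10), 101.0),
]
print(peak_median_speed(samples))  # 95.0
```

Note that the off-peak afternoon sample (120 Mbps) is deliberately discarded: including lightly loaded daytime measurements would overstate the speeds consumers see when the network is busiest.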
Geographic and temporal factors significantly impact broadband performance. Variability is observed in the consistency with which ISPs meet advertised speeds across different geographic areas, with technology type influencing performance levels. Temporal factors, such as peak usage periods, exacerbate network congestion, impacting download and upload speeds. The MBA study accounts for these factors by focusing measurements during peak hours and comparing performance across various technologies and regions, thus providing a nuanced understanding of how and when service levels fluctuate.
The hardware-based approach using Whiteboxes ensures more accurate and consistent measurement of broadband performance by focusing directly on network performance without interference from user-side variables such as device limitations or other active devices on the network. This method provides precise data collection unaffected by the heterogeneity of endpoint devices, offering a clearer picture of the ISP's service performance. It supports a more rigorous and controlled measurement environment, crucial for deriving meaningful insights into network performance and for policymaking.
'On-net' test nodes are located within an ISP's network, which can introduce bias by favoring the ISP's network performance. Conversely, 'off-net' test nodes located outside the ISP's network ensure a more unbiased comparison of ISP performance by standardizing the testing conditions across different ISPs. The MBA program primarily uses off-net nodes to mitigate biases and provide a fair assessment of network performance. However, having both on-net and off-net data allows for a comprehensive evaluation of performance degradation when traffic exits the ISP's network and ensures the integrity of off-net measurements.
Complementary cumulative distribution metrics provide insight into the proportion of users experiencing a certain level of performance relative to advertised speeds. These metrics help illustrate how many users receive service at or above specific speed thresholds, revealing performance consistency across different technologies and ISPs. For instance, the steeper curves for cable and fiber broadband compared to DSL indicate greater consistency and higher performance levels. This visualization aids policymakers and consumers in assessing the reliability of ISP services.
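Each point on such a complementary cumulative distribution curve is simply the share of panelists whose measured-to-advertised speed ratio meets a given threshold. A minimal sketch (the ratios below are hypothetical, not drawn from the report's data):

```python
def ccdf_at(ratios, threshold):
    """Fraction of panelists whose measured/advertised speed ratio
    meets or exceeds the threshold: one point on the CCDF curve."""
    return sum(r >= threshold for r in ratios) / len(ratios)

# Hypothetical ratios of median measured speed to advertised speed
ratios = [1.05, 0.98, 0.92, 1.10, 0.78, 0.85, 1.02, 0.60]

for t in (0.80, 0.95):
    print(f"share at or above {t:.0%} of advertised: {ccdf_at(ratios, t):.2f}")
```

Evaluating `ccdf_at` over a sweep of thresholds traces out the full curve; a technology whose curve stays high and drops off sharply (as described for cable and fiber) delivers its advertised speed to most panelists, while a long gradual decline (as described for DSL) signals inconsistent delivery.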
The program addresses privacy concerns by having panelists explicitly opt into the program and by processing personal data in compliance with relevant U.S. laws and internal policies governing privacy. The data collection and processing protocols ensure that no personally identifiable information (PII) is disclosed or stored without consent. Furthermore, detailed informed-consent forms, reviewed by legal counsel, ensure that panelists are fully informed and protected. These measures help mitigate privacy risks while maintaining the integrity and reliability of the study's findings.
In the MBA Report, fiber and cable technologies outperformed DSL in terms of consistency with advertised speeds. Approximately 80% of cable subscribers and 60% of fiber subscribers experienced median download speeds that exceeded advertised speeds, whereas only 30% of DSL subscribers reported matching or exceeding advertised speeds. This indicates that cable and fiber technologies generally deliver more consistent performance compared to DSL.
Software-based measurement approaches often struggle to accurately record higher-speed service tiers due to limitations of the computing platform and software capability. Such methods also cannot verify whether other devices on the network are active during tests, leading to potential inaccuracies in measuring true network performance under peak load conditions. These limitations are why the MBA program employs hardware-based methods to obtain more reliable results.