Proving a Negative

The RUS has new funds to spend on connecting unserved areas. But how can applicants prove an area is unserved?


The Rural Utilities Service must decide where to allocate funds for its new rural broadband pilot program. Congress directed that the funds be used only in areas unserved by broadband, defined in the first year as areas in which 90 percent of the households lack 10 Mbps/1 Mbps service. Prioritizing spending in areas most in need is good policy, and if the government published a list of such areas, the RUS could direct applicants to that list.

Alas, no such list, no such data, no such maps exist. In the words of Sen. Jon Tester, D-Mont., “The maps stink” – by which he means the National Broadband Map overstates service coverage and is generally unreliable.

If the RUS relies on the National Broadband Map and excludes all areas in which federal money is already being spent or an ISP claims 10 Mbps/1 Mbps service, only about a quarter million households in the country will qualify for the RUS program. This strains credulity.

It is left to the RUS and potential applicants to develop methods to determine which areas lack 10 Mbps/1 Mbps service availability to at least 90 percent of the households. As Congress has directed the RUS to coordinate its broadband activities with the FCC, focusing on applying FCC rules and protocols to the RUS broadband program may be helpful.


The FCC does not rely solely on the National Broadband Map when it spends the public’s money on broadband networks, and neither should the RUS. The FCC has long used its own testing program, Measuring Broadband America (MBA). Earlier this year, the FCC adopted a testing protocol (https://docs.fcc.gov/public/attachments/DA-18-710A1.pdf) to ensure that the public’s funds are spent on actual, consistent, measurable speeds. The National Broadband Map and the underlying Form 477 data, by contrast, reflect self-reported, advertised speeds. (Facilities-based broadband providers file Form 477 semiannually to show where they provide service.)

Having spent years analyzing 477 data and MBA data, I can say with some confidence that where 10 Mbps/1 Mbps is the maximum advertised speed, testing will show that the service would not meet the FCC’s testing protocol for 10 Mbps/1 Mbps under the Connect America Fund. That may sound contradictory, but when the 477 data says 10 Mbps/1 Mbps is the maximum speed, it generally means the actual, consistent, measurable speed is less than 10 Mbps/1 Mbps.

This distinction is the whole enchilada. Maximum advertised speed is typically a ceiling; the FCC’s testing protocols ensure a floor. Congress offered no judgment on ceilings versus floors in the broadband legislation. However, given the attention it recently paid to the inadequacy of the National Broadband Map, Congress is unlikely to want the RUS to use theoretical, sporadic, untested broadband speeds. The RUS should adopt actual speeds, rather than advertised speeds, as the standard for determining whether areas are served or unserved. 


The FCC does not use speed in isolation as a measure of broadband, and neither should the RUS. Speed alone does not define broadband. A speed of 10 Mbps/1 Mbps with only a 10 GB per month data cap is not broadband. The average household use of broadband has grown a hundredfold over the past decade to 200 GB per month.

Similarly, 10 Mbps/1 Mbps with a latency of 750 milliseconds is not broadband because voice and other key broadband applications require low-latency networks. Nor is 10 Mbps/1 Mbps broadband if it is priced at $100 per month. Rural households are generally more impoverished than the national average, and putting the price of an essential service out of reach effectively means it is not available.

Speed, capacity, latency and affordability are all components of the FCC’s requirements for spending public funds on broadband networks. In the recently concluded Connect America Fund auction, the baseline tier for bidding included 25 Mbps/3 Mbps speed, 160 GB per month capacity, 100 milliseconds or better latency and a price point no greater than the average price for such service in urban areas. The optimal tier in the auction (for which there was no bidding penalty) was 1 Gbps, 2 TB, 100 milliseconds and the average urban price. Similarly, the RUS should indicate that an area is unserved if the internet services available fail any of those components.

The RUS could adopt a definition of “unserved” that includes all four components. For example, served could mean 10 Mbps/1 Mbps or faster speeds, 160 GB or greater monthly capacity, 100 milliseconds or better latency, and a price of $55 per month or less.
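As a sketch, that four-part test could be expressed as a simple check. The thresholds below (including the $55 price point) are the illustrative figures from the example definition above, not adopted FCC or RUS rules:

```python
def is_served(down_mbps, up_mbps, cap_gb, latency_ms, price_usd):
    """Return True only if an offering meets all four broadband components.

    Illustrative thresholds from the example definition in the text:
    10/1 Mbps speed, 160 GB monthly capacity, 100 ms latency, $55/month.
    Failing any single component means the offering is not broadband.
    """
    return (down_mbps >= 10 and up_mbps >= 1
            and cap_gb >= 160
            and latency_ms <= 100
            and price_usd <= 55)

# A 10/1 Mbps plan with only a 10 GB cap fails the capacity component:
print(is_served(10, 1, cap_gb=10, latency_ms=40, price_usd=50))    # False
print(is_served(25, 3, cap_gb=1024, latency_ms=30, price_usd=50))  # True
```

The point of coding it this way is that the components are conjunctive: a fast but capped, laggy or overpriced offering still leaves the area unserved.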


The FCC’s statutory mandate is universal service, and the RUS programs should adopt universal service as their goal. In some cases, the FCC’s Connect America Fund program obligation calls for reaching just 95 percent of households, but that is still markedly different from service as displayed in the National Broadband Map. The map displays a census block as served by a particular speed if an ISP reports availability to even a single location in that census block. When the underlying technology is distance sensitive, as is the case particularly with copper- and spectrum-based services, the mapping significantly overstates availability. For example, a household 2,000 feet from a telephone company central office or DSLAM might receive 10 Mbps/1 Mbps speeds, but a household in the same census block 10,000 feet from that same central office or DSLAM assuredly will not. The map treats both households as served.

Treating partially served census blocks as if they are fully served will result in an underinvestment in rural America. The appropriate geographic area is the home, not the census block, but data isn’t collected on such a granular basis. This is a real and difficult problem. When the state of New York designed its broadband program, it surveyed ISPs about partially served census blocks and found 25 to 50 percent additional unserved households. However, no such data exists on a national basis.

One way to address the problem of partially served blocks is to use a safe harbor assumption – for example, to treat all partially served blocks as if they were 50 percent served. For more granular evidence, the RUS could adopt a sliding-scale safe harbor between 10 percent and 90 percent of the households in a census block.
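The safe-harbor arithmetic is simple enough to sketch. Assuming the figures above (a flat 50 percent assumption absent evidence, and a 10-to-90 percent sliding-scale band where evidence exists), a minimal version might look like this:

```python
def unserved_households(total_households, claimed_served_share=None):
    """Estimate unserved households in a partially served census block.

    With no granular evidence, apply the flat 50 percent safe harbor.
    With evidence of the served share, clamp it to the 10-90 percent
    sliding-scale band described in the text. Figures are illustrative.
    """
    if claimed_served_share is None:
        served_share = 0.5  # flat safe-harbor assumption
    else:
        served_share = min(max(claimed_served_share, 0.10), 0.90)
    return round(total_households * (1 - served_share))

print(unserved_households(200))        # 100 under the flat safe harbor
print(unserved_households(200, 0.95))  # clamped to 90% served -> 20
```

Clamping at 90 percent matters: even an ISP claiming near-total coverage of a block would still leave a residual count of potentially unserved households, consistent with the New York survey finding of 25 to 50 percent additional unserved households in partially served blocks.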


Another criterion – one the FCC doesn’t use – is evidence of consumer decisions. If, by means of surveys or other outreach, it becomes evident that fewer than 10 percent of the households in an area subscribe to an ISP’s service, the RUS should count the area as unserved. Congress has proscribed funding to areas where broadband is available to more than 10 percent of the households. It would be perfectly reasonable for the RUS to include consumer decisions as one of the demonstrations that an area is served or unserved. 

If an ISP’s service is so poor and unavailing that fewer than 10 percent of households subscribe to it, consumer decisions should be prima facie evidence that broadband is unavailable. Can you think of even one other essential service to which fewer than 10 percent of households subscribe? Can you imagine fewer than 10 percent of households subscribing to electric service? Failure of households to subscribe to a service should be a sufficient demonstration that the National Broadband Map is wrong and service is not available. Let’s not rely on the 477 data in that circumstance. Instead, let’s take the word of the local residents.


If the National Broadband Map can’t be relied on, applicants to the RUS broadband pilot program need a procedure to demonstrate that the areas they propose to serve are currently unserved. I suggest they take the following steps:

  1. Identify the census blocks in the area and download the FCC’s 477 data for those census blocks. The census blocks in which no ISP claims 10 Mbps/1 Mbps service count as unserved.
  2. For census blocks in which an ISP claims 10 Mbps/1 Mbps service, identify the provider(s) and check whether their service offerings meet all four criteria for broadband. If no ISP meets all criteria, count the census block(s) as unserved.
  3. For those census blocks in which an ISP claims 10 Mbps/1 Mbps and the service offering meets the broadband criteria, employ the FCC testing protocols adopted July 6, 2018, specifically with respect to the time of testing, frequency and number of tests to be performed. The tests should be performed at peak times at the locations of subscribers to ISPs that claim 10 Mbps/1 Mbps service. The FCC requirement for the number of test locations depends on the number of ISP subscriber locations, as shown in Table 1.
  4. Using Table 1, determine the number of test locations based on the number of subscribers in the area that will be tested. It may be difficult to get data on the number of an ISP’s subscribers. Though the FCC collects such data, it does not make the data public, except in the aggregate by technology. One could make a good faith assumption that the number of subscribers within the test area is proportionate to the aggregate data. As this number is being used only to establish the number of test locations, such an estimate would be appropriate. If the FCC or the ISP were to make the individual data publicly available, ISP-specific data could be used.
  5. Identify test locations for subscribers at the requisite service level. This step will be the most difficult because, despite the 477 data, there may be few, if any, such subscribers. If a good faith effort to locate subscribers fails to turn up enough to conduct the requisite tests, the area should be considered unserved.
  6. Conduct tests from a sufficient number of test locations to determine the percentage of tests that meet or fail to meet the 10 Mbps/1 Mbps standard. If 90 percent or more locations fail, the area should be considered unserved.
  7. If fewer than 90 percent of the locations fail, use the percentage of failures to calculate the number of unserved households and adjust the area in which the applicant is seeking funding. For example, locate the DSLAMs in an area and determine whether successful tests are at households closer to a DSLAM than failed tests. If so, adjust the proposed application to account for the distance sensitivity of the ISP’s technology.
  8. Prepare an application that can show through a variety of tests and measures that 90 percent of the households in the area lack broadband.
  9. If an ISP contests an application as including an area already served, the RUS should require testing specific to the area and at least as rigorous as the FCC testing protocol. Because the National Broadband Map should no longer carry a presumption of service, contrary evidence gathered by an applicant should shift the presumption. Furthermore, the ISP’s tests should be made available to the applicant and to the community. To the extent that an ISP can demonstrate the requisite service to specific areas in the application, those areas should be removed.
Table 1: FCC Testing Protocols
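The per-block decision in steps 1, 2, 5 and 6 above can be sketched as a single classification routine. This is an assumption-laden illustration, not an FCC or RUS procedure: the data shapes and thresholds (including the four-component criteria and the 90 percent failure cutoff) are the illustrative values used in this article.

```python
def block_status(claims_10_1, offerings, test_results):
    """Classify one census block per the steps above.

    claims_10_1  -- True if any ISP's Form 477 filing claims 10/1 Mbps
    offerings    -- list of dicts describing each ISP's offering
    test_results -- list of booleans, True where a peak-time test met 10/1
    """
    # Step 1: no ISP even claims 10/1 Mbps service in the 477 data.
    if not claims_10_1:
        return "unserved"

    # Step 2: a claim exists, but no offering meets all four criteria
    # (illustrative thresholds from the example definition in the text).
    def meets_criteria(o):
        return (o["down"] >= 10 and o["up"] >= 1 and o["cap_gb"] >= 160
                and o["latency_ms"] <= 100 and o["price"] <= 55)
    if not any(meets_criteria(o) for o in offerings):
        return "unserved"

    # Step 5: a good faith search found no testable subscribers.
    if not test_results:
        return "unserved"

    # Step 6: 90 percent or more of peak-time tests fail the standard.
    fail_rate = test_results.count(False) / len(test_results)
    return "unserved" if fail_rate >= 0.9 else "served"

offer = {"down": 25, "up": 3, "cap_gb": 1024, "latency_ms": 30, "price": 50}
print(block_status(True, [offer], [False] * 9 + [True]))  # unserved
```

Step 7, adjusting the application area using the geography of failed tests (for example, distance from a DSLAM), is deliberately left out: it requires location data that does not fit a simple sketch.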

My perspective on the National Broadband Map comes from time spent working with the data inside the FCC and outside the FCC from the time at which the FCC stopped using the State Broadband Initiative data and switched to the 477 data. The SBI data was inconsistent, and the 477 data is unreliable. Policymakers should not use data they know to be unreliable without allowing for challenges to the data. The steps I have outlined recognize the shortcomings in the data and permit a targeted approach for the RUS broadband pilot.



© 2023 Broadband Properties, LLC
