
CHAPTER II - Use Rights, Testing and Build-Out

Federal Agencies and Spectrum: How to Reallocate Use Rights?
Perhaps the most significant challenge in freeing up more of the spectrum that federal agencies currently hold is devising methods to reallocate spectrum that multiple parties use to other, more efficient and valuable uses. Historically, more than thirty federal agencies used spectrum in the 1755-1850 MHz band, sometimes quite heavily. However, they rarely took account of the actual value of their spectrum in their procurement and usage decisions, said Lawrence Strickling of NTIA. This is largely because agencies have regarded spectrum merely as a communications tool for performing some statutory mission; they do not really have any incentive to experiment with new technologies or innovate with unused portions of a spectrum band, even though OMB Circular A-11 requires agencies to take the value of spectrum into account when making procurement decisions.

“Basically, agencies want two things with respect to spectrum,” said Strickling: “They want to know that they will have the resources to figure out the implications of various spectrum proposals, and they want to know that any outcome will not be rammed down their throats.” The Spectrum Pipeline Act of 2015, which provides $500 million to help agencies with spectrum decision making, helps, but agencies could use more funding to assess how to use spectrum more efficiently. The pressure will only become more intense as the process for reallocating spectrum now held by federal agencies proceeds.

Peter Tenhula, Deputy Associate Administrator for Spectrum Management at the NTIA, noted that Congress first directed agencies to reallocate spectrum from governmental to non-governmental players in 1993. Before the Pipeline Act, NTIA was starting to see “a retrenchment of agency interests and a stiffening of resistance” in some agencies. So while federal agencies have always been keen to assert their interests, the pressures surrounding agencies’ spectrum usage rights have intensified.

That said, Strickling believes “we’re undergoing a cultural change within agencies.” In a paradoxical way, “growing demand for spectrum could be helpful, not harmful, to this situation,” he said. Richard Bennett of High Tech Forum speculated that “it may motivate greater cooperation among agencies in how they will use spectrum, and also greater cooperation between private and public sectors.” Because government demand for spectrum is growing as much as private-sector demand, there seems to be a mutual awareness that everyone needs to be more flexible and consider spectrum-sharing innovations.

The Federal Aviation Administration is one innovator: it is proposing research and development projects that could potentially free up spectrum, with the expectation that it would receive some portion of the proceeds from the auction of that spectrum to fund its own infrastructure needs. As John Leibovitz explained, the FAA would like to consolidate many old radar systems into a single, unified radar system using new technologies. But first it must be able to test the system to see if it works. If the new system works, the FAA could reap billions of dollars from a spectrum auction and use those funds to buy a new system that would improve its radar effectiveness—a win-win scenario.

“The FAA is a prototype example,” said Leibovitz. It shows how to provide incentives to agencies to improve the efficiency of their spectrum usage, or at least eliminate the disincentives to it. “Agencies don’t necessarily want more spectrum,” he explained. “They want more capabilities. If we give federal agencies the freedom to think about these things, it could stimulate greater sharing and access to spectrum.” So how to move this idea forward? Paula Boyd, Director of Government and Regulatory Affairs for Microsoft, noted that “a lot of new sharing mechanisms have been introduced in the spectrum space in recent years. It would be good to assess what has worked, why, and in what context.”

To help provide a broad overview of how spectrum policy might move forward, Blair Levin, Senior Fellow at the Brookings Institution and one of the key architects of the National Broadband Plan, made a presentation outlining what he has learned.

The key premises of any plan going forward should be to promote abundant bandwidth and to avoid inefficient use or non-use of spectrum. The prime barrier to abundant, efficient use is the lack of a practical system for reallocating spectrum. However that occurs, a national broadband plan should include regimes for shared, licensed and unlicensed spectrum. Reallocating spectrum can be extraordinarily difficult, however, because the respective timetables for the regulatory process, technology development and markets at a global scale are not in sync, said Levin.

Since the government is the legal owner of spectrum, it could in principle simply take back unused or underused spectrum. But doing so presents litigation risks and provokes political resistance, which policymakers have sought to avoid by instead using “incentive auctions,” as described above. If a new market entrant sees enough opportunity that it is willing to pay for the spectrum, the incumbent rights-holder can vacate the spectrum band, enabling its reallocation. Spectrum rights have also been transferred through other means, such as eminent domain, overlay and underlay auctions, and bounty rights. Through such means, spectrum once held by satellite services moved to terrestrial services, and broadcast spectrum moved to mobile broadband. It remains unclear which types of providers may ultimately acquire the spectrum that government agencies currently hold. In any case, none of these transfers of spectrum has been fully accomplished, said Levin. But the transfer of government-held spectrum to new entities will pose a major challenge for the new administration.

Levin outlined possible tactics for reallocating government spectrum. They include:

  • a government incentive auction;
  • a GSA for spectrum (similar to the GSA’s management of government real estate);
  • private sector bounties;
  • plans for staged reductions of spectrum use (similar to military base-closing commissions);
  • budget numbers to assign a value to spectrum;
  • a fund to finance spectrum relocations and upgrades; and
  • spectrum sharing.

The task of reallocating spectrum raises many new questions: The first is whether to reallocate spectrum to shared, unlicensed or licensed uses, said Levin. The government must also determine whether spectrum should be allocated band-by-band, under long-term plans or through pre-set allocations. If spectrum is to be shared, then there must be incentives for incumbents to invest in shared usage scenarios, and new management and measurement systems, not to mention processes for complaints and remedies if sharing protocols are violated. No one wants to make big investments in a sharing regime if they will not get adequate returns from it, so ensuring that the ecosystem of spectrum is fully developed—and that returns are thereby generated—is also important.

While dealing with these novel policy questions, many traditional questions remain, said Levin. Should spectrum be allocated to broad, flexible uses or to specific needs? (The proliferation of potential uses such as autonomous vehicles, drones, Internet of Things, and new government uses makes this question more difficult.) How can the spectrum used for different environments be protected? (This requires better interference test metrics, a process for such testing, and a process for complaints and remedies.) And how to incentivize companies in all environments to invest in the new spectrum regime? (Given asymmetric entry points to networks, who will bear what costs, and will the incentives favor licensed over unlicensed spectrum?) Levin also noted that while shared spectrum regimes hold promise, they have not yet been proven to work at scale.

Levin noted that there are other challenges: Spectrum holders must be induced to adjust their use in a fair and timely way. Sharing facilities to make 5G viable must be enabled without hurting competition and investment incentives. Finally, the political expectations of legislators (to raise money for the federal budget) may not be aligned with the policy goals (to provide incentives for efficient spectrum reallocation and competition).

The Need for Better Testing and Evaluation
Levin’s presentation stimulated a deeper discussion about testing and evaluation of spectrum uses. “Interference analysis is a key challenge,” said Dennis Roberson, Professor at the Illinois Institute of Technology. “We need testing that can name and identify what harmful interference really means, technically, in a sharing context. There is no real agreement about that right now in most cases. We’ve got to establish and test these criteria in a rigorous way instead of having warring tests with different criteria and results, often based on folklore.” Roberson stressed that this is essential as spectrum uses become “crammed more closely together.” It will also be essential in establishing criteria for regulatory enforcement.

It would be particularly useful to have actual, on-the-ground data about real spectrum usage to improve the quality of future spectrum policy, said Roberson: “We don’t really know how spectrum is being used. We theorize far too much. We need better instrumentation, particularly in cities, so we can know definitively what the opportunities are.”

However, interest in this issue is increasing dramatically; the National Science Foundation held a workshop on spectrum measurement in the spring of 2016, for example. The hard part is not installing sensors, said Roberson, but “analyzing terabytes of data in meaningful time-frames to meet operational needs. It is a huge Big Data problem.” He explained that while many entities collect data, it is generally different types of data captured in different formats, and it does not necessarily include metadata about the context of the data (when, who, where, etc.), which is needed to properly assess what is captured. Roberson also explained that there is no standard methodology for evaluating the data that is collected.

While “everyone using a network is already measuring utilization as an operational necessity,” said Richard Bennett of High Tech Forum, “any assessments by outsiders are necessarily partial. That’s because the data is in the hands of operators, including measurement chips in electronic devices and people with Wi-Fi access points in their homes. The data is there, but it’s highly distributed and privately held. There are enormous issues in identifying these sources and de-identifying the data. And who is going to pay for aggregation and analysis of the data? That’s the issue.”

To complicate matters further, there are often sensitive national security concerns for certain spectrum bands, which trump all other concerns. For example, spectrum bands used by the Secret Service when the President comes to a city will see a spike in usage. That data cannot and should not be published, though it may be accessible over the airwaves. This is “an overhang on policy,” said Lawrence Strickling. “All the measurement in the world won’t solve those kinds of problems.”

What might be the most appropriate institution to move a testing and evaluation agenda forward? And how would it be established and financed?

Lawrence Strickling of NTIA argued that “industry has to get serious and put money into testing, especially for spectrum sharing.” Strickling’s NTIA colleague, Peter Tenhula, noted that the wireless industry has no equivalent of CableLabs, which does testing and evaluation of cable television standards. He noted that the Center for Advanced Communications (CAC) in Boulder, which the NTIA and the National Institute of Standards and Technology established, serves as “an honest broker for testing and evaluation, but it is still in its infancy.” At the moment, other than CAC, there are no good places to obtain neutral, disinterested evaluations, he said. “More resources for that would be great.”

Some participants argued that individual standards-setting committees ought to reflect a broader diversity of players, including economists and policy experts, and not be populated only by specialists from relatively ingrown technical fields. Technical standards sometimes amount to “the propagation of religious rites,” said one participant, which can result in unrealistic, narrow technical standards that block potentially valuable uses of spectrum by other services, uses that are often either unknown to, or not understood by, the members of the existing standards body.

There was some disagreement over whether an authoritative testing lab ought to be an industry-supported entity or a government lab. One set of voices argued that a government lab faces too many budget uncertainties and time delays in performing analyses, a major issue for industry given the fast pace of technological innovation. On the other hand, said others, an industry-supported lab may lack impartiality and credibility, and may or may not be any better at producing speedy results.

Nonetheless, some participants noted that industry has a growing motivation to develop a greater capacity to test and evaluate technical standards. The eclectic industries now emerging around the Internet of Things realize that they will need access to spectrum if their sectors are to grow; their conferences nearly always include panels on spectrum policy these days. Similarly, consumer-oriented tech companies are trying to anticipate complex regulatory impediments and spectrum policies that might delay new product introductions, said Nicol Turner-Lee, Vice President and Chief Research and Policy Officer for the Multicultural Media, Telecom and Internet Council. Resolving debates about technical performance standards could help speed up the regulatory process and ensure more trustworthy, credible results—and thus minimize the impact on consumers.

How to Encourage Spectrum Build-Out
One shifting challenge in today’s telecommunications environment is how to encourage licensees to make the greatest possible use of their spectrum. In the 1990s, it became clear that the geographic size of licenses mattered. This did not necessarily mean that a licensee made a 100 percent build-out of its spectrum. For example, Verizon reaches 98 percent of the US population by providing service to a large portion of the geographic land mass. It is harder and more expensive to reach the last 1 or 2 percent of the population because they tend to be in sparsely populated regions (about 70 percent of the US population resides in about 40 percent of the land mass).

In terms of maximum build-out for new spectrum allocations, “that can be hard to do when you don’t know what you’re building out,” said Charla Rath, Vice President, Wireless Policy Development for Verizon Communications. Operators will be building 5G networks in many areas where people do not live. So if the FCC uses a population metric to determine service, an operator would get no credit for providing mobile services in locations with high densities of transient crowds but no residences, such as big transportation centers or sports stadia.

Michael Calabrese of New America believes that “we need to be thinking very differently about build-out requirements. It made sense to have coverage requirements for spectrum because that was the public policy goal. But as we move up to higher frequencies, it doesn’t make as much sense,” because usages will be more specialized. An extreme instance, he said, would be the use of millimeter waves, in which build-outs would naturally target high-traffic locations. This issue may be somewhat moot in any case because build-out requirements have not historically been rigorously enforced, he said.

Calabrese said it may make more sense to re-auction spectrum every few years “to avoid fossilization of spectrum usage. Secondary markets have not proven very effective in reallocating spectrum.” Re-auctions could also avoid the need for incentive auctions. But Jonathan Chaplin of New Street Research argued, “You could reach the same objective by getting rid of restrictions on how spectrum may be used—allow flexible use. Then secondary markets could work more flexibly and effectively.”

Enforcement of build-out requirements may be problematic for political reasons, suggested Blair Levin of the Brookings Institution. “Here’s something you never read in the books: As soon as the government starts to enforce build-out requirements, the response of the licensee is always, ‘Oh, I was just about to build out in [state where the current Senate majority leader comes from].’” The penalty for failing to build out is an automatic revocation of the spectrum license. That’s a very strong deterrent, said Levin, but that’s also the reason why it is never invoked. It is “an absolutely destructive weapon.”

Are New Policymaking Approaches Needed?
While the discussion did not propose any specific new policymaking structures, many participants noted the limitations of the legacy system for making spectrum policy. Andrew Clegg, Spectrum Engineering Lead at Google, suggested that current regulatory regimes that separate spectrum only into licensed and unlicensed uses may make it difficult to regulate access to spectrum. It may be necessary to think of ways to overcome the friction that this introduces, he said, such as more flexible allocations, especially in light of 4G/LTE network technologies and Wi-Fi. Charla Rath of Verizon Communications made a similar point, noting that “newer users of spectrum tend to be more flexible [than existing ones], and are more willing to share a band, but the regulatory structure can be too rigid to accommodate innovative use cases.”

Lawrence Strickling of NTIA agreed that an unanswered challenge in the years ahead is “how to devise policy structures that are flexible enough to deal with rapid technological change and unanticipated uses. As a principle, we must keep the capacity to allow different uses and innovations over time—to keep the door open for new ideas.” The policy framework must be capable of evolving, he said.

The rise of new models for accessing digital networks underscores the need for a flexible policymaking framework, said Michael Calabrese of New America. “Users don’t necessarily need a traditional terrestrial cellular network,” he said, noting the emergence of high-capacity satellite networks, HALO, and small-cell wireless networks feeding off high-capacity wireline connections that can tailor access to the needs of a particular area. [HALO, or High Altitude, Long Operation Wi-Fi, consists of networks that will use aircraft at about 52,000 feet above cities to provide broadband service to 60 square-mile areas.] Conversely, Calabrese added, innovations such as high-altitude broadband platforms (e.g., drone aircraft) and meshed satellite constellations are making wired networks less critical for providing basic connectivity.
