
CHAPTER I - Spectrum Challenges Then and Now

In 2009, Congress asked the Federal Communications Commission to develop a National Broadband Plan. In reviewing the landscape, the Plan team determined that the revolution in mobile digital technologies—smartphones, tablets and other devices—would soon outstrip the available supply of electromagnetic spectrum. To address the imbalance between surging demand and finite spectrum capacity, the Plan made several recommendations for increasing access to spectrum for broadband services. These included a two-sided incentive auction to encourage broadcasters to sell their spectrum to users who would put it to greater economic use, chiefly mobile carriers and other innovators. In addition to making more spectrum available for licensed use, the Plan recommended greater license flexibility, sharing of specific spectrum bands and designation of a greater amount of spectrum for unlicensed use.

Since the release of the Plan, the technological and market landscape has changed, creating a new set of spectrum policy challenges for the FCC. Imbalances in spectrum supply and demand have a somewhat different character today than in the past. While the previous focus was on how to allocate scarcely used spectrum, the new focus is on how to reallocate inefficiently used spectrum to more urgent and innovative uses. In addition, the rise of new technologies, such as the Internet of Things (IoT), autonomous vehicles, drones and personal monitoring devices, needs to be considered. These will all require greater access to spectrum, but current regulatory processes for facilitating shifts in spectrum usage need to be studied and improved. While addressing such needs, the FCC must keep advancing the public interest goals of the Communications Act and collaborate closely with a variety of stakeholders to preserve mission-critical safety needs.

To assess these challenges and propose possible solutions, the Aspen Institute Roundtable on Spectrum Policy (AIRS) met for two days at the Aspen Wye River Conference Center in Queenstown, Maryland, from October 23 to 25, 2016. A diverse group of twenty-five researchers, technologists, regulators and spectrum policy experts from industry, academia and nonprofit groups reviewed the history of the National Broadband Plan while discussing how spectrum policy might be adapted to address new technological, marketplace and social realities. The group’s intensive discussions resulted in a set of new policy recommendations for reallocating and improving spectrum usage in the coming decade.

Charles M. Firestone, Executive Director of the Aspen Institute Communications and Society Program, moderated the sessions. This report, by rapporteur David Bollier, provides an interpretive synthesis of the key themes and points of discussion at the conference.

Allocating Spectrum
The Successful History of the National Broadband Plan and the Unmet Challenges
Future efforts to improve allocations, designations and use of spectrum must be understood in the historical context of the past eight years. The 2009 stimulus package—formally, the American Recovery and Reinvestment Act—directed the FCC to create a national plan to improve the deployment, adoption and use of broadband in the US. The plan, released in March 2010, set forth in Chapter 5 a comprehensive strategy to “ensure that there is sufficient, flexible spectrum that accommodates growing demand and evolving technologies.”

An historic convergence of two disruptive technologies—the Internet and mobile communications—prompted the broadband plan. The Obama administration, industry and consumers wanted to ensure that the new technologies could continue to expand and facilitate economic growth and opportunity while also ensuring broad access and affordability for consumers. By facilitating the use of mobile and digital technologies, the plan also sought to enhance healthcare, public safety, community development, education, and energy independence, among other national purposes.

To review the seven-year history of the National Broadband Plan—what we learned from it and what changes may be necessary in the future—John Leibovitz, the former Deputy Chief of the Wireless Bureau and Special Advisor to the Chairman for Spectrum Policy at the FCC, made a brief presentation.

One important element of the National Broadband Plan, Leibovitz noted, was to create a market for spectrum currently allocated for broadcast use so that market forces could drive the reallocation of that spectrum for wireless broadband purposes. This was a novel policy shift at the time, spurred by surging demand for spectrum from mobile phones, tablets and other wireless devices, and by the limited supply of new spectrum. 4G wireless networks were soon to be launched, and it seemed clear that new steps would be needed to assure adequate spectrum.

At the time, the supply of spectrum for mobile broadband was quite limited, said Leibovitz. There had been a major auction of spectrum in 2006 (Advanced Wireless Services 1, Auction 66) and another in 2008 (the 700 MHz band, Auction 73), both of which had taken years of planning and organization. “So while it was clear that mobile broadband was taking off, especially data demand,” he said, “it was not clear how, in terms of spectrum, the FCC should deal with this growth. That was the problem we were addressing.”

To define the needs and possible solutions more precisely, an inter-agency team was created that included the FCC, the National Telecommunications and Information Administration (NTIA) and the White House, among others. The team assessed the situation and set goals, including making 500 megahertz of spectrum available. Leibovitz said that the team took a practical, heterogeneous approach, drawing upon many people’s ideas. “We knew there was no magic bullet,” he said. “We did a bottom-up analysis of supply and demand, and tried to come up with achievable goals.”

Leibovitz said, “The idea for an ‘incentive auction’ came out of that process”—the idea that broadcasters could be incentivized to voluntarily relinquish spectrum usage rights in exchange for a share of the proceeds from an auction of new licenses to use the repurposed spectrum. The goal was to use market forces to help reallocate spectrum rights, making more wireless broadband available to telecom carriers and easing wireless congestion for consumers.
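To make the mechanism concrete, the following toy sketch illustrates the two-sided logic Leibovitz described: a reverse auction in which broadcasters name prices to relinquish spectrum, paired with a forward auction in which carriers bid for the repurposed licenses. All figures and the pairing rule here are hypothetical simplifications, not the FCC’s actual auction design:

```python
# Toy sketch of a two-sided "incentive auction" (illustrative only;
# the FCC's actual design was far more elaborate).
# Reverse side: broadcasters name prices to relinquish spectrum.
# Forward side: carriers bid for the repurposed licenses.
# A pairing "clears" when the forward bid covers the reverse payout.

broadcaster_asks = [30, 45, 60, 90]   # $M asked to give up a station (hypothetical)
carrier_bids = [120, 100, 80, 40]     # $M bid for a new license (hypothetical)

payout, revenue, cleared = 0, 0, 0
for ask, bid in zip(sorted(broadcaster_asks), sorted(carrier_bids, reverse=True)):
    if bid >= ask:                    # this pairing creates surplus
        payout += ask
        revenue += bid
        cleared += 1

print(f"{cleared} licenses repurposed; payouts ${payout}M, "
      f"revenue ${revenue}M, surplus ${revenue - payout}M")
```

The actual incentive auction used a descending-clock reverse auction run in repeated stages against clearing targets, but the core condition is the same one the sketch checks: forward-auction revenue must cover the payments to broadcasters.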

The key to the success of the incentive auctions, said Leibovitz, was careful consideration of which constituencies had to be part of the plan and a precise definition of the problem to be solved. The plan for the auctions also included a timeline for action; a clear assignment of roles for agencies such as the FCC, NTIA and international bodies; and a well-designed implementation process. “If we are going to revisit and recalibrate plans for reallocating spectrum in the future,” he said, “we need to follow these steps again.”

Lawrence Strickling, then-Assistant Secretary for Communications and Information at the NTIA, echoed this assessment, adding that the National Broadband Plan worked because it set a specific numerical goal for spectrum transfers. Summarizing a presidential memorandum issued a few months later, Strickling said that the President “directed the NTIA to work with the FCC to develop a roadmap and create a pipeline of spectrum opportunities.” The goal was originally to free up 500 megahertz of spectrum, said Strickling; the current pipeline seeks to free up another 130 megahertz. The FCC and NTIA facilitated the whole process by providing a descriptive inventory of the various bands of spectrum, covering both government uses and commercial opportunities. NTIA also committed itself to a transparent decision-making process and the release of reliable information to inform the entire process, he said.

The Distinctive Challenges for Spectrum Policy Today
Participants broadly agreed that the Broadband Plan introduced many important innovations that ought to inform future policymaking, yet it is also clear that circumstances today are different and arguably more complicated than they were in 2009.

John Leibovitz wondered whether spectrum rights today should be handled within a broadband policy framework at all, largely because emerging technologies—the Internet of Things, autonomous vehicles and drones—present such novel and complicated usage issues. On the other hand, he noted, “There was not much of a spectrum pipeline for policymaking and implementation in 2010; that apparatus is there now. There is an ongoing set of proceedings and precedents for asking and addressing questions. The challenge remains: How to come up with ‘win-win’ projects that make everyone better off in some material way?”

If the major problem in 2010 was how to shift spectrum from broadcasters to wireless broadband networks, the more significant problem today is how to reallocate spectrum assigned to federal agencies and to enable more efficient sharing of bands of spectrum. “The challenge now is how to use the policy tools that we have—of which there are many—and get federal agencies to see opportunities to convert some spectrum into money, which they could use to buy next-generation systems to expand their capabilities,” said Leibovitz. The Federal Aviation Administration and Department of Defense, for example, have considerable spectrum that could be shared or reassigned, as do several science services and agencies.

In addition to reallocating spectrum, the challenge today is also about how to adjudicate among quite varied services that are seeking spectrum, said Leibovitz. While mechanisms exist for dealing with spectrum allocation within a single industry, he said, we do not currently have systems for resolving competing demands for spectrum when there are diverse players. “You see this in 5G networks with satellite and terrestrial services, and domestic and international networks. Engineering solutions [that enable spectrum sharing] are one type of solution. But they can be a brittle approach for dealing with long-term problems” involving spectrum allocations.

Spectrum policymaking is more complicated today, said one participant, because in an environment in which the same spectrum bands are used for multiple purposes, “everything is more crowded.” This makes it more important to clarify people’s expectations and devise effective enforcement mechanisms. There must be clearly defined rights and reliable ways to resolve disputes. But it may be possible to bypass some of the familiar regulatory conflicts, such as the either/or dichotomy of licensed versus unlicensed spectrum, “by moving to a system of dynamic sharing of spectrum,” said Michael Calabrese, Director of the Wireless Future Project at New America. This may be the solution best suited to meet the needs of drone users, for example.
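As a rough illustration of what dynamic sharing means in practice, here is a minimal sketch in which a device queries a coordination database for a free channel at its location before transmitting, rather than relying on an exclusive license. The database contents and function names are hypothetical; real systems, such as the Spectrum Access Systems developed for the 3.5 GHz band, add tiered priority rights, sensing and certified administrators:

```python
# Minimal sketch of dynamic spectrum sharing (illustrative only).
# A device asks a hypothetical coordinator which channels are free
# at its location before transmitting.

AVAILABLE = {                       # hypothetical availability database
    "zone-A": [36, 40, 44],         # channel numbers free in each zone
    "zone-B": [44],
}

def request_channel(zone: str, requested: int) -> bool:
    """Grant the channel only if the database shows it unoccupied here."""
    return requested in AVAILABLE.get(zone, [])

print(request_channel("zone-A", 40))   # True  - free in this zone
print(request_channel("zone-B", 36))   # False - occupied in this zone
```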

Similarly, policies directed at receivers, not transmitters, may offer new solutions because they could enable stipulated forms of spectrum sharing without violating the exclusive rights of licensees. “It would be interesting to think about a specific band or opportunity” to experiment with such a scheme, said Blair Levin of the Brookings Institution. “Could we prototype a new system within the existing system? That might be hard to do, but it is worth exploring.”

Participants saw other ways in which today’s technological and market circumstances make spectrum policymaking more complicated than in 2010.

Michael Calabrese of New America noted that an original goal of the National Broadband Plan was universal coverage. Now there is a greater need to improve the capacity of spectrum bands. He also noted that policy previously treated fixed, landline telecommunications differently than mobile. Now fixed telecommunications is feeding mobile usage, and the two realms are converging. Finally, said Calabrese, the feasibility and appeal of dynamic spectrum sharing have soared, opening the door to new types of technology-driven solutions for expanding spectrum usage.

A critical impetus for this change was a 2012 report by the President’s Council of Advisors on Science and Technology (PCAST), which proposed a shift from single-use allocations of spectrum to spectrum-sharing regimes. Among PCAST’s proposals were the creation of 1,000 megahertz-wide “spectrum superhighways” and the regulation of receivers to prevent spectrum interference. Once PCAST broached these possibilities, it opened the door to a larger conversation. It became clear that many spectrum bands are grossly under-utilized—an insight that made it fair game to propose that all spectrum bands be used more efficiently. “How to get more spectrum is not the problem so much as how to use what we have,” said Jonathan Chaplin, Managing Partner of New Street Research, noting that resolving outstanding policy issues could help make spectrum more available.

The many recent changes in the tech landscape could make it more complicated to devise effective, suitable regulatory processes to improve spectrum usage. For example, noted Blair Levin, “One of the most important things for spectrum usage, oddly enough, is not spectrum, but what’s in the ground—fiber cables—because fiber is needed for backhaul connections to telecom networks. Policies for unlicensed spectrum could also help open up more access to spectrum.”

However, any policies are likely to run into different technical barriers and the varying developmental speeds of different technologies. Richard Bennett, Founder and Editor of High Tech Forum, emphasized that TCP/IP—the technical protocols of the Internet—“are not a good set of protocols for spectrum sharing. Yet we won’t have the full benefits of 5G wireless networks until we get the Internet to work with them. These issues are connected. For example, we need better Wi-Fi access points to improve Wi-Fi performance so that users won’t drop packets and have delays in TCP/IP transmissions. We need to figure out how to stream more data, faster, with these protocols over whatever spectrum is being used.”
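Bennett’s point about packet loss can be quantified with the well-known Mathis et al. approximation for the throughput of a single TCP flow, which was not cited at the conference but illustrates why drops on a Wi-Fi link throttle TCP so sharply. The parameter values below are hypothetical:

```python
# Illustrative only: the Mathis et al. (1997) approximation for
# single-flow TCP throughput, bounded by (MSS / RTT) * (C / sqrt(p)),
# where p is the packet loss rate and C ~= 1.22. The segment size,
# round-trip time and loss rates below are hypothetical.

from math import sqrt

def tcp_throughput_mbps(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Approximate upper bound on TCP throughput in megabits per second."""
    return (mss_bytes * 8 / 1e6) / rtt_s * (1.22 / sqrt(loss_rate))

# A typical 1460-byte segment and a 20 ms round trip, with rising loss:
for loss in (0.0001, 0.001, 0.01):
    print(f"loss {loss:.2%}: ~{tcp_throughput_mbps(1460, 0.020, loss):.0f} Mbps")
```

With these assumed numbers, raising the loss rate a hundredfold (from 0.01 percent to 1 percent) cuts achievable throughput by a factor of ten, which is why better access points, and not just more spectrum, improve perceived performance.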

In a brief presentation, Lawrence Strickling discussed the “unfinished business” of the National Broadband Plan, drawing upon lessons from the past seven years and new technological realities.

Crafting a coherent regulatory approach will be difficult going forward, said Strickling, because different emerging technologies are developing at rapid but differential speeds: “Smart grid technologies are moving slowly right now, but others, like autonomous vehicles, are developing faster. Whatever applications our policies focus on today won’t be the right set of apps or uses in three, four or five years,” he said. “So we need to make policy structures flexible enough to enable unanticipated uses in the future. The National Broadband Plan was originally meant to address the spectrum crunch, but now it’s being used to manage 5G infrastructure. That’s the challenge—devising an evolutionary policy framework.”

Strickling also believed that the Plan does not address how to manage spectrum sharing and orchestrate new usage regimes for federal agencies that depend upon spectrum. One useful step in this direction would be for the FCC and NTIA to conduct a holistic “interference analysis” to determine what boundaries are needed to prevent unacceptable interference in specific bands of spectrum. “There are lots of parochial concerns being aired and debated, but we can’t set policy that way,” Strickling said. Related to this issue is the need for an effective, credible enforcement regime.

Other issues that were not present in 2010 have come to the fore: the emergence of 5G wireless as a regulatory issue; the changing ways that spectrum for public safety purposes can be used, perhaps in conjunction with commercial uses; the need for appropriate cybersecurity measures, especially as the Internet of Things expands; and real-time data analysis to intelligently define and address the problems to be solved.

Cybersecurity is a particularly vexing problem in a wireless context because wireless makes the scale of the problem radically larger. “Wireless is different [than wired, dedicated applications] because it is intrinsically open,” said Dennis Roberson, Research Professor of Computer Science at Illinois Institute of Technology. “Anybody can connect in, and it is easier to deploy, so the system is much more vulnerable. The ‘attack plane’—that is, the number of places where access to the network is possible—is much larger with wireless.” It is quite true that cyber-attacks are global in nature and not confined to wireless systems, said Roberson, but wireless networks enhance a criminal’s or a state actor’s ability to penetrate the network to intercept or disrupt communications, or to hack computers connected to the network. That said, the group agreed that this conference devoted to spectrum policy could not adequately deal with this vast issue.
