
Strengthening Democracy Through Technology

The Commission Finds:

  • The rise of the internet, and especially its social media platforms, has connected diverse populations worldwide, expanded opportunities for free expression and enabled new forms of civic engagement. At the same time, the impact of social media platforms and major technology companies on our news and information ecosystems demands a re-examination of the roles of technology providers in a democracy.
  • While not the only cause, the rise of the internet has deeply disrupted the way Americans find, consume and talk about news.
  • The spread of misinformation and disinformation on the internet, foreign interference in U.S. elections and the abuse of social media platforms and their powerful targeted advertising tools by bad actors have fostered uncertainty about the reliability of online information. To rebuild user trust, major technology companies need to more actively and transparently combat problems such as disinformation, hate speech and other divisive content.
  • Users’ fear of losing control over personal information, and particularly of having it disseminated to unknown third parties, can lead to reduced trust in the disseminating entity. When that entity, whether an online social platform, digital portal or media outlet, has been a relied-upon source of information, that loss extends to reduced trust in media generally.
  • In seeking solutions to the unique challenges of the internet ecosystem and the loss of trust in America’s institutions, the following values are crucial: a platform-agnostic (even a technology-agnostic) approach; recognition of the continuing evolution of technology; retention of the best principles of openness, inclusion and free expression; and ensuring that the design and management of social media platforms align with democratic values.
  • In several areas of concern, tensions between potentially conflicting values (e.g., between the interests of individual users and the interests of the community, between freedom and responsibility) need to be thoughtfully addressed.

The Commission Recommends:

Recommendation 5 RESPONSIBILITY
  • Technology companies and online services that collect user data should become information fiduciaries, with duties to the user.
Recommendation 6 TRANSPARENCY
  • Technology companies and online services should embrace transparency by providing more information about the impact of their advertising tools, the source and sponsorship of content online and the role that algorithms play in the flow of news and information.
    1. Support the development of tools to trace the origin of news stories and other online information.
    2. Disclose funding sources for online ads.
    3. Provide end users with information about how algorithms work and access to customized algorithms and news feeds.
Recommendation 7 INNOVATION
  • Invest in new structures and technology-based solutions to address emerging problems.
    1. Develop techniques to discourage sharing of disinformation and/or anti-social content.
    2. Use technology and collaboration to help defeat disinformation.
    3. Provide for data portability among social networks.
    4. Create a multi-stakeholder forum to develop and promote pro-social policies for tech providers.

The Crisis of Trust and Technology

Since its first commercial use nearly three decades ago, the internet has become a fundamental part of American society. Banking, education, work, travel, entertainment and personal relationships have all been touched or even transformed by the internet and internet-enabled technologies.

The internet has brought many benefits. It vastly expanded people’s access to information, spurred the development of an array of innovative new services, connected diverse populations worldwide, empowered citizens to report on and debate events in close to real time, and enabled new forms of civic engagement.

However, the problems featured in headlines of 2017 and 2018—disinformation and hate speech, harassment and trolling, data breaches, foreign propaganda and Russian manipulation—have raised serious concerns about the larger implications of the online ecosystem for our democracy. A key focus of concern is the role of internet platforms (as explained in Chapter 3), particularly social media platforms, but the Commission’s concerns extend to the entire media ecosystem.

In particular, we focus on major technology companies and social media platforms that are used for the discovery, dissemination and amplification of news and civic information. Whether primarily intended as a news source or inadvertently turned into one, they have become important conduits between producers of news and online users.

There is no question that the internet and platforms that operate on it have deeply disrupted existing media.

For most of American history, news media—first in print, later in radio and television—had a direct, one-way connection to their audiences. Today, however, a large and growing portion of the population, especially youth, gets its news online. In 2017, 43 percent of Americans reported that “they often got news online,” just below the percentage who often get news from television (50 percent) and far surpassing those who often get news from radio (25 percent) or from print newspapers (18 percent).

In contrast to its print or broadcast predecessors, online news is available instantaneously to all users in greater volume, from more sources. It presents new opportunities to engage with the content through sharing and commenting. The internet has, in effect, put a printing press—and more—in the hands of every user, thereby vastly expanding free speech. And on social media platforms, news is often part of a “feed” that mingles traditional reporting with commentary from users. This complicates the question of “what is news” by blurring the line between producers and consumers of news. Users of media are now “prosumers,” as the futurist Alvin Toffler predicted 40 years ago.i

The Commission’s primary concern in this chapter is with how online platforms and services may be eroding trust in media and democracy. Of particular concern is the spread of disinformation and the loss of trust emanating from misuse of information. Additional concerns include confusion over distinctions among different sources of news or between fact-based reporting and the expression of opinions about news events. And we are concerned about the role of filter bubbles and echo chambers that can exacerbate political polarization.

Social Media and Democracy

While recommending steps to address these problems, the Commission also understands that social media platforms and other internet-based capabilities, whatever their faults, can continue to make an important contribution to society. Ethan Zuckerman of MIT, an adviser to the Commission, describes seven things social media can do to strengthen democracy.

Social media can

  • Inform us
  • Amplify important voices and issues
  • Be a tool for connection and solidarity
  • Be a space for mobilization
  • Be a space for deliberation and debate
  • Be a tool for showing us a diversity of views and perspectives
  • Be a model for democratically governed spaces

Realizing these goals will require a reimagining of how online intermediaries may better align with core democratic values. The Commission encourages all participants in the media ecosystem, and particularly providers and distributors of news, to identify which pro-democratic values they are pursuingii and to develop metrics that allow them (and the public) to track their success in living up to those values.

Freedom vs. responsibility. Pro-democratic values necessarily include free and open expression, a basic tenet of the First Amendment. Every generation faces the problem of applying the underlying principles of the First Amendment to new technologies. And while these freedoms protect speakers, they do not absolve them from moral, if not legal, responsibilities. Thus, the Hutchins Commission in 1947 urged press leaders to act responsibly before governments felt the need to regulate them. This theme of freedom versus responsibility—doing the right thing—plays an important role in our consideration of the ways to increase trust in the entire media ecosystem.

The Commission recognizes that the ways in which online media operate, and the nature of their impact on society, are directly related to the incentives that drive their behavior. Social media platforms, for example, have financial incentives that tie advertising revenues to the amount of time that users spend with a site’s content, i.e., time that eyeballs are potentially attuned to advertisers. This leads to designing online sites in ways that encourage users to share content with others, including provocative misinformation. As Facebook CEO Mark Zuckerberg has put it:

One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services.

As noted earlier, in the current media environment, content that is provocative and divisive—even inaccurate—often spreads fastest and farthest. Given this backdrop, the Commission seeks strategies that would instead reward the dissemination of accurate, pro-social content and thwart propaganda.

Section 230 of the Communications Decency Act. In the United States, Section 230 of the Communications Decency Act (CDA) has played an important role in allowing internet-based services that host the posting and sharing of user-generated content the freedom to grow. This legal provision, enacted in 1996, allows “interactive computer services” to decide which content provided by others to include in or exclude from their services without incurring liability for exercising editorial discretion.iii

Given the enormous growth of the internet over the past 20 years, and the emergence of large entities within it, this is no longer an infant industry, nor does it need legislative impetus for growth. And there is a much greater awareness today of the potential to use the internet to spread disinformation.

Congress has passed legislation that removes full Section 230 protections for content that supports sex trafficking, and has never offered protection against claims for the unauthorized use of copyrighted material. In addition, there have been proposals in Europe for new limits on online content. For example, the European Commission is considering a requirement that sites take down terrorist content within one hour of being notified by authorities, while France has passed new restrictions on hoaxes and fake news online.

There will undoubtedly be further debate in the U.S. about the proper application of Section 230. U.S. Senator Mark Warner, a Virginia Democrat, has issued a white paper identifying many potential ways to regulate online services, including amending Section 230 to remove its protections with respect to illegal or tortious expression.iv The Commission has heard arguments for imposing more liability on online services to prevent defamatory utterances, disinformation or otherwise-actionable material that threatens individuals or the democracy itself. And it understands that there can be difficulties in enforcing the few rules that users do have at their disposal.

But the Commission also recognizes that without this protection, social media platforms and other internet-based services would likely have incentives to block lawful speech too aggressively, because hosting it might incur liability. And they have already faced allegations of political bias in performing their editorial roles.

Internet platforms and social media sites are at once the curators, moderators and transmitters of information. Because of this complex role, solutions to future threats to American democracy will not come easily.

During the period of the Commission’s deliberation, we have seen the major online services acting more forcefully against harmful speech via enforcement of their terms and conditions. In several instances, however, these actions have led to protests that they were insufficient, biased, overly broad or unjustified.

The Commission does not take a position on amending Section 230, as more time and reflection are needed. Given the tension of values that it involves, any consideration of changing this provision should be done deliberately, focusing directly, specifically and narrowly on the speech involved and the potential consequences either way.

The values of free expression, of an open internet free to evolve, of responsibility to users and to the democracy, and of inclusion, must remain guiding principles as governments and private companies adapt to and address the internet of today.

Our Challenge

As the Commission developed its recommendations, some of the most challenging questions it considered include:

  • What function or purpose do, or should, social media platforms have in our lives?
  • Who decides who, or what information, is trustworthy?
  • Who is to blame for the lack of trust?
  • Do social media or other institutions have a responsibility to work against political or social polarization? How?
  • How can platforms inspire users/citizens/consumers to place trust in the news and information they receive, and encourage them to engage in meaningful civic discourse?

Values to consider. To address the unique challenges posed by the internet ecosystem and the erosion of trust in our nation’s institutions (including the Fourth Estate), the Commission adopted several values that informed its considerations:

A platform-agnostic (even a technology-agnostic) approach. While certain platforms may have been in the spotlight during 2017 and 2018—namely Facebook, Twitter, Google and YouTube—recommendations for the future of the internet must address the whole scope of services and sites that users engage with online. Recommendations must take into consideration their implications for services of different sizes, with different resources, and serving different communities.

A recognition of the continuing evolution of technology. The internet of today is not the internet of the past, nor will it be the internet of 5, 10 or 20 years from now. Just as it has changed dramatically since its inception, the internet will continue to evolve in unforeseen ways over the coming years.

A commitment to retain the principles of openness, inclusion and free expression. Given the unpredictable future of the internet, users need a set of core values that can guide efforts to shape constructive online experiences. For example, those attempting to combat mis-, dis- and malinformationv cannot lose sight of the values that make a free and open internet possible, namely a commitment to freedom of expression, in the U.S. and abroad.

Responsibility to the broader society. Continuing the theme of responsibility throughout this report, each stakeholder should realize its responsibility to the broader society and to individuals.



i Alvin Toffler, The Third Wave (New York: Morrow, 1980).
ii In response to criticism related to the negative impact of social media, the leading social media platforms have begun to articulate their mission in larger terms. For example, during testimony to the Senate Intelligence Committee, Twitter CEO Jack Dorsey stated that the “purpose of Twitter is to serve the public conversation” and that it is committed to improving the service by encouraging “more healthy debate, conversations, and critical thinking on the platform,” as well as by seeking to eliminate “abuse, automation, and manipulation.” Foreign Influence Operations’ Use of Social Media Platforms, Hearing before the Senate Select Committee on Intelligence, 115th Cong. (2018) (statement of Jack Dorsey, Chief Executive Officer, Twitter), https://www.intelligence.senate.gov/sites/default/files/documents/os-jdorsey-090518.pdf.
iii Specifically, Section 230 states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Protection for Private Blocking and Screening of Offensive Material, 47 U.S. Code § 230 (2011), https://www.gpo.gov/fdsys/granule/USCODE-2011-title47/USCODE-2011-title47-chap5-subchapII-partI-sec230. For a description of the legislation to stop sex trafficking on the Internet, and an argument against it, see Aja Romano, “A New Law Intended to Curb Sex Trafficking Threatens the Future of the Internet as We Know It,” Vox, April 18, 2018, https://www.vox.com/culture/2018/4/13/17172762/fosta-sesta-backpage-230-internet-freedom.
iv Mark R. Warner, “Potential Policy Proposals for Regulation of Social Media and Technology Firms” (DRAFT, 2018).
v As defined by Wardle and Derakhshan, Information Disorder, 5:
  • Misinformation, when false information is shared, but no harm is meant.
  • Disinformation, when false information is knowingly shared to cause harm.
  • Malinformation, when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere.