
CHAPTER IV - Viruses of Ideology

In his 1976 book, The Selfish Gene, Richard Dawkins coined the term “meme,” which he defined as “a self-reproducing and propagating information structure analogous to a gene in biology.” Just as a gene carries instructions that determine a physical characteristic passed down through succeeding generations, a meme encodes an idea or other information that can be transmitted from one person to other people.

Of course, memes are not new. In fact, they have been around as long as humans have had the ability to communicate with one another, which made it possible to capture and transmit knowledge across generations. Instructions for how to make a flint cutting tool or how to grow wheat provided sparks that helped energize the evolution of human civilization, while memes like the Garden of Eden, Pax Romana, the crucifix, and the divine right of kings all played profound roles in shaping our civilization.

What is new is the emergence of modern electronic media that make it possible to propagate memes far faster and wider than had previously been possible. As Nova Spivack, Founder and Chief Executive Officer at Magical, noted, with “the growth of the Web, social media, texting, and the adoption of smart phones, the ease with which anyone can create and spread memes, and the potential audiences they can reach, have radically increased.” As awareness of the potential of memes increased, the goal of meme makers became to have them “go viral”—to replicate rapidly by being shared multiple times. Many memes took the form of a simple image along with a bit of text that was attention-getting and easy to share. The use of hashtags on Twitter also became an effective way to encourage the wide sharing of an idea.

The war on trust. This was all largely harmless fun until memes were co-opted as weapons in what Nova Spivack described as a “war on trust”—the deliberate use of “military-grade information warfare and psyops” by both governments and non-state actors, aimed at civilian populations “to overwhelm and ultimately degrade, societal faith in institutions, democracy, the free press, science, leaders and in each other.”

The openness, pervasiveness, and relative anonymity offered by the Internet have made it a perfect medium for carrying on this type of warfare, and the self-replicating power of memes has made them an attractive weapon for spreading doubt and disinformation. Ironically, open societies, with their traditions of free speech and democratic dialog, provide the ideal “petri dish” for memes to spread with little or no government interference. The ultimate goal of these cyber-attacks, according to Spivack, is to “disarm the societal immune system…by degrading faith in institutions, democracy, the free press, science, leaders, the rule of law and in each other.”

The purveyors of disinformation have an inherent advantage over the defenders of fact: while the accuracy of a true story typically gets established within two hours, it can take up to 14 hours before a false rumor gets thoroughly fact-checked and discredited, leaving considerable time for it to circulate and have a wide impact. And the generation of fake news now seems to be a predictable response to every big event. In the immediate wake of the mass shooting in Las Vegas in October 2017, a spate of “hoaxes, completely unverified rumors, failed witch hunts, and blatant falsehoods spread across the Internet,” some of which got amplified by the algorithms used by social media to promote stories “trending” with users.

Although cyberattacks are not new, the reality of the cyber-based war on trust attained new prominence in 2017 as Russia and possibly others attempted to interfere in elections in the U.S. and other democracies using “fake news,” e-mail hacks, and attacks on voting systems.

In the wake of ongoing investigations into the nature of this interference, the deliberate use of social media platforms such as Facebook and Twitter to spread disinformation has come to light. And even as ISIS has suffered from a series of setbacks on the battlefield, it has continued to use social media to maintain its “global influence.”

The science of memes. Following Richard Dawkins’ identification of memes in 1976 as a cultural phenomenon, research on the nature and function of memes was relatively sparse, limited primarily to a handful of academics. But as memes became “weaponized,” more attention has been given to their potential and how they can be effectively countered. After 2001, as part of a broader effort to respond to a “war of ideas” being waged by terrorists, the U.S. military began paying attention to the importance of memes. In a 2006 memo, then-Secretary of Defense Donald Rumsfeld noted that “[al-Qaeda leader Ayman] Zawahiri has said that 50 percent of the current struggle is taking place in the arena of public information.”

Also in 2006, Michael Prosser published a master’s thesis at the Marine Corps University School of Advanced Warfighting titled “Memetics: A Growth Industry in U.S. Military Operations” that proposed the creation of a Meme Warfare Center as part of the military. That same year, the Defense Advanced Research Projects Agency (DARPA) commissioned a multi-year study of “military memetics” conducted by Dr. Robert Finkelstein, head of Robotic Technology, Inc., who proposed to develop a general theory of memetics that would yield “testable predictions and falsifiable hypotheses” about how memes work. Finkelstein defined memes as “information which propagates, has impact and persists” and explained that a meme can be as small as a single phrase or image or as large and complex as an unabridged dictionary or the instructions for building a nuclear submarine. He also identified other metrics to describe how memes spread and persist and the impact they can have.

Nova Spivack has also been thinking about the need to develop a more scientific approach to the study of how ideas propagate. In a 2004 paper, “A Physics of Ideas: Measuring the Physical Properties of Memes,” he wrote:

Ideas are perhaps the single most powerful hidden forces shaping our lives and our world. Human events are really just the results of the complex interactions of myriad ideas across time, space and human minds. To the extent that we can measure ideas as they form and interact, we can gain a deeper understanding of the underlying dynamics of our organizations, markets, communities, nations, and even of ourselves. But the problem is, we are still remarkably primitive when it comes to measuring ideas. We simply don’t have the tools yet and so this layer of our world still remains hidden from us.

Infodemiology. To help us better understand and respond more effectively to hostile memes, Spivack called for the development of a new field of “infodemiology,” which has been defined as “the science of distribution and determinants of information in an electronic medium, specifically the Internet, or in a population, with the ultimate aim to inform public health and public policy.” Drawing on concepts and practices from epidemiology to inform strategies for memetic warfare, this new field is adapting public health techniques developed to combat infectious diseases. For example, outbreaks of disease can be countered by “ring immunization,” a process that focuses on immunizing all contacts around an infected individual (the technique used in combatting Ebola), while “broad immunization” attempts to eradicate a disease in advance by immunizing an entire population (the technique that was used to eliminate polio).
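The ring-immunization idea above can be made concrete with a toy sketch (the contact graph and function names here are illustrative assumptions, not anything from the report): given a contact network and a set of infected individuals, ring immunization targets exactly the direct contacts of each infected person, rather than the whole population.

```python
# Illustrative sketch of "ring immunization" on a simple contact network:
# immunize every direct contact of each infected individual, so the
# infection cannot spread past its immediate ring.

def ring_immunize(contacts, infected):
    """Given a contact graph (person -> set of contacts) and a set of
    infected people, return the set of contacts to immunize."""
    ring = set()
    for person in infected:
        ring |= contacts.get(person, set())
    return ring - infected  # already-infected people are treated, not immunized

# A tiny hypothetical network: "a" is infected; "b" and "c" form the ring.
contacts = {
    "a": {"b", "c"},
    "b": {"a", "d"},
    "c": {"a"},
    "d": {"b"},
}
print(sorted(ring_immunize(contacts, {"a"})))  # ['b', 'c']
```

Broad immunization, by contrast, would simply vaccinate every node in the graph up front; the ring strategy trades completeness for a far smaller intervention footprint.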

To provide immunity to memes, one technique could be to develop the information equivalent of T-cells, the white blood cells that work together to recognize, hunt, and kill dangerous foreign bodies. Following this analogy, a society under attack from “offensive memes” would design “defensive memes” that function as “social T-Cells,” helping a population develop resistance to harmful memes.

Combatting ISIS. As noted earlier, ISIS has invested heavily in using social media to conduct information warfare that reaches well beyond the physical limits of its territory. From the perspective of infodemiology, ISIS is a “meme-plex” that can be seen as a public health problem: it behaves “like a disease…a mental health disorder…that is spreading virally through at-risk populations.” Specifically, we need to think of ISIS’ ideology as an infectious virus and fight it as we would fight a virus.

One key characteristic of any infectious disease is its degree of contagiousness, also known as its “viral coefficient” (R0): the number of people that one sick person will infect, on average. Viral diseases like hepatitis and Ebola have a fairly low R0 of 2, while measles, with an R0 of 18, is much more contagious. Fortunately, it appears that the R0 for ISIS is fairly low: it is not highly transmissible and the fatality rate, to date, has also been relatively low. Unlike Ebola, which spread rapidly and killed many of those who contracted it, ISIS has not been able to achieve sustained exponential growth.
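The difference between an R0 of 2 and an R0 of 18 is easy to underestimate, because spread compounds generation by generation. A minimal sketch (illustrative assumptions: every case infects exactly R0 new people and no one is yet immune) makes the gap vivid:

```python
# Illustrative sketch: how the viral coefficient R0 drives the size of
# each new "generation" of infections, assuming every case infects
# exactly R0 new people and the population is fully susceptible.

def generation_sizes(r0, generations):
    """Number of new infections in each generation, from one index case."""
    sizes = [1]
    for _ in range(generations):
        sizes.append(sizes[-1] * r0)
    return sizes

print(generation_sizes(2, 5))   # hepatitis/Ebola-like: [1, 2, 4, 8, 16, 32]
print(generation_sizes(18, 5))  # measles-like: over 1.8 million by generation 5
```

After five generations an R0 of 2 yields 32 new cases, while an R0 of 18 yields nearly two million, which is why even a modest reduction in a meme’s transmissibility matters so much.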

In fact, it is possible that the disease ISIS most closely resembles in terms of its contagiousness is leprosy, which is a serious disease but is actually quite hard to transmit: infection requires repeated intimate contact and spreads mainly through close family bonds and among people who live in close proximity in long-term relationships. Similarly, ISIS infections are transmitted through intimate relationships between a guru and his acolytes.

To find effective means to prevent the spread of an ideology like ISIS, we can look to past successful campaigns to combat infectious diseases. In particular, it makes sense to focus efforts on “susceptibles,” people who are likely to be vulnerable to the appeal of ISIS but have not yet been infected, with a goal of building up their immunity to ISIS before they are exposed to it. It is harder to deal with those who have already been infected; the primary goal with this group should be to prevent them from transmitting the infection to others. An effective infodemiology campaign needs to make use of a spectrum of strategies drawn from epidemiology, ranging from monitoring and early detection through inoculation and treatment.
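The logic of focusing on susceptibles follows directly from standard epidemiological arithmetic (a sketch under my own assumptions, not a model from the report): immunizing a fraction of the susceptible population lowers the effective reproduction number, and once that number drops below 1 each case infects fewer than one new person, so the outbreak dies out.

```python
# Toy sketch of why inoculating "susceptibles" works: with a fraction f
# of the population immunized, the effective reproduction number is
# R_eff = R0 * (1 - f). If R_eff < 1, the outbreak fades on its own.

def effective_r(r0, immunized_fraction):
    """Effective reproduction number after immunizing part of the population."""
    return r0 * (1 - immunized_fraction)

def herd_immunity_threshold(r0):
    """Fraction that must be immunized so R_eff falls below 1."""
    return 1 - 1 / r0

print(effective_r(2, 0.6))                    # 0.8 -> outbreak fades
print(round(herd_immunity_threshold(18), 3))  # measles needs ~94% coverage
```

The same arithmetic explains the report’s triage: for a low-R0 “infection” like ISIS, modest inoculation of susceptibles can push the effective spread rate below the self-sustaining threshold, while isolating the already-infected keeps them from transmitting further.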


Memetic Warfare. The basic definition of memetic warfare is the use of memes to combat other memes. But one practical challenge for nations that wish to engage in this kind of struggle is that memes seem to be a more effective weapon for insurgencies than for governments: as one participant put it, memes appear to function like the IEDs of information warfare and are subversive by nature. They do a good job of blowing things up, increasing disorder within a system, but they do not seem to be good tools for building stability.

The challenge of this asymmetry was clearly demonstrated by the 2013 attempt by the U.S. Department of State to combat ISIS propaganda with its own Twitter campaign titled “Think Again Turn Away.” According to one critic, the initiative, which attempted to explicitly use the credibility of the agency, was “not only ineffective, but provided jihadists with a stage to voice their arguments.” A similar kind of blowback happened in 2014 when the New York City Police Department attempted to launch a meme in the form of a Twitter hashtag, #mynypd, that was intended to encourage residents to share positive stories about the police. However, the hashtag was quickly co-opted by critics who used it to share images of alleged police brutality perpetrated by the department.

A more effective strategy for inoculating vulnerable populations is not to mimic insurgents’ strategies but to reach those populations authentically through existing “networks of trust.” Thus, the most powerful memes come from the grass roots and need to evolve freely. The most effective (though potentially risky) initiatives give a voice to individuals who have a message that is consistent with the one that a government or an institution wishes to communicate. Marc Nathanson cited the example of a campaign to increase the college application rate among inner-city high school students with low participation rates in higher education. Rather than just offering information or media messages, the campaign identified students who were leaders in their schools and trained them to encourage other kids in the schools to apply for college and financial aid. The result was an increase in the rate of college attendance from under 10 percent of graduates to more than 30 percent.

Felipe Estefan from the Omidyar Network noted that participation in an online network is not automatically “empowering.” Being online can make people more vulnerable to disinformation and can isolate them from their immediate communities, limiting their ability to participate in civic actions. In reality, most online networks are being driven by commercial considerations and are not necessarily serving the public interest.

How, then, can individuals be truly empowered? People generally hate being told that they are wrong, which is what often happens when people are encouraged to fact-check information they encounter online. A better strategy is to give people the tools to do their own fact-checking. In Argentina, the Omidyar Network funded Chequeado, an organization that created an app to make fact-checking easier to do, and a trivia game called Chequeate that tests an individual’s knowledge.

According to Monika Bickert, Facebook established several initiatives designed to combat disinformation by “stopping bad stuff, promoting responsible stuff, and helping people to tell the difference between the two.” For instance, Facebook looks at “back end signals” of postings to find ways of detecting fake accounts that are spreading disinformation. She noted that during the French election the company took down some 30,000 accounts. (The company also acknowledged that it had posted several thousand misleading political ads aimed at U.S. citizens, some of which were paid for in rubles, that were not identified until well after the election.)

In one of its efforts to promote responsible content, Facebook sponsors the Facebook Journalism Project, which is intended to help mainstream media understand and operate more effectively in a digital world. The project includes collaborative development of innovative news products, including new storytelling formats, and training and tools for journalists and for news consumers. In partnership with First Draft, a nonprofit coalition of news organizations whose mission is “to raise awareness and address challenges relating to trust and truth in the digital age,” Facebook is working on a project to educate bloggers on journalistic standards.

Facebook has also undertaken several initiatives to help individuals become more discerning consumers of information. Along with craigslist founder Craig Newmark and other funders, it sponsors the News Integrity Project at the City University of New York. It has also experimented with allowing users to flag stories they believe are dubious or untrustworthy. If an article attracts a certain number of flags, it is sent to a third party for fact checking and, if found to be questionable, can be tagged by Facebook as “disputed news,” a tag users can see so they can have more context for assessing the story.

Facebook and other major platform providers still have considerable work to do to restore confidence in their credibility and must do so in the face of threats that continue to evolve. Perhaps the most important step these enterprises can take to earn the trust of users is to be fully transparent about what they are doing.

Building the Brand. While the Internet was originally seen as a powerful tool to promote openness and expand political engagement, this sanguine view is now challenged by a rising tide of cyberattacks, fake news and anti-social messages. The overall impact, according to Esther Dyson, is that our collective capacity to trust is being eroded, which makes every attempt to build a consensus less effective.

The Internet has emerged as a critical battlefield and memes have been mobilized as powerful weapons that can sow distrust. But they do not operate in a vacuum. As several ADDTech participants noted, actions in the real world still matter. According to a 2017 Pew survey of America’s global image (based on a compilation of ratings in 37 different countries), the percentage of individuals with a favorable view of the United States has fallen from 64 percent to 49 percent, while those with an unfavorable view increased from 26 percent to 39 percent.

What will it take to reverse this trend? Aspen Institute Communications and Society Program Executive Director Charlie Firestone noted that the concept of the American Dream is a kind of “super-meme” based on the promises of economic opportunity and political freedom. The appeal of America, embodied through cultural icons such as Levis, Elvis, MTV and Disneyland, played an important role in winning the Cold War. Felipe Estefan recalled that when he was growing up in Colombia, he listened to American pop music, watched American movies, and was impressed by American success stories. He developed a deep love for the U.S. and wanted to come here. Eventually, he did come and earned a degree in Public Diplomacy from Syracuse University. His story is a good example of the power of pop culture in communicating the appeal of America.

But as Jerry Green, President and CEO of the Pacific Council on International Policy, pointed out, the global decline in the image of the U.S. is ultimately based on its policies and actions. Brand America, which used to stand for great things, now seems to be based on what it is against. Karen Kornbluh wrapped up the discussion by arguing that “we can’t depend on the social networks” to support democracy. We need to work collectively to restore the appeal of our own brand.
