
CHAPTER IV - AI and Journalism

Just as artificial intelligence systems are rapidly changing automobiles and healthcare, so they are transforming journalism and media, especially in online contexts. Alberto Ibargüen, President and Chief Executive Officer of the Knight Foundation, declared, “We’re at the very beginning of this process, which is fundamentally the same as what Europe experienced just after the arrival of the mechanized press. Before Gutenberg, monks would illuminate a few manuscripts a year and Church officials would give its imprimatur so all would know what was ‘truth.’ After Gutenberg and for 100 years, no one could figure out what was true or whom to trust because anyone could publish a book or pamphlet. I find that analogy accurate, comforting and hopeful.”

It helps to understand the basic forces that are roiling the news business and media today. As Charlie Firestone of the Aspen Institute explained, “The news business, like many others, has been disintermediated and decimated by the digital revolution.” Technologies have provided countless new choices for people who previously had limited access to information. But there is also a huge fragmentation of information sources now competing with conventional journalism for people’s attention. The non-journalistic competitors include Facebook, LinkedIn, Snapchat and Twitter as well as countless specialized websites and blogs that arguably have closer, more credible connections with a field than do general-audience newspapers and broadcasts.

“Just as the unit of commerce in music went from the album or CD to the individual song,” said Firestone, “so the unit of commerce in the news has gone from the publication or broadcast outlet, to the story.” That has been very useful for search engines such as Google, which have been disaggregating the functions of journalism, he said. Today, different digital entities are taking these different parts of newspapers and recombining them. Curated websites are ascendant, but much of the information on them is based on “free” content generated by someone else.

These dynamics are undermining the formerly dominant business models of media organizations, privileging commercial viability at more granular levels (individual stories, user clicks) and eviscerating journalism as a profession. Web-based sources of news and information are eclipsing traditional journalism even as they feed on it. One conference participant suggested that Facebook arguably has more impact on public conversation these days than the New York Times in the sense that its AI-driven newsfeeds reach hundreds of millions of people and are widely shared. Local news is particularly vulnerable, noted many participants, because it is difficult to monetize super-local content, even with the help of AI.

Contemporary journalism faces a financial crunch: News organizations are not getting paid by the ad-based, entertainment-oriented content curators who depend upon journalists’ original news stories, which require costly reporting, fact-checking and investigations. “Suddenly, it is very hard to get people to pay for their broccoli,” said Michael Ferro, referring to the substantive, well-reported journalism that any democratic society needs. “People don’t even want to pay for their corn! Let’s just say that people are getting their corn for free.”

Ferro believes that it is a democratic imperative to figure out how serious journalism can be economically viable in today’s digital environment. It may even be necessary for government or ad-based models of free content to subsidize journalism, he said. This, after all, is an approach that some European nations have taken as a means to support newspapers and book publishing, and thus foster a better-informed public. It is also the rationale for the U.S. Government’s long-standing postal subsidies for newspapers and printed matter. However, some participants strongly disagreed with subsidies for newspapers, which they felt would result only in bad newspapers.

In any case, Ferro, as Chairman of tronc, the company that owns the Los Angeles Times and Chicago Tribune, is determined to find new business models to support serious journalism as a vital institution in American democracy. The news media are indispensable sources of civic information for citizens and a force for oversight and accountability of those in power. For now, said Ferro, journalism is surviving chiefly through revenue from its print operations. While online news startups are getting a lot of attention, he said, “No one can make money once they have to scale and build an infrastructure.”

Artificial Intelligence Enters the House of Journalism
Artificial intelligence appears to have both positive and negative impacts on journalism. It offers tools that enable journalists to perform their jobs more efficiently and generate new insights from data search-and-analysis. But AI also is a powerful tool for content personalization, which tends to disaggregate existing news products (newspapers, broadcasts) and undercut traditional business models for journalism. AI is also a tool for mischief and misinformation spread by automated bots.

To be sure, the personalization of news is in many respects a gain for readers. “AI seeks to learn what its users want and how they want it,” writes Francesco Marconi of the Associated Press. “In the specific case of news media, articles can be processed through algorithms that analyze readers’ locations, social media posts and other publicly available data. They can then be served content tailored to their personality, mood and social economic status, among other things.” “This capacity has enabled the Weather Channel to customize some of its content and so improve advertising CPMs [the price charged for 1,000 user impressions of one webpage],” said David Kenny of IBM, who was previously Chairman and CEO of The Weather Company, owner of the Weather Channel.
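The tailoring Marconi describes can be pictured as a simple relevance ranking. The sketch below scores articles against a reader profile by overlapping topic tags; the data, tags and scoring rule are all illustrative assumptions, not the actual method of the AP, the Weather Channel or any real recommender.

```python
# Illustrative sketch of news personalization: rank articles by overlap
# between their topic tags and a reader's inferred interests.
# All data, tags and the scoring rule are hypothetical.

def score(article_tags, interests):
    """Fraction of an article's tags that match the reader's interests."""
    if not article_tags:
        return 0.0
    return len(set(article_tags) & set(interests)) / len(article_tags)

def personalize(articles, interests):
    """Return articles sorted from most to least relevant for this reader."""
    return sorted(articles, key=lambda a: score(a["tags"], interests), reverse=True)

reader_interests = ["weather", "chicago", "baseball"]
feed = [
    {"title": "Storm front moving into Chicago", "tags": ["weather", "chicago"]},
    {"title": "EU trade talks resume", "tags": ["politics", "europe"]},
    {"title": "Cubs win opener", "tags": ["baseball", "chicago"]},
]

ranked = personalize(feed, reader_interests)
print([a["title"] for a in ranked])
```

Even this toy version shows the filtering effect discussed below: the trade story, which matches none of the reader's interests, sinks to the bottom of the feed regardless of its civic importance.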

While filtering of news may make it more relevant to individual readers (generating more clicks and profits in the process), it can degrade the quality of journalism indirectly. Filtering tends to exclude diverse points of view and marginalize serious journalism and complex analysis. The headline from the mock-news website The Onion puts it nicely: “Horrible Facebook Algorithm Accident Results in Exposure to New Ideas.” In a click-driven environment, it has become harder for reputable news organizations to commercially justify the “broccoli” that they have traditionally folded into their content mix.

There is another downside that AI can inflict on journalism — bots. Bots on open networks are often used to dilute the agenda-setting powers of traditional news media by building echo chambers of their own pseudo-news, misinformation and skewed perspectives. Narrowly focused political or commercial actors with no commitment to journalism or public service frequently use bots to spread propaganda or marketing disguised as news, leaving the public confused about what information is accurate, trustworthy and properly contextualized. “Bots are basically being used to ‘weaponize’ AI,” said Mark Riedl, Associate Professor at Georgia Tech. “They just repeat the same misinformation over and over and over again. The human bias is to believe the things that they hear, more often than not.”

Lili Cheng, General Manager at Microsoft, described how Microsoft released a “chat-bot” called “Tay” on Twitter in March 2016 after successfully testing it in smaller social networks in China and Japan. The bot was designed to simulate the conversational personality of a teenage girl. To the surprise of Microsoft designers, Tay in a U.S. context attracted a wide variety of hateful social media users who posted vile racist and anti-Semitic comments, which in turn triggered Tay to automatically repeat such phrases and scrape material from hate websites. Microsoft quickly suspended use of the bot.

Joi Ito believes that “the architecture of the Internet may contribute to the cesspool of trolls online. Their anti-social behaviors may be an emergent property of the way that comments sections are organized.” Ito speculated that perhaps an architectural change could dampen the emergence of uninformed mobs and amplify more constructive participation.

“If we had an AI system that could go through and remove even 10 or 20 percent of the most egregious hate speech, it might have a pervasive impact on how people put their thoughts into the world,” said Astro Teller. Mustafa Suleyman of DeepMind reported that his firm is actually working on a “respect engine,” an AI system for that very purpose. Another participant wondered if AI could be used to identify and elevate great comments and multiple perspectives.

All this speculation about possible uses of AI systems prompted Suleyman to emphasize an important point: “We should not talk about AI as if it had its own autonomy and agency independent of us. We are the ones who decide when to deploy AI systems, and for how long and in what context. We have to stop anthropomorphizing these systems.”

Suleyman’s warning highlights a key concern: Who will control the automated curation and exclusion of certain information via AI systems? Perhaps they could be used to fight trolls, but what happens if government wanted to use the same tools to censor or marginalize ideas that it dislikes? The U.S. Government has already approached Silicon Valley companies for their help in fighting ISIS propaganda websites. Is it possible that governments might use AI systems to try to manipulate public opinion?

These are some of the alarming possible uses of AI in news media. But there are also many benign, information-synthesizing tools that essentially convert raw data into natural language. In an article on how AI startups are reinventing media, Caleb Garling writes: “Companies like Automated Insights and Narrative Science are powering production of millions of auto-generated ‘articles,’ such as personalized recaps for fantasy sports fans. A similar metrics-based formula can be used to recap a customer’s stock portfolio performance.” A company called Arria is using AI to analyze complex data sets in numerous fields, such as finance and meteorology, and then produce expert reports — a process that once required human analysts. The Associated Press announced in 2015 that it would use AI software to write company earnings reports.
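The data-to-text pipeline Garling describes can be sketched with a simple template. The company name and figures below are invented for illustration; production systems such as those from Automated Insights or Narrative Science use far richer templates, data feeds and language models than this.

```python
# Minimal data-to-text sketch: turn structured earnings figures into a
# short natural-language recap, in the spirit of automated earnings stories.
# The company and all numbers are invented for illustration.

def earnings_recap(company, quarter, revenue, prior_revenue, eps):
    """Render a one-paragraph earnings recap from structured metrics."""
    change = (revenue - prior_revenue) / prior_revenue * 100
    direction = "rose" if change >= 0 else "fell"
    return (
        f"{company} reported {quarter} revenue of ${revenue:,.0f} million, "
        f"which {direction} {abs(change):.1f} percent from a year earlier. "
        f"Earnings came to ${eps:.2f} per share."
    )

recap = earnings_recap("Acme Corp", "Q2", revenue=1250, prior_revenue=1100, eps=0.87)
print(recap)
```

The same metrics-in, sentences-out pattern generalizes to the fantasy-sports recaps and portfolio summaries mentioned above: swap the input schema and the template, and the pipeline is otherwise unchanged.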

AI-driven analytics are also being used to spot news trends that human editors might not recognize. For example, an app called Banjo can pore through digital feeds from Twitter, Facebook and elsewhere on the Web to identify “important” (popular) news stories faster than a human editor might.
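Banjo's internals are proprietary, but the core idea of trend spotting can be reduced to flagging terms whose frequency in the current window far exceeds their historical baseline. The thresholds, counts and sample posts below are illustrative assumptions only.

```python
# Rough sketch of trend detection over a social feed: flag terms whose
# frequency in the current window is a large multiple of their baseline.
# Thresholds and data are illustrative, not Banjo's actual method.

from collections import Counter

def trending(current_posts, baseline_counts, min_ratio=5.0, min_count=3):
    """Return terms spiking relative to their historical baseline."""
    counts = Counter(word for post in current_posts for word in post.lower().split())
    spikes = []
    for term, count in counts.items():
        baseline = baseline_counts.get(term, 1)  # unseen terms get a floor of 1
        if count >= min_count and count / baseline >= min_ratio:
            spikes.append(term)
    return spikes

# Historical per-window averages (hypothetical).
baseline = {"game": 40, "weather": 30, "earthquake": 1}
posts = [
    "earthquake downtown right now",
    "felt an earthquake just now",
    "huge earthquake shaking everything",
    "another earthquake aftershock",
    "that earthquake was scary",
    "nice weather today",
]

print(trending(posts, baseline))
```

Here “earthquake” is flagged because five mentions against a baseline of one is a 5x spike, while “weather” stays quiet despite appearing, since its count is well below its normal rate. Real systems add geolocation, deduplication and source-credibility signals on top of this kind of frequency test.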

Several participants fantasized about having an AI fact-checker, but the AI technologists in the room cautioned that that is an “AI hard problem” not easily solved. The more significant barrier to such fact-checking, warned David Kenny, is not technical, but psychological: “People don’t want to hear that what they like to read is wrong.”

The question posed at the outset of the conference — Should AI aspire to replace human beings or augment them? — remains a central issue in journalism and news media. AI may be able to take on many tasks historically performed by reporters and editors, participants agreed, yet journalism ultimately remains a creative human activity requiring judgment and originality.
