“The ‘Big Lie’ That Fanned the Flames”: Philanthropy’s Role in Fighting Misinformation

lev radin/shutterstock

In the aftermath of the 2016 election, civic-minded funders ramped up efforts to combat the proliferation of online misinformation propagated by an array of bad actors and conspiracy theorists, and facilitated by untouchable tech giants.

In many cases, their efforts paid off, despite formidable obstacles, said Joshua Stearns, director of the Democracy Fund’s Public Square Program. “We have seen that over and over again—through the Census, the election, the pandemic—where community organizers, researchers, advocates and educators organized around new networks helped suppress bad information and boost good information.”

Then came January 6.

Funders’ worst fears came to pass when Trump supporters, fueled by a steady flow of election-related misinformation, stormed the Capitol. “The ‘Big Lie’ that fanned the flames of extremism, which exploded at the Capitol Building, was that the presidential election was rigged and ‘stolen’ from the incumbent,” said Nicholas Charles, a spokesperson for the Save Journalism Project.

The event has forced funders to ask some difficult questions. Could philanthropy have done more to prevent it? What’s philanthropy’s role in tackling misinformation in a new year marked by vaccine rollouts, a second impeachment trial and the Biden presidency? What are funders’ biggest limitations in this space?

I reached out to funders and nonprofit leaders working in the areas of journalism and civic engagement for their thoughts. Vera Franz, deputy director at Open Society Foundations’ Information Program, summed up respondents’ sentiments best.

“We recognize that while it would be impossible to entirely stop the problem, we can mitigate against it,” Franz said. “And we are focused on doing so in a way that helps the communities that are most directly targeted by the spread of these lies while advancing a competitive, decentralized web.”

There is a lot to discuss on this topic, so I’ve sorted commentary into two parts. This first post will summarize areas where philanthropy has both succeeded and come up short, along with some of the obstacles funders will be facing in the precarious year ahead. A follow-up article will lay out a funder “to-do list” as well as proposed legislative and regulatory reforms that may be worthy of philanthropic attention.

Pre-2021 successes

Respondents mentioned the following areas and initiatives where philanthropy, to quote Stearns, “helped suppress bad information and boost good information.”

Bolstering local news

Funders believe that a strong local news ecosystem can help boost civic engagement, instill community trust and offset the allure of misinformation. “The best defense is a good offense,” said Jim Friedlich, executive director and CEO of the Lenfest Institute for Journalism. “Fake news and misinformation fill voids where there is a lack of quality news and information.”

Friedlich cited the institute’s work in doubling the size of the investigative news team at the Philadelphia Inquirer and in building and launching Spotlight PA, a nonprofit newsroom covering state government in Pennsylvania.

Sarabeth Berman, CEO of the American Journalism Project, which launched in 2018, told me her organization has “been partnering with several community foundations to help them assess their communities’ information needs and craft business plans to incubate new nonprofit newsrooms that can fill the void left by the decline of commercial news and play a powerful counterweight to misinformation.” Berman noted that Vermont’s VTDigger and Mississippi Today have effectively dispelled misinformation about the COVID-19 vaccines in their respective regions.

Report for America co-founder Steven Waldman told me his organization hopes to put at least 1,000 local reporters in the field by 2024. “Report for America and our peers have benefited greatly from the leadership of the Knight Foundation, the MacArthur Foundation, Facebook, the Robert Wood Johnson Foundation, the Joyce Foundation and more, who see the many ripple effects good journalism can have at the local level and nationally,” he said.

Promoting news literacy

Rick Edmonds, media business analyst for the Poynter Institute, cited PolitiFact, the fact-checking site conceived 10 years ago when “lying and out-of-context talking points by politicians were less extreme. That work is still valuable,” he said.

Friedlich noted the Knight-Lenfest Local News Transformation Fund, which supported a project by the nonprofit AI for the People focused on reducing engagement with online misinformation targeted at Philadelphia’s Black community, and the Facebook Journalism Project, which has made more than 400 grants to local newsrooms covering the pandemic. Lenfest, which is a grant administrator for the project, used NewsGuard to ensure that grants flowed to legitimate newsrooms.

The Democracy Fund’s Stearns highlighted Election SOS and the Disinfo Defense League, groups that “were at the forefront of efforts to disrupt misinformation and combat racialized disinformation campaigns at the community level.” Moreover, “in places like Colorado, North Carolina and New Jersey, local journalism hubs organized statewide collaborative reporting partnerships that ensured quality information could flood the zone and reach the most people possible,” he said.

OSF’s Franz cited the Tow Center for Digital Journalism as an organization that works with newsrooms and reporters to help them “cover the news of the day without further disseminating disinformation.” Franz also highlighted international outfits committed to combating misinformation, like El Surtidor in Paraguay, Mutante in Colombia, and EU DisinfoLab, the Brussels-based OSF grantee that has “successfully researched and exposed several disinformation operations.” (Editor’s Note: IP’s Michael Kavate has reported that OSF is undergoing a “major transformation,” which includes the merger of its information and digital rights program with its project on journalism. Get the details here.)

Stearns, Franz and Paul Cheung, Knight’s director of journalism and technology innovation, all touted First Draft, a project founded in 2015 to fight mis- and disinformation online. Its funders include Craig Newmark Philanthropies, Democracy Fund, Facebook Journalism Project, the Knight Foundation and OSF.

“In just a year,” Cheung told me, “First Draft has trained over 3,000 journalists and built a dashboard that offers daily insights on what their investigative research team sees online.”

Funding R&D

In 2016, Knight launched an open call for ideas to combat misinformation and awarded grants to 20 organizations the following year. John Sands, Knight’s director of learning and impact, told me the foundation “ended up doubling down on a number of ideas that emerged around news literacy, computer-assisted fact-checking, and news gathering that’s deeply connected to communities.”

Three years later, Knight launched a $50 million initiative to better understand how technology is transforming our democracy. “We’re supporting universities and think tanks doing independent, cross-disciplinary research that can replace some of the conventional wisdom shaping the public dialogue and inform actionable solutions,” Sands said. (I spoke to Sands in greater detail about Knight’s work in this space last July.)

Sands also highlighted Knight Research Network scholars “who were called upon by legislative and regulatory bodies in the U.S. and abroad for expert testimony and advice,” as well as the work of the Election Integrity Partnership. Led by Knight grantees at the University of Washington and Stanford University, the partnership “was especially successful in taking academic research from leading scholars and relaying it in real-time to journalists, civil society groups and policymakers who were charged with reporting on and defending the integrity of our most fundamental democratic processes,” Sands said.

A pre-2021 post-mortem

While hindsight, of course, is 20/20, respondents mentioned a few areas where philanthropy could have been more responsive or proactive during the past few years.

“One of the most challenging limitations in the space is that misinformation has quickly become siloed as a standalone funding area,” said the Democracy Fund’s Stearns. “When, in fact, combating misinformation (and supporting local news, for that matter) ought to be a concern for every funder because of the broad effects it has on our society, democracy and every other issue that funders seek to address.”

In addition, Stearns said, “Where I think foundations have failed to address misinformation has been in neglecting to listen to and support Black, Indigenous, and people of color, especially women who have been the target of disinformation, and have been raising alarms about it for a long time.”

Knight’s Cheung also picked up on this theme, telling me that philanthropy “didn’t scale up resources fast enough for communities of color and non-English speaking communities to combat the spread of misinformation.”

Neil Brown, president of the Poynter Institute, praised philanthropy for supporting R&D and innovation. “But the funders’ reluctance to fund ongoing operations, particularly among local or small enterprises, is frustrating in these times of exceptional economic strain,” he said. “The battle for sustainability is real.”

The limits of philanthropy

“There’s a need to recognize that the forces of misinformation are vastly better funded than philanthropy and journalism—and have no desire to stick to facts and the truth,” said the Lenfest Institute’s Friedlich. Here’s a somewhat depressing summary of what philanthropy is up against.

Opaque and untouchable social media giants

“As with all philanthropy,” Poynter’s Brown told me, “the challenge now and ahead includes an ethical one: Platforms and others that fund important work are often insufficiently transparent about their own work, involvement, role in the spread of misinformation, or other agendas. Obviously, this is particularly true of the platforms.” 

The Save Journalism Project’s Charles had even sharper criticism for Silicon Valley. “Despite all of this money,” he told me, “local news is struggling with insufficient staff and resources—due in no small part to the monopolistic behavior of tech giants.”

The problem, Knight’s Cheung said, is that “most funders are not lobbyists or policymakers. One limitation is our lack of capacity to shape policies that can mitigate the spread of misinformation.” There simply are not many pathways to exert pressure on tech platform executives to change their policies, he said.

Tyranny of the algorithms

For years, funders have identified social media algorithms as the biggest and most dangerous proliferator of misinformation. These algorithms are designed to maximize what the Wall Street Journal’s Joanna Stern called “the reach of the incendiary—the attacks, the misinformation, the conspiracy theories” that push us “further into our own hyperpolarized filter bubbles.”

Will Facebook and Twitter act against their shareholders’ financial interests by detoxifying their platforms’ algorithms? Note that since Twitter and Facebook eventually booted Trump, the two companies have lost a combined $51 billion in market value. Until social media companies change their algorithms to de-escalate public discourse, funders’ efforts to combat misinformation will only go so far.

“We do not yet have the visibility into platform operations that allows us to truly understand what is happening, who sees what, and to properly study the impact and reach of social media in particular,” OSF’s Franz said. “We need access to the platforms’ data and algorithms so we can understand and keep the platforms accountable.”

Tyranny of the profit motive

Social media giants aren’t the only media players catering to readers safely ensconced in their echo chambers. In an incisive piece titled “We Need a New Media System,” Matt Taibbi argues that “on all sides, we now lean into inflammatory caricatures, because the financial incentives encourage it.”

Fox News executives portray liberals as “terrorist traitors,” Taibbi writes, while their left-leaning counterparts, having refused to examine the “tensions they didn’t see coming in Trump’s America,” portray his supporters as one-dimensional “caricatures that tickled the urbane audiences” of channels like CNN and MSNBC. “You can’t sell hatred and seriously expect it to end.”

The whack-a-mole effect

While rioters planned the insurrection and promoted election-related misinformation on mainstream sites like YouTube and Facebook, many others operated on platforms like DLive, BitChute, Gab and 4chan. If users are banned on one platform, they simply migrate to another one.

“The fragmentation of not just media platforms but information ecosystems is something that’s extremely challenging to deal with,” said Sharon Kann, research director at Media Matters for America.

Technology

The Lenfest Institute’s Friedlich told me how H.F. “Gerry” Lenfest, who passed away in 2018, established the institute, in part, out of concern about misinformation. “On the internet, you don’t know what’s real and what’s not,” Lenfest said way back in 2015. “Before long, we won’t know what to believe.”

Six years later, Lenfest’s warning is more prescient than ever. Case in point: Nieman Lab’s Joshua Benton recently found that increasingly sophisticated “deepfake” videos can make people believe misinformation. “The good news,” Benton wrote, “is that deepfakes don’t seem to present more of a danger than other varieties of disinformation. The bad news is… that’s plenty bad enough.”

Human nature

Finally, some respondents acknowledged that some people are going to believe what they want to believe or search out misinformation that fits a predefined narrative no matter what funders do. “Do we still risk mostly talking to ourselves?” Poynter’s Edmonds asked. “In this polarized environment, I am afraid so.”

Poynter’s Brown encourages funders and journalists to “get past the numbers game (to a degree) and don’t think our journalism or competing content will stop people from lying, making stuff up or profiting off of fear or other aspects of human nature.”

Fighting misinformation, of course, involves limiting its spread and access, and disincentivizing its creation. “But funders,” Brown told me, “should also understand that the fight is about how to return power to the hands of the consumer to be able to judge for themselves what’s real and what’s not.”