Two years ago, the Hewlett Foundation’s U.S. Democracy Program (then, the Madison Initiative) launched a grantmaking strategy to combat digital disinformation. The premise was that disinformation posed a unique challenge to U.S. democracy, and we should move quickly to learn more about the impact of this problem and to assess solutions to counter it. Because this is such a quickly changing field, we planned to take stock of our learning with an evaluation two years in and promised to then revisit how we might approach this work. That time is now here.
Designed in the wake of the 2016 election, our original strategy was to support research on the impact of digital disinformation, and to strengthen a nascent field by building a community of researchers, practitioners, and funders working on this issue. To date, the Hewlett team has allocated $10 million in grants to universities and policy and research organizations for activities including research, field-building, and outreach.
In keeping with Hewlett’s commitment to learning and evaluation, we commissioned an independent, external evaluation of the portfolio in April 2020. The full report can be found here. The report outlines a range of findings that reflect on the success of the strategy and gives us significant fodder as we think about our next steps.
The evaluation, conducted by Drs. Jodi Nelson and Amy O’Hara, details a great deal of learning over the last three years. In part as a result of our investments, there are now collaborative networks among and between grantees and funders, and our knowledge of the problem of disinformation has advanced considerably. Here are three key takeaways that we drew from the independent evaluation that are guiding us as we chart the course forward.
1. It’s time to turn to solutions.
Though research was a key part of the groundwork back in 2016, we can no longer wait until we “know enough” to formulate robust evidence-based solutions to the problem of digital disinformation. The purveyors of disinformation are a constantly evolving adversary: just as we start to understand a specific trend, new and different threats pop up. And those threats are growing more numerous and more sophisticated every month. Hewlett will still support some research in the field, but it’s clear that we can’t wait to reach a certain level of clarity before we start investing more heavily in solutions.
2. Disinformation is a sprawling problem. We should focus based on our strengths and expertise.
Digital disinformation now reaches many areas of democracy and civic life, including education, cybersecurity, media, technology and design, public health, foreign policy, corporate social responsibility, and many more. We cannot have an impact in each of these areas, and many are far beyond our influence. Recognizing this scale and complexity, we need to narrow our focus and design a strategy that complements and benefits from the program’s core strengths and other areas of work.
3. Disinformation is often designed to marginalize specific communities and discourage their participation. Any solutions we support must address that foundational characteristic.
What if we thought about disinformation not as a separate phenomenon or field of study, but in the context of its dangerous impacts on specific people, communities, and our democracy? How we study and build our knowledge of disinformation determines the parameters of what we learn. In both our definition of the problem and our exploration of solutions, we need to reckon with the disproportionate impact of disinformation and inflammatory content on the communities it targets. Targeted communities vary, but Black Americans and women are at particular risk.
Having completed the evaluation phase of our strategy refresh process, we are now considering how we might refocus our work to combat digital disinformation based on these lessons. Our hope is that the next iteration of our disinformation grantmaking strategy will support solutions to some of the most urgent disinformation challenges our society faces.
Looking at the first two lessons, it makes sense to focus on issues that relate directly to U.S. democracy. One direction we are considering is addressing disinformation and other digital threats where they undermine public trust in elections. Online disinformation, cyberattacks, hack-and-leak operations, and similar tactics all loom large in this area. Election-related disinformation is often designed to favor a particular candidate by discouraging likely opponents from voting—a disingenuous tactic that often results in race-specific targeting. This was the case in 2016, for example, when fake accounts claiming to be Black American activists discouraged Black Americans from participating in that year’s presidential election.
We are learning through this evaluation and other work that it is often necessary to design solutions based on the specific ways that different communities are targeted. Some of our grantees have explored the impact of COVID-19 misinformation on Black Americans. How might we support more community-focused research like this to better understand how to counter efforts to discourage different groups from voting?
There are many other ways in which emerging digital tactics threaten public trust in elections and, ultimately, their integrity. We are exploring what it would look like to integrate our efforts to combat digital disinformation with our support for effective elections administration. This is still a broad area, and far too big to solve with our grant dollars alone, but it strikes us as a vitally important area of need that aligns well with our experience and other strengths. It also directly builds on the three lessons we’ve outlined above, giving us an opportunity to work on solutions that address the core vulnerabilities of democracy that disinformation exploits. We think this is an area well worth exploring and certainly one with significant need. That said, the success of Hewlett’s strategies depends on the input of our many peers and partners in this work. We welcome feedback on this thinking and on other areas to consider.