Back to basics: What we’re learning about supporting the field of evidence-informed policymaking

Zenabu Abrahamanu (right) and Asetu Somana (left) sell bananas at Agbogbloshie Market in Accra, Ghana. Photo credit: Jonathan Torgovnik/Getty Images

The Hewlett Foundation’s Global Development and Population Program provides roughly $20 million a year in grants for evidence-informed policymaking. The goal of these grants is to ensure policymakers can access and use high-quality evidence to improve social and economic policies—and ultimately, improve lives. Program Officers Norma Altshuler and Sarah Lucas explain the basic idea behind this strategy, what they’ve learned over the past year about strengthening the evidence-informed policymaking field, their next steps, and questions they would like input on from you.


The Hewlett Foundation’s Evidence-Informed Policymaking (EIP) strategy is grounded in the belief that policymakers around the world should use the best available information to understand and address the needs of all people, and to develop and implement policies that improve their lives.

We are motivated by the fact that when government policies and programs fail to deliver outcomes for people, there are both financial and human costs. And frankly, the resources we and many others put into data collection and policy-oriented research are wasted if that data and analysis is not taken up by the policy community to make better decisions and ultimately improve conditions and outcomes for the people they serve.

As stewards of this strategy, we direct our attention and our grantmaking to increase governments’ use of evidence for decision-making at every stage of policymaking. We support the use of evidence to set priorities, to allocate budgets, and to design, implement, and evaluate programs in an ongoing, routine way.

With goals this ambitious and widespread, no single set of actors—be they think tanks, universities, data scientists, statisticians, evaluators, civil service trainers, or the media—can achieve them alone. No one organization or type of organization can fill all gaps in evidence or singlehandedly create the capacity and will to use it. None alone can increase incentives for policymakers to rely more heavily on evidence or improve how decision-making is structured within and among government bodies. Nor do we believe that the challenge of evidence use will ever be definitively “solved.”

By their nature, policymaking, budget decisions, and program implementation are complex processes influenced by shifting political winds and genuine constraints. For government use of evidence to become more expected, routine, and influential, there needs to be sustained interaction among government actors and the non-government institutions that provide support and accountability. Therefore, our grantmaking must not only respond to specific opportunities to increase evidence use in specific contexts today. It must also help create lasting institutions, networks, practices, and knowledge that provide a foundation for these ongoing interactions of support and accountability to strengthen over the long term. We think about this as strengthening the “field” that will endure long after our grants do.

Over the last 12 months we have made a concerted effort to understand where we can make the most valuable contribution to strengthening the evidence-informed policymaking field. We brought together some of the world’s leading evidence-informed policy thinkers to help generate big ideas. We sought broader feedback on these ideas and drew on the Hewlett Foundation’s Grantee Perception Report. We launched a call for proposals to identify new African policy research partners and learn how they think about overcoming systemic barriers to evidence use. We also turned to network analysis of Twitter data to get a preliminary sense of how the field is structured and took advantage of global conferences to talk with as many partners as possible.

Throughout this period, we heard from a wide spectrum of researchers and “doers” from at least five continents: from small single-country organizations to large multinational NGOs; from small private donors to large multilaterals. From all of this, several common themes emerged.

What we learned about the field of evidence-informed policymaking

Common challenges but not yet common goals.

Many of the people we consulted cited similar challenges to getting the evidence they produce (or fund the production of) into use by government decisionmakers: high turnover of government champions; performance or political incentives that don’t reward evidence use; entrenched bureaucratic systems that weren’t built for and are hard to adapt for routine consideration of evidence; budget shortfalls; low capacity to find and discern among evidence sources; and so on. Yet most of the organizations generating data and evidence focus primarily on finding better ways to get their own evidence used in specific cases. For example, a health policy researcher might focus on trying to influence a specific program or policy at the Ministry of Health in a given country. Far fewer put themselves in the shoes of the ministry official who is on the receiving end of various scholars’ research; they tend not to have an explicit goal of changing the government capacities, incentives, and systems that hinder evidence use across the board. So, while many organizations in the EIP “field” are united in their challenges, many remain individual in their goals.

Many siloes, a few bridges.

Most individuals are deeply committed to the work they do, whether it is in the area of impact evaluation, data analytics, economic and social policy research, or open data. They value connections, shared goals, and professional advancement within these communities. The field mapping revealed various tightly-knit evidence communities that are not very well connected to each other. They coalesce around specific methods, sectors, or professional affiliations rather than around a common goal that evidence be used to improve government decision-making and outcomes for people. Yet the mapping and our own experience also suggest that some people and organizations have explicitly positioned themselves as bridgers and connectors. They share ideas, focus on higher-level common goals, promote the ideas of others even if they aren’t within their silo, and draw on a range of evidence methods (from policy research to impact evaluation, from demographic surveys to satellite data) to advance their own work. This kind of bridging is essential because there is no one type of evidence that can address all policy questions.

Messaging, not hierarchy, matters.

In the spirit of fostering shared goals and bridging siloes, many of the people we consulted see the value of having a shared vocabulary and framework around evidence-informed policymaking. Some wanted to more formally conceptualize the field with a theory of change, defined outcomes, and shared principles. However, others expressed concerns that such a structured effort would exacerbate exclusivity (whose evidence is more important) and risk reifying a hierarchy of types of knowledge (whether impact evaluations are the “gold standard”).

While formal conceptualization of the field may be too heavy-handed or counter-productive, many observed that better EIP messaging is essential. For example, many people have told us how powerful they found Ruth Levine’s speech about the moral case for using evidence. While no single set of messages will work everywhere, many saw value in creating a better narrative, possibly including a messaging framework (such as this messaging guide about ocean conservation) that could be tailored to resonate with policymakers and with citizens in various contexts.

Fascination with institutionalization.

Almost everyone we consulted is interested to learn how governments can systematically embed evidence into decision-making, and what outside organizations can do to help. Indeed, some already try to do this by helping government counterparts establish evidence-oriented government units or develop requirements to consult or use evidence, such as national evaluation policies. Yet there is clear appetite for more learning about existing government mechanisms, how they work, which work better for fostering learning and which for accountability, what role transparency and non-governmental actors play in their success, and so on.

Start with what you’ve got.  

The people we consulted highlighted the importance of inspiring political commitments to use evidence. However, there was little appetite for a new global evidence-oriented initiative or partnership. Some doubted that a call for better evidence could achieve high-level political buy-in, and others worried that even if governments made commitments, they would not deliver on them. Instead, people advised that an evidence-to-policy agenda be integrated in a more intentional way into existing global efforts, such as the Open Government Partnership and/or global health initiatives like GAVI, the Vaccine Alliance and the Global Fund to Fight AIDS, Tuberculosis, and Malaria. Likewise, we heard interest in focusing on sector or country opportunities to advance the use of evidence, and in using learning from these experiences to inform and inspire the field.

Getting the funders’ house in order.

One message came through loud and clear: what funders do affects how the field works. If we want actors in different evidence communities to break down siloes and work to foster more systemic evidence use by governments, we need a community of funders that thinks this way too. We need funders that not only care about and invest in research and data products, but also care about and invest in the institutions that produce them and have the initiative, position, and capacity to encourage their use. Many shared our view that flexible core support is critical for institutional health and policy influence, and that covering full costs ensures great projects aren’t delivered at the expense of great institutions.

Likewise, we need funders who don’t take for granted that quality research will make its way into policymaking. Rather, we encourage funders to be curious about the barriers that hinder uptake of evidence and to invest in helping overcome them. Funders often orient portfolios around evidence types: sector-specific research projects or impact evaluations; open data, data science, or official statistics. Rarely do they look across their portfolios and consider supporting efforts that would allow all these programs to have more impact, such as helping strengthen the capacities, incentives, and institutional systems that determine whether the evidence they fund will get used by government officials to inform decisions.

Ideally this community of funders can also be deliberate about coordination and avoiding duplication. Our consultations revealed a small set of funding organizations interested in exploring light coordination, at the very least in sharing learning, keeping tabs on each other’s strategies and grantmaking, and maybe eventually co-creating some type of coordinated effort.

Our next steps

Where does all this learning leave us? We started this journey thinking about big new things we might do to strengthen the EIP field. We are ending it with a back-to-basics mindset and some humbler questions. We would like to hear from you about how these field-building basics relate to your work and how you respond to the questions below.

To us, the basics include:

Invest in institutions: Fundamentally, a field is a set of governmental and non-governmental institutions and the ways they work together. The heart of our work is supporting non-governmental organizations, many of which work very closely with governmental institutions, to increase the use of evidence by governments. We will continue to provide flexible funding that allows organizations to respond to compelling opportunities, to start new initiatives and partnerships, and to position themselves to have lasting impact beyond the duration of individual projects. We will do a better job of supporting our grantees to capture and share their learning in ways that will benefit other researchers, practitioners, scholars, government technocrats, political leaders, and champions for evidence-informed policymaking.

Provide catalytic support: In addition to longer-term support for organizations, we can move fast, flexible, targeted resources in response to emerging opportunities, especially those related to advocacy, strategic planning, convening, and launching multi-stakeholder initiatives.

Make connections and foster learning: The Twitter mapping hints at the role that funders can play in making connections across communities. We will continue to connect excellent people and institutions that might benefit from knowing about each other’s work. We will do a better job of understanding and using the language of others, bridging to them rather than imposing our own new framing on them.

Find and support positive deviance: Rather than trying to orchestrate globally-coordinated initiatives, we can look for positive deviance that fortifies the field. We can identify and support those who are already bridging across communities, addressing the systemic barriers to evidence use, sharing frank learning, communicating well, taking the long view, or taking evidence principles into existing initiatives.

IDinsight, an organization supported by the Hewlett Foundation, hosts a presentation in Dakar, Senegal, on lessons from their learning partnership with the Ministry of Gender in Malawi. Photo credit: IDinsight/Reagan Odhner

Our questions include:

Do you agree with these conclusions? Are there opportunities that you think we are overlooking? For those of you who hosted or participated in consultations, have we captured what you heard?

What do you think funders should do? Nearly everyone we consulted was excited about funders working more closely together. We have a tremendous amount to learn from and share with other funders, and we appreciate the benefits of convening, coordinating, and collaborating. As we’ve spoken with other funders, several have expressed interest. However, it is not obvious how to make the most of the opportunity. We also take seriously the responsibility of making any convening worthwhile and fostering shared ownership from the outset. What advice do you have for us? Which funders should be at the table? What funder discussions and outcomes would be most worthwhile? What are the EIP opportunities for funders to accomplish more together than individually?

How can we support learning and connections that would enhance your work? We consistently see organizations wrestling with similar opportunities and challenges. When our grantees do connect, they frequently report learning, and sometimes develop partnerships. We also recognize that everyone (including us) is busy, so we want your candid feedback on how highly we should prioritize making these connections, and on the most efficient and effective ways we could support learning and connections within the field.

What is the best way to learn how governments institutionalize the use of evidence? By “institutionalizing evidence use,” we mean policies and practices, incentives, and structures that embed, encourage or require government officials to use evidence in their decisions and routine operations. For example, we are interested to learn more about incentives or requirements for government officials to produce data or evidence and use it to inform decisions, and about efforts to strengthen government capacity for evidence use. We are interested to learn if and how mandated, institutionalized use of evidence fosters greater transparency in how decisions are made and gives non-governmental actors more opportunities to engage in the policy process. We, and many of those we consulted, are particularly interested in understanding the role that non-governmental actors can play in encouraging and supporting institutionalized evidence use within governments.

What could we, together with partners, do to enable more learning in this area? First, we may have to break down the question into more specific categories – for example, how executive branch agencies use data and evidence to formulate budgets or how line ministries evaluate programs and change them based on evaluation findings. This could make the questions more tractable, and also increase the chances that the emerging lessons are generalizable.

One option is to produce a set of case studies, for example, analyzing whether and how efforts to institutionalize evidence use helped improve policies and ultimately outcomes for people, and determining what role (if any) non-governmental actors played. Another option could be to do a landscaping of organizations or communities of practice (such as auditors general or civil service commissions) that are best positioned to advance this work. These ideas are purely illustrative.

In terms of format, we are wary of just commissioning another study. We have seen far too many studies, motivated by genuine curiosity about a field or topic, just sit on a digital shelf. We are open to creative and participatory ways to generate and share information, such as sponsoring a strand of a research conference or a special edition of a journal. We are interested in your feedback about what might be most useful.

Are there ways to support better EIP messaging? In our consultations, while respondents agreed that no single set of messages will work everywhere, many believed it would be useful to have better narratives about why evidence use matters. Some recommended a messaging framework that could be tailored to resonate with political actors and with citizens in various contexts. This messaging guide about ocean conservation offers one model. Does this seem valuable to you? Would you be interested in helping design and/or fund this work?

We appreciate your willingness to engage in discussions and debates about how to promote the use of sound analysis for decision-making in a world that so often seems resistant to reason. Please keep your ideas coming by sending feedback to eipfield@hewlett.org. Thank you!

Ruth Levine, former director of the Hewlett Foundation’s Global Development and Population Program, served as an important thought partner in our learning and collaborator on this piece.
