In a world where both resources for policy research and the attention spans to take it in are finite—that is, in the real world—less can definitely be more. Less research can mean more money and time for activities other than data collection, analysis, and synthesis—more time and money, that is, for the activities that can make the difference between the proverbial “report on a shelf” and a study that meaningfully informs better policy decisions. Less sophistication can mean more understanding: asking simple, straightforward questions and using descriptive data can provide just the type of information that decision makers understand and value. Less verbiage can mean more reading: writing shorter documents focused on the questions the intended audience actually wants answered can dramatically increase the likelihood that they will be read.
The tradeoff we see most often is between doing more research and spending more time on engagement with journalists, advocates, policymakers, and others who might interpret and use the findings. In general, researchers maximize budgets and schedules for the research and shortchange the activities that help ensure their research is relevant to current policy debates and is shared, beyond any research publication, in the many venues and formats needed to achieve real impact. As funders, we frequently ask how members of the policy community will be engaged from the outset and how the work will be disseminated in a way and at a time that correspond to the intended audiences’ needs. Too often, the answers are vague, with little evidence that the proposed budget could accommodate the significant amounts of labor, travel, coffee-and-sandwiches, and other quotidian but essential costs of policy outreach.
We also often see researchers reaching for ever more nuanced policy questions and applying sophisticated econometric and other abstruse techniques. It’s impressive, and may be just the ticket to get the resulting paper into a prestigious journal (or at least into a years-long cycle of revising-and-resubmitting). But more often than not, the analyses that serve policy audiences are those that simply and compellingly bring to light facts about the conditions of people’s lives, the quality of public services, and the potential costs or savings from a particular government program. That is, the studies that present descriptive and basic analytic results in straightforward ways that connect to specific policy domains and decisions—the kind that a technocrat in the Ministry of Health, Education, Planning, or Finance might need to come up with a better program design and a stronger budget request.
The final way that less can be more is in the presentation of findings. For their academic and think tank peers, policy researchers feel compelled to “show their work,” sharing all the details of study design, conceptual framework, analytic approach, and—where they sometimes lose even me—the multiple specifications of the multivariate models they tried before landing on the “right” one. This adds up to far too much information for policy audiences, who are likely to tune out at the first mention of “sample size” or “statistical power.” What most people in the advocacy and policy community want to know is: Why is this important? What did you find? How does this fit into what we know from other sources (i.e., does this challenge or confirm conventional wisdom)? And, crucially, so what? What should we do differently now that we have these results? Researchers who can answer those questions succinctly and precisely are able to attract and sustain attention—and win our admiration for their communication skills.
As I’ve written before, we believe in the power of evidence to improve people’s lives, and we commit millions of dollars to specific studies and to policy research institutions. We want those dollars, in aggregate, to have the greatest impact they can—but that won’t happen until the researchers themselves do less to do more.