Grantee Perception Report: What we learned and how we’re responding to grantee feedback


Since 2003, the Hewlett Foundation has regularly commissioned a “Grantee Perception Report” (GPR) to get feedback from grantees on our performance and our relationship with their organizations. It’s a critical part of our commitment to learning, and a method for ensuring that we’re building strong partnerships with the organizations we work with to make a positive impact in the world.

The survey is conducted by the Center for Effective Philanthropy (CEP), and contains a mix of quantitative and qualitative responses, benchmarked against our own past performance as well as the results of peer funders who also work with CEP to administer the survey to their grantees.

We’re grateful that so many of our grantees take the time to complete the survey. CEP’s report to us – which protects the anonymity of respondents to encourage candor – provides detailed information across a number of dimensions that we actively use to evaluate our work and make changes to our practices where necessary. In the latest report, we saw results that affirmed practices that we have put in place to be good grantmakers – and pointed to ways in which we can do even better.

What we learned

Under the heading of “what’s working,” CEP reports: “Grantees’ ratings of Hewlett’s impact on their fields continue to be more positive than typical, have significantly improved since 2015, and are now at the 80th percentile in CEP’s comparative dataset.” In particular, the foundation’s overall ratings on measures like understanding and advancing knowledge in grantees’ fields are high, ranking in the top 15 percent of funders. These measures are strongly related to perceptions of field impact, and our programs are all committed to maintaining the strong practices that enable us to understand the fields in which we work.

We’ve made efforts to improve our grant procedures, ensuring that all aspects of the application, reporting, and evaluation processes serve a meaningful purpose and make wise use of the time of foundation and grantee staff. The results are visible in the overall data: the amount of time that the typical Hewlett grantee spends on these processes has dropped by 33 percent. We’ve also made special efforts in recent years to increase our openness about our grantmaking approach, so we were pleased to see positive ratings on the overall clarity of our communication about our goals and funding strategies. And we were particularly gratified to see significantly higher ratings, since our last survey in 2015, of the foundation’s openness to ideas from grantees.

Grantees gave us plenty to chew on, as well. The foundation has already begun to engage in a conversation and develop new guidelines that reflect our intention to pay both the direct costs and our fair share of the indirect costs in our project grants; the GPR underscored that this is a key concern for many grantees. Similarly, we’ve begun to place greater emphasis on understanding the needs of the people our grants are intended to benefit, and our grantees see this as an area of opportunity.

Grantees are also eager for the type of support we provide that goes beyond the grant dollar – facilitating, for example, grantee convenings that can spur collaboration and learning across networks. While nearly two-thirds of grantees surveyed said they are familiar with how the foundation is adapting amid a changing political landscape, a strong plurality – 40 percent – were eager to discuss further with their program officer how these external conditions affect their work. And perhaps there’s a connection between these two ideas: As one grantee commented, “The foundation’s engagement in brokering dialogue and the exchange of ideas across the political spectrum is extremely important, particularly in these highly divisive and partisan times.” We hope we can find more ways to bring together people and organizations who genuinely care about finding solutions to society’s most pressing problems.

How we’re responding

Each of our programs is taking a variety of actions in response to the results. Some of these are described below, but we encourage any grantees interested in learning more about the individual plans of their program to reach out to program staff directly.

Clarifying best practices. Many of the opportunities for improvement lie beneath the positive foundation-wide averages, which mask important variation on specific measures across – and within – our grantmaking programs. The foundation’s overall ratings on transparency, for example, are quite high, but grantees of individual programs report varying experiences. This is true for overall measures of the strength of grantees’ relationships with the foundation, measures of responsiveness, and more. To be sure, some differences in the ways our programs build relationships with grantees are appropriate and reasonable, arising from the variety of these organizations and the duration of our relationships and involvement in their fields. Even so, we think there’s probably a “North Star” that will help set norms and guide our practices to achieve the type of strong relationships that we know are essential for us, and our grantees, to succeed.

Ruth Levine, director of our Global Development and Population program, provided an apt description as she reviewed her program’s GPR results:

In a program as large as Global Development and Population, with about $100 million in grantmaking annually to organizations in about 20 countries under five distinct strategies, the aggregate story masks important differences across grant portfolios. As we reviewed information about each of the program subcomponents, as well as written comments from respondents, we saw a helpfully nuanced picture. Specifically, the grantmaking in Transparency, Participation, and Accountability and U.S. and Global Reproductive Health, where we have had decades of engagement, is viewed as more in tune with those established fields than what we’re doing around Evidence-informed Policymaking and Women’s Economic Empowerment. The latter portfolios are much newer and are operating in interdisciplinary domains. Our relationships with grantee organizations are relatively new and often in the form of projects (which correlate with less grantee satisfaction than general operating support grants); possibly as a consequence, we heard back from some grantees that they don’t feel that the relationship is a strong one and they don’t believe we understand their organizational concerns. There also appears to be a disconnect between the way the grantees perceive their own fields and the way we conceptualize the scope and direction of the work.

The question for us, then, is how do we accelerate the process of learning about the field and building relationships where we are more recent entrants? The answer may lie in some of the actions that have worked quite well for us in the past and that we could do more systematically.

A new internal, foundation-wide project launched in January will seek to identify such best practices across the institution. In addition to analyzing the GPR results themselves, we are collecting and synthesizing lessons and approaches from our peers and from sector organizations, and interviewing staff across the organization. We’re not sure where we’ll end up – maybe a kind of “Seven Habits for Highly Effective Program Officers”? – but we know it’s an important step toward improvement.

Improving and streamlining processes. The latest GPR results have illuminated opportunities for some of our programs to improve grantees’ experience by refreshing and streamlining processes. Our Education Program, for example, plans to take a fresh look at its application package, including reviewing and streamlining reporting requirements to ensure that we are not asking for information that does not shed light on key outcomes or help us gauge progress. The program also plans to take steps to improve procedures for processing grant reports so that grantees are not delayed in executing their work. Our U.S. democracy program, the Madison Initiative, plans to further refine its reporting process to make it a more helpful opportunity for grantees to reflect and learn, and sees room to improve in grantee engagement on evaluation-related matters – e.g., improving how we solicit grantee input on the research designs of our evaluations. Our Performing Arts Program has already taken action based on negative feedback about one element of its grant application, which grantees referred to variously as a “grid,” “matrix,” “outcomes chart,” “evaluation chart,” “logic model,” and “grant chart.” Whatever the term, grantees felt that this element was labor-intensive and unhelpful – and the program has since removed it from the application altogether. Emiko Ono, the program’s director, pledges, “In 2019, we will determine, in conversation with grantees, how we might gather information about how organizations determine and monitor success in a way that is of minimal burden and maximum usefulness to grantees.”

Managing staff transitions. The GPR also revealed some of the challenges associated with the foundation’s practice of eight-year term limits for program officers and program directors. Our Global Development and Population Program observed that in cases where there had been a change in grantees’ point of contact during the two years before the survey, some grantees expressed concern about changes in strategic direction, as well as shifting expectations around communication between program staff and grantees. While we believe term limits are good for foundations and grantees, we know they can create disruption. We can and will strive to do a better job of ensuring continuity when personnel changes occur – including following up to make sure grantees have received and understood the information we’ve shared about the transition process.

Maintaining more open communication. While the foundation drew strong overall scores for transparency and clarity of communication, there’s more we can do to ensure that a greater number of grantees experience this openness, and – in turn – that more grantees feel comfortable being open with our program staff. Our Global Development and Population Program is looking to expand practices that have received positive feedback from grantees on this front, and to help accelerate our own understanding of our grantees.

These practices include eliciting and responding to feedback on consultation drafts of strategy documents; issuing portfolio-specific updates; bringing individuals from grantee organizations together to discuss issues of common interest; and making sure program staff make time for site visits and engagement in important gatherings among fellow funders. Our Environment Program is looking for ways to ensure that our communications about grants and grant strategy are clear and our responses timely, and to increase overall direct grantee communications. The program also heard that we need to do better to understand local communities, as well as to understand the organizational elements and challenges – not just the substantive work – of grantees. In response, we will be offering more regular “all-grantee” calls to discuss our ongoing work and strategy; supporting more convenings on individual issues that enable us to meet with, and learn from, more grantees; and expanding our level of direct grantee outreach to learn about grantees’ ideas and concerns during strategy development. We are also putting additional resources into support for grantees looking to strengthen their organizations internally, including support to help grantees make their workplaces more equitable, inclusive, and diverse, and support to help grantees build their communications capacity.

The benefits of listening to feedback

The latest GPR results also provide measurable evidence that if grantees provide – and we listen to – open, candid feedback, we can build stronger relationships that are the hallmark of effective grantmaking. In our latest GPR, for example, our Effective Philanthropy Program saw improved ratings on several dimensions. Those gains did not come by accident, but rather from a concerted effort to take specific actions based on results from the 2015 survey. When grantees provided the feedback that we could better understand their overall goals and strategies, our program staff responded by adopting a more deliberate approach during site visits and phone calls to build this understanding. This not only resulted in much higher ratings on that individual question, but also helped build stronger overall relationships, greater awareness of grantees’ constraints and opportunities, and informed ways that the foundation could assist further in building networks and capacity. Similarly, our Madison Initiative saw notable improvements from the prior GPR survey based on specific changes made in response to grantee feedback, such as streamlining the initiative’s grant application and reporting processes. The initiative’s shift in 2016 to providing a greater share of multi-year, general operating support grants, which support the sustainability and initiative of our grantees, also changed the quality of its relationships, leading to improved ratings in the latest survey.

We’re hopeful that in a few years, when we turn to our grantees once more, we’ll see similar improvements in response to the changes we’re making now – and we’re cognizant that there will likely be new concerns that will require action. Our philanthropy is, as ever, a work in progress, requiring ongoing attentiveness to ensure that we’re learning and improving so we can make a positive impact in the world.
