2015 was a great year for people who believe that openness can drive change. It was a year in which governments around the world, including many here in the United States, embraced new policies requiring open licenses for work they fund. Major private funders did the same, like Gates and Ford (in addition to us), starting a trend that should bring open licensing to whole new fields of study. Best of all, 2015 was the year we began to see open educational resources (OER) move from eccentric curiosity to mainstream adoption.
At the root, or maybe I should say the heart, of these successes lies the work of Creative Commons. The organization’s open licenses—now used in connection with more than a billion works—provide the legal underpinning for sharing. More important, its leadership has been pivotal in shaping a new understanding of the importance and feasibility of making art and research and all manner of information easily accessible and usable.
When Creative Commons was founded, in 2001, the notion that content creators would routinely allow others to use, modify, and even commercialize their work seemed far-fetched. Today, that idea is nearly conventional wisdom—and on the way to becoming conventional practice. The journey has at times been arduous, but open licenses, and the idea of openness more broadly, have succeeded due in large part to the tireless efforts of the people at Creative Commons.
The Hewlett Foundation was an early supporter of Creative Commons, and we have watched its growing influence with pride and admiration. It has been a privilege to support this vital work, and we are pleased to continue that support with a new grant of $10 million in unrestricted general operating support—the largest gift we have made to the organization.
Our hope is to give the people at Creative Commons some room to breathe: an opportunity to think and try new things—to create—without having to curtail or constrain their ongoing work. Creative Commons is the chief steward of a large and growing movement for openness, a movement to make knowledge more freely available, to foster sharing and collaboration, and to spur advances and improvements that make the world a better place for everyone.
One of the best things about working in philanthropy is the chance to get in on the ground floor of great ideas. Even better is watching those ideas grow into something unanticipated, something bigger and better than anyone foresaw or expected at the outset.
Such has definitely been our experience with Open Educational Resources (OER)—a field launched at MIT in 2001 with a joint grant from the Hewlett and Mellon foundations. That grant funded MIT’s OpenCourseWare initiative, which made materials from the university’s classes freely available to people around the globe. It was a daring, wonderful, ambitious idea for making knowledge that until then had been confined to campus available to a literal world of people who otherwise could never have dreamed of accessing it. The sheer audacity of the vision energized us, so much so that we launched a new grantmaking strategy to grow and spread this new idea. Yet even so, I doubt anyone at the time envisaged the robust, burgeoning field that OER has become a decade and a half later.
That field still includes MIT’s vibrant OpenCourseWare, which today averages a million visits each month. But it also includes a host of other organizations the Hewlett Foundation has been privileged to support and, in many cases, to help launch—such as Creative Commons, whose open licenses now ease access to over a billion pieces of content that otherwise would be locked down tight; or OpenStax at Rice University, whose free and open textbooks are used, reused, and repurposed in more than 1,000 college courses around the world. More gratifying still is how far the OER field has grown and matured beyond what we at the Hewlett Foundation support. Other funders have joined (though still too few given the field’s needs and opportunities), and public agencies have begun stepping up as well, like South Africa, whose government has distributed open textbooks to every public high school in the country. A study we recently commissioned from the Boston Consulting Group indicates that approximately ten percent of K-12 educators in the United States now use OER as a primary resource.
OER didn’t grow from a single experimental grant to a global movement overnight. It developed gradually, as a small but steadily increasing cohort of farsighted educators came to see its benefits, which are broader and more profound than many people realize. After all, OER can do much more than just reduce costs, though certainly it does that. Openly licensed materials empower teachers: unleashing their creativity by enabling them to customize courses in ways that deepen both their own engagement and that of their students. And the benefits for students are even greater. Clearly, open resources increase students’ access to knowledge, just as the recipients of that first grant at MIT hoped they would, but there is growing evidence that effective use of OER also improves critical student outcomes—everything from test scores to college enrollment rates. “No-brainers” are incredibly rare in education, where strongly held, widely disparate values all too often stymie potential reforms. Well, OER is a no-brainer.
Still, if ten percent of K-12 educators are using OER, that means 90 percent are not. Can we change that? Can OER become the materials of choice for mainstream teachers? Just a few years ago, when OER was still in its infancy, such a question would have seemed ludicrous. Even now, the idea that OER could be the norm seems crazily ambitious. But research in other domains has shown that innovations often tip from early adopters to mainstream use once they hit 15 to 20 percent of market share—a level of use that seems sufficiently widespread to get all but the stodgiest holdouts to consider trying something new. We are, in other words, already close. Our refreshed grantmaking strategy, which we release today, is designed to reach that next, critical tranche of potential converts. It will do so by focusing on particular problems and challenges for which OER provides an especially promising solution—from the out-of-control rise in the cost of textbooks to the provision of educational materials in the developing world. We will, at the same time, continue to support the sort of robust infrastructure needed for OER to facilitate still more innovations in educational practice.
The benefits of using OER in the classroom are too great to be embraced by only a forward-thinking few. They should be shared with everyone.
Earlier this summer, we held a pair of conference calls that offered grantees an open forum—a kind of philanthropic town hall—to question me and our senior staff members about anything and everything. The calls, which we hope to hold annually, are part of our ongoing effort to be transparent and to create open lines of communication with grantees so we can learn and better meet the needs of the communities we all aim to serve.
Participation was open to grantees across our program areas, and the 400 or so people who called in reflected the full gamut of issues we work on, including conservation groups and local arts organizations in the Bay Area, bipartisan think tanks in Washington, development groups in sub-Saharan Africa, and more. Given this diversity, we weren’t surprised to get some specific questions about individual lines of grantmaking. What was a surprise—though in hindsight perhaps it shouldn’t have been—was to see how many of the answers to those questions were nevertheless relevant across our programs.
For example, one participant asked about our commitment to holding the rise in average global temperature to less than 2˚ Celsius, a longstanding goal of our climate work that seems increasingly difficult to achieve. Environment Program Director Tom Steinbach provided an answer that sheds light not only on our climate strategy, but on our approach to goal-setting generally. Tom explained that he and his team had carefully considered whether the 2˚ goal is still the right one, consulting grantees, fellow funders, and other experts. He cited two reasons for concluding that it is: first, because the goal reflects what the best scientific evidence continues to tell us is needed to avoid the worst effects of climate change; and, second, because it provides an ambitious aspiration worthy of our—and our grantees’—efforts. Similarly, a grantee of our Madison Initiative (which focuses on reducing the effects of political polarization in Congress) asked how to connect with others working on the same issues. The answer, from initiative Director Daniel Stid, highlighted the value of grantee convenings, which all our programs arrange as a way to encourage formal and informal networking and collaboration as well as the sharing of knowledge.
A few questions came up more than once, indicating their salience to grantees working across program areas. For those who weren’t able to join the calls, here’s a quick recap of answers to some of these key questions:
Can you share any tips or best practices for grantees to keep in mind in preparing for the transition of Program Officers and Directors?
This question came up on both calls and was also emailed to us in advance of the calls by grantees in a couple of program areas. Hewlett Foundation program officers and directors serve eight-year terms, baking a certain amount of regular transition into the way we work. As my colleague Ruth Levine, director of our Global Development and Population Program, noted in a recent blog post, “[W]e have more planned transitions than most organizations. And we have built some practices over the years to mitigate risk (and hopefully reassure the anxious) on all sides.” Ruth’s post provides more detail on practical steps we’ve taken to ensure smooth transitions—from documentation to orientation processes. More generally, though, it’s important to understand that staff transitions at the Foundation do not trigger radical shifts or the abandonment of a line of grantmaking. New staff bring new ideas and new ways of thinking to the work: that’s one of the key benefits of having term limits. But our strategies exist independently of the staff members who conceive of and execute them, and we are equally committed to sensible continuity. So my best tip is not to think of such transitions as events for which “preparation” is needed. Instead, communicate: get to know the new officer or director and begin to develop the kind of working relationship and partnership that helps both you and us. Lines of communication between program staff and grantees should always be open—especially around staff changes—but please know that we’ve given a lot of thought to how best to execute these transitions, with minimal disruption to our support for grantees.
How are you thinking about collaboration and synergies across your different program areas?
Several grantees asked a version of this question about the Foundation’s approach to collaboration between and among our programs. As I noted on the call, there are definitely examples of programs partnering on individual grants or strategies. For example, our support for efforts to improve children’s education in developing countries benefits from the input of staff from both our Education and Global Development & Population programs, which share an interest in the topic. We do our best to spot opportunities for partnership, and our senior staff is eager to seize those that crop up. However, because our programs pursue specific strategies in their different areas, which are substantively disparate, and because we try to avoid creating unnecessary bureaucracy, we have not formalized collaborative structures across programs.
What are your priorities for funding in the future?
Our main program areas—education, environment, global development and population, performing arts, and philanthropy—are longstanding areas of concern for the Hewlett Foundation, and our commitment to them is undimmed. Of course, we strive also to be nimble and responsive to changing circumstances and new opportunities, which we do by launching or shifting strategies within these programs, by inaugurating time-limited special initiatives (like our recently announced Cyber Initiative), and through our Special Projects program. Still, our signature programs have roots that stretch back to the beginnings of the Foundation—to what Bill and Flora Hewlett cared most about—and these remain our enduring priorities.
* * *
To all those who participated in the calls: Thank you! We learn whenever we hear from you, and appreciate the opportunity to do so. We hope you found the conversation as useful and informative as we did. For those who weren’t able to join, don’t be shy! You can always ask whatever you need to know and share whatever is on your mind. As I said above, the lines of communication are open.
Funding research is an element of nearly all our grantmaking programs and strategies—running the gamut from ascertaining who takes advantage of the Bay Area’s diverse performing arts to quantifying women’s contributions to the economies of sub-Saharan Africa.
Research matters because knowledge is a powerful force for change, and high-quality evidence and analyses are indispensable to sound decision-making. The Hewlett Foundation thus has a vital interest in ensuring that the research we fund is both rigorously done and widely shared—which is why, for example, we decided last year to openly license our own work and to require grantees who receive project-based funding to openly license the work so produced.
We are equally concerned that any research we generate or support be done according to the highest ethical standards, especially when it involves living subjects. And while one might assume that this could or should be taken for granted, we think it important to say so clearly and unequivocally. Beginning this month, then, we will be adding language to our grant agreements stating that grantees doing research with human subjects “must have appropriate standards to ensure compliance with generally accepted research ethics,” through the use of institutional review boards, informed consent policies, and the like. We are, in addition, asking grantees to warrant that such rules and processes will be followed and will require that grant dollars be returned if either of these conditions is not met.
These changes are not meant to impose needless burdens on our grantees. To the contrary, the new language states expressly that we do not wish to “micromanage or seek to interfere in the implementation of grants”—which is why we have not specified a particular review process beyond requiring that the one in place meet generally accepted ethical norms. Each grantee organization should be free, within that constraint, to develop the rules and processes that best suit its culture and capacity. By affirmatively stating our expectation that appropriate rules and review processes are in place and followed, however, we hope to signal the importance of research ethics.
One of the things I’ve found most striking about working in philanthropy is how much time we spend focused on what we do wrong, what we could do better, what we’re not achieving, and so on. To be honest, it can be a little deflating, especially in a field like ours that can and ought to be uplifting 24/7. Fortunately, every once in a while, one does get the opportunity to read something really inspiring—the kind of thing that reaffirms one’s pride about working in this sector, with these people.
I’m referring to the letter published today by Darren Walker, president of the Ford Foundation, on “What’s Next for the Ford Foundation?” I won’t rehearse the letter’s content here—you should just read it. Darren does a wonderful job knitting Ford’s great past together with its present and future aspirations, underscoring the foundation’s continuing commitment to social justice and equality, and sketching out important shifts in how the foundation will do its work going forward. One of the criticisms most often leveled at big foundations is that they can’t or won’t change. Certainly that’s not true here. The sorts of changes Darren describes, particularly in how Ford makes grants, will not be easy to implement. They are, however, the right kinds of change—not just for Ford, but for the whole sector. I was particularly excited to see Ford’s commitment to “a concerted effort to support stronger, more sustainable, and more durable organizations,” including through making “larger, longer-term grants that can be used more flexibly.”
I, for one, am eager to follow “what’s next.” The Ford Foundation’s storied past gives it a special place in American history and U.S. philanthropy. Under Darren’s leadership, it seems well positioned to preserve that place. And its success will be everyone’s success.
When I joined the Hewlett Foundation in 2013, a renewed attention to transparency was high on my list of priorities. The Foundation has long been committed to openness and sharing information. As my very first post for this blog noted, we were among the first foundations to publish the results of our Grantee Perception Reports, and we have made a practice of sharing information about our grantmaking over the years. And we went still further after I arrived, sharing things like the grant descriptions we provide to our Board and a detailed study we prepared for them of grant trends in the past decade. We launched this blog, in fact, to offer Hewlett staff members a way to share their thinking, so others would know what we are up to and could challenge us.
But philanthropy is—or should be—all about learning, and we’ve learned that there really can be too much of a good thing. Even when it comes to transparency. We are thus replacing, or perhaps I should say modifying, our Transparency Initiative, which will now become our Translucency Initiative.
Quite simply, over the past few months, my colleagues and I have grown concerned that we may have gone too far with all this sharing. We wonder whether providing so much information just adds to the noise—pouring more and more data into the ceaseless flood of infographics, spreadsheets, and “must-read” thought pieces that we all confront each day. I confess that sometimes, staring at my ceaselessly replenished inbox late at night, I feel a little overwhelmed by it all. And I’m not even on Twitter!
So rather than continue contributing to the endless stream of humdrum data, we’re ready to make a change. We’re still committed to sharing information about our work, of course, but we’ll do so in ways that are hopefully easier for us to manage and you to digest. With apologies to the Foundation Center, we think of it as having “frosted glass pockets.”
To give you some idea of what this will look like in practice, I’ve included an image of our reimagined grants database, which reflects our new policy of translucency. We asked ourselves: is there really that much difference between a $200,000 grant and a $250,000 grant? As you can see, in the new database, grants of both sizes will be categorized as “A lot of money.” All grants over $5 million will simply be listed under “This better work.”
Our new translucent grants database.
We’ll do something similar for evaluations of our strategies. Rather than diving deep into the weeds of performance indicators, metrics, M&E, and ROI, we’ll ask our evaluators to share their findings using a clear three-point scale: “Getting out the checkbook,” “Meh,” and “Let us never speak of this again.” Simple, wouldn’t you agree?
Consistent with our new commitment to translucency, I’ve shared only the broad, somewhat fuzzy (but colorful!) outlines of our new policy. There’s much more to it, of course. How could there not be? We’re still a foundation, after all.
If you simply must know the intricate details of our current thinking on the topic, I encourage you to read the whole darn thing.
Members of the Nairobi (Kenya) Young and Old cooperative group gather in their small center to make products to sell. (Photo Credit: Jonathan Torgovnik/ Reportage by Getty Images, licensed under CC BY-NC 4.0)
Among the Hewlett Foundation’s oldest, strongest, and most enduring priorities has been to help women gain control over crucial decisions in their lives. In the beginning, our attention was on expanding access to family planning services, so women in the U.S. and in developing countries could control the number of children they bore—benefitting not only the women themselves, but their families, their countries, and the environment.
That commitment remains as strong as ever, but our vision has broadened with experience. Under the leadership of Anne Firth Murray (who directed the Foundation’s Population Program from 1978 to 1987 and later was the founding president of the Global Fund for Women), the Hewlett Foundation widened its focus and placed family planning within a broader framework of women’s health and human rights. That vision continues today to inform the work of our Global Development and Population Program.
But learning never stops, and with enthusiastic encouragement from our Board, the Global Development and Population Program has fashioned plans for a new line of grantmaking—expanding our efforts still further to encompass advancing women’s economic opportunities in developing countries. The importance of economic opportunity in promoting the values that have always motivated our efforts to support women is now evident, a connection established by decades of social science research as well as our own experience.
I encourage you to take a look at a more complete description of the new strategy. Our concerns are not the usual ones—microfinance, vocational training, and the like. Such efforts can be important, but we hope to empower women by showing economic decision makers in international agencies and developing countries how women contribute to growth and how they are affected by relevant policies, from taxation to employment regulation (Ruth Levine’s Friday Note from last week addresses this). We want to see better data, focused and relevant research, and informed advocacy, so the true value of women’s economic contributions can and will be fully and properly recognized—and further contributions encouraged.
Like our continued investments in expanding access to contraception and safe abortion, these sorts of changes will, we believe, help women realize their full potential as citizens, as workers, as parents, and as people.
Last March, our Board approved a new “Cyber Initiative” with a budget of $20 million over five years. The Initiative aims to build a field of policy analysis for problems relating to security and technological trustworthiness on the Internet. While government and industry are both already spending vast sums of money to deal with such problems, their focus is overwhelmingly on present needs, and mainly involves developing technologies to combat hackers, thieves, and enemies. Hardly anyone is thinking about the lasting consequences of today’s solutions, much less about developing overarching policy frameworks for long-term global governance and security.
The importance of having such frameworks cannot be overstated. Our lives increasingly depend on the Internet, and choices we are making today about Internet governance and security have profound implications for the future. To make those choices well, it is imperative that they be made with some sense of what lies ahead and, still more important, of where we want to go. Yet little or no thought is being given to such questions, partly because of avoidable obstacles. At present, few institutions treat questions of cybersecurity and Internet policy as a central or even important focus of their work. Individuals with proper training to address these questions are in short supply, and those who exist seldom speak to each other or share information. Nor has anyone given them reason to do so: funding for this sort of work is practically nonexistent.
The Cyber Initiative seeks to overcome these obstacles, and, in so doing, to build a “marketplace of ideas” about cyber policy—generating the kind of robust arguments and analytic frameworks needed to begin articulating sensible long-term public policy. We plan to do this by (1) supporting and/or building dependable, independent institutions capable of training, nurturing, and supporting experts with a sophisticated understanding of the problems; (2) convening experts from government, industry, academia, think tanks, and other arenas to share information and develop the trust needed to work collaboratively; and (3) attracting additional funders to help grow and develop the new field.
The announcement of the Cyber Initiative prompted a great deal of commentary about the need for, and importance of, our plan—much of it likewise emphasizing the absence of serious public policy analysis. These reactions came not just from potential grantees (whose statements may perhaps be taken with a grain of salt), but also from people working in government, industry, the media, and philanthropy. Our specific focus—creating opportunities for people coming from different sectors and disciplines to exchange information and work together, and developing multiple long-term policy options—was singled out for particular approbation. Given the modest size of the initiative, the attention it garnered came as a pleasant surprise, but it also says something about timing: public policy for cyber is a field ripe to be built.
Even so, building the field will not be easy—and not just because of the modest scale of our initiative, which is only $4 million per year for five years. As we explained to the Foundation’s Board in March, our plan has never been to build the field ourselves. Rather, we intend through our grantmaking to demonstrate what is possible, while working to attract additional funders and funding into the field. The reactions to our announcement were heartening precisely because they confirmed our sense that there is widespread interest and curiosity. The bigger problem turns out to be that the field is so underdeveloped that neither we nor other potential funders have adequate places even to begin.
An unanticipated partial solution to this last problem emerged during the summer, with the discovery that the Foundation needs to pay out more this year than originally budgeted if we want to keep the excise tax on our earnings at one percent. The Board agreed in July that we should do so. In September, I updated the Board, explaining that the Foundation needs to spend as much as $50 million before December 31 for this purpose; at the same meeting, the Board agreed to allocate $5 million of this amount for humanitarian aid in connection with the Ebola outbreak in Africa. Those funds are in the process of being disbursed.
At the time of the July meeting, the Board agreed to consider using the remaining funds in connection with the Cyber Initiative. More specifically, the Board gave permission to ask three universities—Berkeley, MIT, and Stanford—to submit proposals for grants to establish multidisciplinary, public policy programs focused on cyber issues broadly understood.
It is worth briefly recounting the reasons behind this decision. As we discussed in July, establishing a number of strong academic centers will powerfully kickstart our Cyber effort—making a potentially transformative difference in launching a new field of public policy analysis. These grants will create critical centers of excellence that give other funders places to start building and enable us to use our limited resources more effectively. And if prior experience is any guide, we can expect other universities, think tanks, and funders to follow suit in launching their own cyber efforts.
A second, more easily answered question is, why Berkeley, MIT, and Stanford? To begin, it makes sense for us to start with major research universities. Eventually, we will need to support think tanks and other potential homes for policy development, including institutions that can attract participation from quirky and unorthodox technology types. But universities are likely to remain the foremost centers for developing long-term policy analysis, especially as our goal is to support analysts who are independent of both government and industry. Universities will also remain the key destination for training new people. Finally, and especially relevant for present purposes, major research universities are among the relatively small set of institutions capable of absorbing and making good use of $15 million grants in short order.
There were, of course, other universities to consider in addition to Berkeley, MIT, and Stanford. Based on our research, however, the three schools we selected seemed like the most promising places to start. All three have concentrations of world-class faculty and graduate students working in a variety of relevant programs and centers scattered across their campuses—programs and centers that could and should be aligned to work collaboratively on cyber issues. What they have lacked are the resources and the impetus to do so, which the proposed grants will supply.
The full memo I shared with the Hewlett Foundation’s Board last month to provide the information they needed to approve these grants (and from which this blog post is adapted) is now available on our website. It has more details about the proposal development process and the content of each school’s proposal, as well as thoughts on how we will measure success and the benefits and risks associated with making the grants.
If you’re interested in learning more about the thinking that went into making these grants, I hope you’ll take the time to read the whole thing.
Every November, the Board of the Hewlett Foundation authorizes a budget for the upcoming year, and, as part of that process, reviews what progress we have (or have not) made in our grantmaking strategies during the preceding year. As this requires Board members to absorb a great deal of complex detail, last November we rolled out a new version of the Board Book, designed to make the material easier to follow. (June Wang wrote a post about the Board Book redesign for our blog.) The revised Book included, among other things, a new “overview” that presented data for the past five years on the number of grants, their average size and duration, and the percentage that were for general operating support (GOS). Here is what the Board saw:
These figures raised questions for a number of Board members, who found them surprising in certain respects. Several asked whether our grants had become smaller in amount and/or shorter in duration than they used to be. Others wondered if we were drifting away from the Hewlett Foundation’s longstanding preference for GOS. Still others remarked that it was hard to draw conclusions without seeing the data broken down by program. They asked for a more thorough analysis of our grant trends.
The Board’s reaction stimulated a robust conversation among the staff. Had our grantmaking changed in ways that ought to concern us? Had our grants become smaller or shorter or both? Had we moved away from the tradition of helping institutions through general operating support toward a more controlling emphasis on discrete projects? If so, had these changes affected our staffing or the way we work?
Answering questions like these, we soon discovered, is anything but straightforward. On the contrary, our efforts to do so simply raised more questions. For example, the data we used in November presented GOS in terms of the number of grants, which can be misleading because GOS grants tend to be larger and therefore made to relatively fewer organizations. Would it be more accurate to measure GOS as a percentage of grant dollars? Should the data include Organizational Effectiveness grants, which have become numerous in recent years as part of a concerted effort to help grantees, but are—by definition—small and for a single year? How should we classify something like the extraordinary $500 million ClimateWorks grant, which was GOS and paid out over five years, but booked entirely in the year it was made (thus overstating GOS for that year and understating it for the following four)? Similar complications presented themselves when we focused on other measures, like grant size or duration. Even veteran program staff were surprised by the number of potential variations and complications that emerged in our conversations.
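The count-versus-dollars point is easy to see with a toy example. The sketch below uses invented figures (not actual Hewlett Foundation data) to show how the very same portfolio can look mostly project-based when grants are counted, yet mostly GOS when dollars are measured:

```python
# Hypothetical grant portfolio (all figures invented for illustration).
# GOS grants tend to be fewer but larger, so the two measures diverge.
grants = [
    {"type": "GOS", "amount": 1_000_000},
    {"type": "GOS", "amount": 2_500_000},
    {"type": "project", "amount": 200_000},
    {"type": "project", "amount": 300_000},
    {"type": "project", "amount": 250_000},
    {"type": "project", "amount": 150_000},
]

gos = [g for g in grants if g["type"] == "GOS"]

# Measured by number of grants: 2 of 6 grants are GOS.
share_by_count = len(gos) / len(grants)

# Measured by dollars: the two large GOS grants dominate the total.
share_by_dollars = sum(g["amount"] for g in gos) / sum(g["amount"] for g in grants)

print(f"GOS share by count:   {share_by_count:.0%}")    # → 33%
print(f"GOS share by dollars: {share_by_dollars:.0%}")  # → 80%
```

The same portfolio reads as one-third GOS or four-fifths GOS depending on the measure chosen, which is why the choice of denominator mattered so much in our analysis.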
We concluded that a more thorough analysis of our grant trends was called for. To that end, we enlarged our review to cover the past ten years, instead of five. Beginning in 2004 made good sense: by then the Foundation’s endowment had recovered from the bursting tech bubble and incorporated the assets of Bill Hewlett’s estate, and the first steps were being taken to formulate and implement Hewlett’s distinctive brand of “outcome-focused grantmaking.” In addition to making it a ten-year review, we asked the programs to make separate presentations explaining how and why their grantmaking evolved as it did, incorporating a narrative alongside the statistics. We gave each program thirty minutes with the Board at our July meeting, during which they walked through the past decade of grantmaking and described the factors that had shaped their particular patterns. The memos they prepared for this purpose are included in my annual letter for 2014.
My task, at the conclusion of these presentations—which we interspersed with other business over the two-day meeting—was to draw things together and make some sense of the overall picture that emerged, if one emerged. (It did.)
My full letter shares what we found. I hope you’ll take the time to read the whole thing.
At bottom, philanthropy is about finding good ideas and providing the resources to see them tested, improved, implemented, and, if all goes well, brought to scale. Good ideas are essential but not by themselves enough. Even the best idea fades away without proper support, without a plan for making sure the right people hear about it, without effective advocates to press for its adoption. We do our best to use the resources at our disposal to help spread good ideas. And while it may sound peculiar to hear this from an organization with an endowment in excess of $8 billion, the fact is that there is only so much we can do by ourselves, and often it’s not enough. We need to find other ways to get the good ideas we support the widest possible hearing.
The changes we’ve made in our practices related to openness and transparency are in service of this goal—sharing what we learn with others so they can build on our successes (and avoid our failures). Now it’s time to take that one step further.
The Hewlett Foundation has for many years supported open licensing—a simple way to relax the restrictions of traditional copyright that facilitates and encourages the sharing of intellectual property. Grants to organizations like Creative Commons, which established and maintains a set of these open licenses, and to the many nonprofits that have received funds as part of our Open Educational Resources strategy, have helped to create the legal, cultural, and intellectual infrastructure for more open sharing of ideas. And we have long made information about our grantmaking available under one of Creative Commons’ licenses (as you can see by clicking the link at the bottom of this, and every, page of our website).
The benefits of open licenses are clear, and they are substantial. Reducing the burdens and removing the risks associated with ordinary copyright—making it easier for others to use, share, and build on work—magnifies the impact of new research and good ideas. Everybody wins.
For that reason, beginning this year we will ask grantees to openly license materials created with our grant dollars. More specifically, the Hewlett Foundation now requires that grantees receiving project-based grants—those made for a specific purpose—openly license the final materials created with those grants (reports, videos, white papers, and the like) under the most recent Creative Commons Attribution license. We also will require that the materials be made easily accessible to the public, such as by posting them to the grantee’s website. These requirements do not apply to grants made for general operating support of an organization or a program or center within an organization, because such requirements are incompatible with the nature of general support. We do very much hope, however, that the positive experience of openly licensing materials created with project-based grants will lead grantees to do so for all their work.
That last paragraph comes from a new document in the “Values and Policies” section of our website, Commitment to Open Licensing. It explains why we are making this change, what the new policy covers, and how we intend to implement it.
We recognize that any time we place new requirements on grantees, we’re asking them to change their practices, and we want to be sure that the benefit from the change outweighs the cost of making it. To that end, we’re rolling this new requirement out to our programs slowly—we’ll continue to refine our internal practices, learn from how grantees respond, and make adjustments as needed. If the open license we suggest isn’t appropriate for a particular grant, we’ll work with the grantee to find one that is.
As with all of our efforts aimed at increasing transparency and openness, we’re making this change because we believe that this kind of broad, open, and free sharing of ideas benefits not just the Hewlett Foundation, but also our grantees, and most important, the people their work is intended to help.
Solving the kinds of challenges the Hewlett Foundation chooses to address requires good ideas, but ideas are not enough. Asking grantees to make sure their ideas are shared, so others can learn from and build on them, will help those ideas go further, be challenged and strengthened, and, in the end, do more good.