TripAdvisor, Yelp and Amazon have made profitable businesses out of crowdsourcing information and opinions, while FixMyStreet, U-Report and Map Kibera have barely made it out of the starting gate in their efforts to channel citizen pressure to improve public services, like health, education, water and sanitation. Are there lessons that private digital review platforms can teach organizations working to increase social good?

That’s what we’re interested in finding out – and we’re hoping you can help us shape a research initiative that will help us, and the broader field, learn. Here, we share some background on the promise and skepticism around using digital review platforms to improve public services, obstacles to their potential impact, and ideas of how to move ahead.

Then it will be your turn. If you have thoughts about the questions we’re considering, or how they can be answered, we want to hear from you. Please share your thoughts on our Facebook page, on Twitter by mentioning @hewlett_found with the hashtag #HFbehave, or by email. We will feature the best comments and suggestions in a future essay.

The promise: Citizen monitoring can work

We know that the quality of public services like health, education, water and sanitation can improve in certain contexts simply by citizens monitoring their performance. For example, in Uganda, where it’s estimated that teachers are absent from the classroom 26% of the time, a mobile monitoring application led to a 10% improvement in teacher attendance. In India, where teachers were paid based on attendance, taking daily photographs of the classroom increased teacher attendance from 58% to 79% and led to higher test scores for students. (There are also studies that have found little or no impact of citizen monitoring, such as in Indonesia, where citizen monitoring had less of an impact on reducing corruption than official government audits.)

We also know that review and ranking sites like Yelp, TripAdvisor, Angie’s List and Google Maps have transformed the relationship between consumers and the restaurants, hotels, and services they patronize. As a result, we now expect a higher level of responsiveness, transparency and customer service from businesses. With enough users actively reading and contributing to reviews on Yelp, for example, restaurant owners have an incentive to provide as much information as possible: their hours, menu, whether they have Wi-Fi or air conditioning. They know that each piece of information could determine whether they attract a new customer. A photograph of any plate of food could end up on Yelp and persuade or dissuade a potential customer.

The preliminary studies about the impact of citizen monitoring of public services combined with the rapid growth of review and ranking platforms like Yelp and TripAdvisor have generated much enthusiasm about the potential impact of review sites for public services like hospitals, schools, water, sanitation and electricity — as well as other forms of civic engagement. At a time of declining support for democracy and distrust in institutions, some observers point to this form of “monitorial citizenship” as a more realistic means to improve our communities.

Most countries now have some version of GreatSchools, a Yelp-like platform that provides parents and policymakers information about individual schools and gives them a place to leave their own reviews. There are hundreds of similar, service-specific websites around the world for hospitals, electricity providers, water providers, even toilets.

An increasing number of cities are launching “311 platforms” as one-stop shops to allow citizens to complain and comment about a variety of public services. New York City’s NYC311 platform, for example, allows citizens to dial one single number 24 hours a day and speak to a customer service agent in over 180 languages. Similar platforms have attracted attention and enthusiasm in Bolivia, Pakistan and Indonesia.

There is much theory as to how these platforms may improve the quality of public services, but little evidence. In theory, the platforms could increase trust between service users and service providers, which in turn could increase providers’ revenue.

For example, more Kenyans might be willing to pay for a costly connection to a water utility if they know they’ll receive quality customer service and transparent pricing via the Maji Voice platform. Service providers may genuinely care about their reputation and wish to point to high rankings and positive reviews as endorsements of their performance. The information could lead to pressure from regulators, journalists, and watchdog groups that investigate the complaints of citizens. The platforms could turn citizens into activists by connecting individuals who face common problems and encouraging them to join advocacy movements that address those problems. And the platforms could simply provide useful information to inform the decisions of policymakers; if enough people map the locations of public toilets using Flushd, for example, then a city’s sanitation department could use that information to prioritize where to build new toilets based on demand and availability.

The problem: Lack of engagement

What do we know about whether the above theories play out in practice? Not much. The vast majority of service monitoring platforms die a slow death two or three years after they launch. With few exceptions, most platforms fail to engage a significant number of users, keep them coming back, and increase the diversity of users over time.

Why do digital monitoring platforms that allow citizens to review public services not attract as much participation as review sites for private services? And can anything be done to recruit more users, more diverse users, and more ongoing usage?

The success of private sector platforms like Yelp and TripAdvisor depends on finding answers to these questions. They have teams of researchers who test every aspect of the user experience to optimize for increased user engagement. In Silicon Valley, an entire industry known as “growth hacking” has emerged to understand what keeps us coming back to websites and apps. But when it comes to similar websites that monitor public services, we have many more questions than answers.

For example, why are users of civic reporting platforms like FixMyStreet fairly gender balanced in the U.S., but overwhelmingly male in the U.K., Kenya and South Africa? More importantly, what can be done about it? There may be some relevant insights from Uganda. When ACODE launched a text-based platform called “Get Involved!” to encourage Ugandan citizens to communicate with their local councilors, they found that more men were participating than women. However, when they included the citizen’s name in the message, adding a sense of personalization, overall participation increased and the gender gap narrowed significantly. We don’t mean to imply that simply using a citizen’s name will address all the complexities of gender in shaping political agency, but it’s a simple insight that might be helpful to other platforms.

What can be done to connect and mobilize citizens who are facing the same problems with government services? In 2016, Tiago Peixoto and Jonathan Fox published a review of 21 digital platforms for monitoring public services in 17 countries. They found that a number of platforms were successful in resolving individual complaints, such as “my electricity isn’t working,” but few facilitated collective action to address common, systemic issues.

Their finding was affirmed by a survey of users of UNICEF’s civic reporting app, U-Report. 66% of U-Reporters in Uganda are male, and the most active users are also the most recent, which suggests the platform struggles to keep users engaged over time. One reason could be a lack of social connection: in-depth interviews with active users of the platform emphasize a strong desire to connect with other U-Reporters. Fortunately, it would be relatively easy to compare the experiences of one set of users given access to more social features with those of another set that is not.
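The kind of comparison described above is a standard A/B test. As a rough illustration of how little machinery it requires, here is a minimal sketch in Python: users are split deterministically into two groups by hashing their IDs, and the groups’ return rates are compared with a two-proportion z-test. The function names and the example numbers below are invented for illustration; this is not U-Report’s actual infrastructure.

```python
import hashlib
import math

def assign_variant(user_id: str) -> str:
    """Deterministically split users into two groups by hashing their ID,
    so each user always sees the same version of the platform."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return "social" if int(digest, 16) % 2 == 0 else "control"

def two_proportion_z(returned_a: int, total_a: int,
                     returned_b: int, total_b: int):
    """Compare return rates between two groups.
    Returns (z statistic, two-sided p-value)."""
    p_a, p_b = returned_a / total_a, returned_b / total_b
    pooled = (returned_a + returned_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# Hypothetical result: 900 of 5,000 users with social features returned
# the next month, versus 750 of 5,000 without them.
z, p = two_proportion_z(900, 5000, 750, 5000)
```

With numbers like these the test would flag a clearly significant difference; in practice the hard part is not the statistics but recruiting enough users for each arm.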

In our conversations with researchers and administrators of civic reporting platforms we’ve heard many more interesting questions:

  • How do you encourage passive viewers of information to become active reporters? What is the role of identity and social influence?
  • What is the optimal level of data collection to improve decision making and how is it most effectively filtered and presented?
  • What are the most important “signals” in the data provided to decision makers?
  • How accurate and representative are reports of the experiences of users of a particular service? What can be done to make them more so?
  • How can experiences and incentives be personalized for different types of users, including different ranks of government officials?
  • Do service monitoring platforms serve as a gateway to other forms of civic engagement?


The proposal: Help us figure out solutions

We believe there are many more questions facing public service monitoring websites, and various hypotheses that could be tested. Unlike private sector platforms that can hire the best researchers, most public service monitoring platforms get by on a shoestring budget. They don’t have the resources to iteratively test approaches to increase engagement by citizens and responsiveness by service providers.

So here’s our proposal for your feedback: we’re interested in supporting a research initiative that strengthens the research capacity of monitoring platforms run by governments, nonprofits and companies — and yields insights that attract more effective, diverse and ongoing engagement among users. We’re still in the preliminary stages of thinking this through and we could use your help. In our minds, the initiative would have the following goals:

  • Identify a common research agenda among platforms that review, rank and monitor public services
  • Apply behavioral insights, including insights from the private sector, to increase effective engagement by citizens and timely responsiveness by service providers
  • Strengthen the research and technical capacity of monitoring platforms to implement rapid experiments through A/B testing, surveys, and controlled experiments
  • Scope emerging opportunities and challenges faced by monitoring platforms, such as validation of reports, distinguishing between urgent versus systemic issues, the effects of gamification, automated geo-coding, privacy implications, and using sensors and image recognition to complement information from citizens


Here’s what we’d like to, ahem, crowdsource from you:

  • Is this a good use of philanthropic funds to yield actionable learnings about active citizenship for improved service delivery? If not, what else should we consider?
  • Is there relevant research we should be aware of?
  • What are the risks of funding such an initiative and how can we mitigate those risks?
  • Anything else we should consider?


Your input will help inform whether and how we issue a request for proposals. We look forward to your feedback on our Facebook page, on Twitter by mentioning @hewlett_found with the hashtag #HFbehave, or by email.