Evidence-Informed Policymaking Reading List | August 2018 — August 7, 2018



Just Concluded: Peer Learning Workshop on Evidence Use

Last month we hosted a workshop for teams of government policymakers from nine countries, providing a peer learning forum to share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies. You can learn about the workshop here and read our initial insights and reflections in our latest blog post.

Final Workshop Image

What to Read this Month

“Administrative data can do much more than help deliver services or provide inputs for monitoring. We can use administrative data also for learning and research in humanitarian emergencies if agencies make available their data for analysis as part of an ethical, secure and deliberate strategy.”
A look at the strengths, weaknesses, opportunities, and threats of using administrative data in the humanitarian field. The main strength of using administrative data is that it is available immediately at no cost and can be used for research and learning. A common challenge, on the other hand, is that it can be difficult to harmonize administrative data from different sources.

“Even a cursory look at the literature shows that evidence-informed policy making is about more than merely the availability of knowledge items. You have to go beyond uploading documents on a server, to also build and use high-trust relationships.”
In addition to sharing relevant, useful and reliable knowledge via its online platform, the new South African SDG Hub aims to strengthen partnerships between policy actors and researchers and support capacity building to improve the use of evidence for SDG-relevant policymaking.

“After more than a year of executing 10-week projects, they’re starting to identify city trends, and getting results: After analyzing data on the relationship between education and income, for example, they increased federal financial aid sign-ups in the city by 10 percent.”
Made up of city residents and government workers, the Urban Data Pioneers volunteer group helps collect and analyze data on the city’s population, housing, education, and infrastructure, to advance the Tulsa mayor’s campaign promise to use data to focus on results.

“Many RFPs include some type of evidence to explain the scope of the problem the solicitation seeks to address within the country context (e.g., statistics showing low school attendance, disease prevalence rates). That makes for helpful background reading, but it doesn’t get at the crux of the matter: whether the intervention USAID is soliciting will plausibly achieve the intended outcomes. Far fewer RFPs offered evidence for this.”
A thoughtful article on how USAID – and other major development donors – could restructure bidding and procurement practices to 1) incorporate evidence into Requests for Proposals and suggested program interventions, and 2) prioritize awarding contracts to implementing partners that use evidence to inform their proposals and program designs, and that demonstrate a commitment to building the body of evidence on what works and why in global development.

Our colleagues at Results for America recently published an index showcasing how state governments are using data and evidence in budget, policy, and management decisions to achieve better outcomes for their residents. Criteria include Data Policies / Agreements, Data Use, Evaluation Resources, Cost-Benefit Analysis, Use of Evidence in Grant Programs, and Contracting for Outcomes.

What We’re Working On

We’re still processing our learnings from the peer learning workshop we hosted last month on using evidence for policy implementation, and will share more insights, reflections, and takeaways in a final report in the weeks to come. We’re also preparing a satellite session at the AEN Evidence conference in Pretoria next month, where we’ll discuss our learnings from the workshop and other activities from the past year, and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Where do policymakers go when they need a safe space to engage with each other on evidence use in policy implementation? — August 6, 2018


They come together in a workshop where they can share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies.


On July 23-25, Results for All and our partners at AFIDEP and IDinsight hosted “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” in Nairobi, Kenya, providing such a space for ten teams of evidence champions and policymakers from nine countries: Chile, Ghana, Kenya, Malawi, Mexico, Nigeria, Rwanda, South Africa, and Uganda. Each government team is tasked with implementing a specific social policy – such as increasing the quality of public education, meeting family planning targets, or supporting the most vulnerable households – and sought to use this opportunity to inform that work. To learn more, take a look at the workshop agenda, and read about the participating teams and the policies they are working on in this set of short policy briefs we wrote together. You can also see photos from the workshop and several video interviews with participants on our Twitter.

Final Workshop Image

Why a focus on policy implementation?

Over the last few months, we’ve been engaged in a series of consultations to assess the demand for a global evidence network and understand how funders are supporting evidence use to inform government priorities. A consistent theme in our conversations has been the lack of attention given to policy implementation or translation of policy to action. We also heard from many participants that the specific focus on evidence use in policy implementation is what drew them to apply for and participate in the workshop.

Policy implementation challenges can occur due to a myriad of factors, including unclear policy goals and outcomes; an absence of political support or financial resources; missing or weak evidence on the effectiveness of an intervention; inadequate skills or motivation among public officials tasked with frontline service delivery; and incorrect assumptions about human behavior and local needs. Addressing these implementation challenges requires a variety of evidence: evidence on how to mobilize political and financial support for the policy; evidence on whether the policy has worked elsewhere and under what conditions; evidence on how to enable and incentivize frontline agents to best implement and track the policy; and evidence from local stakeholders to best tailor the policy to their context and needs.

When implementation, along with monitoring and evaluation activities, is not linked to policy design but instead treated as a distinct downstream activity, the incentive to produce evidence in an ongoing and iterative process to inform policy is weak. This puts evidence-informed policymaking at risk: policymakers can only secure the benefits of evidence-informed policymaking when implementation succeeds. We therefore think it is critical for governments to take a systematic and structured approach to using evidence to bridge the gap between policy design and implementation, to achieve better results for the people they represent and serve.

Initial insights and reflections

We’re still processing our learnings from the workshop and will share a lot more in the weeks to come, but here are some initial takeaways.

1: Common evidence use challenges persist across diverse contexts

We were not surprised to learn of the many common challenges that workshop participants face in using evidence to inform policy implementation, regardless of the specific policy, sector, or country context. These include the lack of a learning and results-oriented evaluation culture; the difficulties associated with integrating and using data across the multitude of agencies working to address complex social problems; the challenge of turning raw data into useable information; the absence of structured partnerships with the research community and media; and a lack of tools and understanding on not only how to engage with citizens, but importantly on how to use the inputs that they provide to improve policy implementation.

2: A safe space for sharing challenges, experiences, and accomplishments is attractive to policymakers, even those with more advanced evidence use

In our consultations we also heard a lot of interest in peer learning and networking between governments, and we received a lot of interest in this workshop due, we think, to the fact that peer learning was a central theme. We wanted to test whether policymakers from a diverse set of countries, most of whom had never met before, could connect over common missions and challenges, openly share their successes and failures, and provide real value to each other’s work. The verdict thus far is yes: participants have told us over and over what a great opportunity it was to connect with others in government, and to now have a network of international peers with whom they can discuss ideas and share resources. We confess that we were skeptical that representatives from some countries – especially Mexico, Chile, and South Africa, with their very advanced evaluation systems – would benefit from being in the room as much as others – but participants have reiterated that the workshop gave them a lot to think about and apply to their work. Some noted that this was a great opportunity to showcase their country’s learning and growth around evidence use, and that they were interested in more forums that provided this platform.

3: Participants want more practical, hands-on tools with immediate applications to their work

Workshop participants were especially keen to use tools, like checklists, behavioral insights tools, and design thinking, that helped them reflect on their own experiences, map out connections, and chart a way forward. This interest signals the potential for future network activities focused on jointly developing model policies, frameworks, or guidelines for evidence production and use, which participants could then adapt to their own contexts.


4: Connecting government and civil society is a valuable function for a network or community of practice

We designed this workshop as a forum primarily for government policymakers to learn from each other. However, one of the most popular sessions was a ‘marketplace of citizen engagement solutions,’ where we invited eight non-governmental organizations from Nairobi to set up booths and showcase their work to participants. The organizations – Africa’s Voices Foundation, Code for Africa, Local Development Research Institute, Map Kibera Trust, Muungano wa Wanavijiji, Open Institute, Twaweza East Africa, and Well Told Story – are each using technology and innovative approaches to collect and analyze citizen perspectives, feedback, and ideas in order to identify social problems, point to improvements in public programs, and spark behavior change and collective action. Participants told us they relished the opportunity to speak with these organizations and learn of new and innovative tools to collaborate with communities to collect data and source solutions, and we heard also from presenters that they appreciated the opportunity to interact with such highly engaged policymakers. Overall, this marketplace taught us that in addition to satisfying demand for peer learning among governments, a network or community of practice focused on evidence use can also be a powerful bridge between government and civil society.


5: A thematic or sector focus could deepen the conversation on institutionalizing evidence use

We did not expect to solve huge, complex social problems in a 2.5-day conference. Rather, we were interested in exploring whether we could have a meaningful conversation about strengthening institutional practices and processes for evidence use in policy implementation, across the different policies and contexts represented in the workshop, and whether this approach would be valuable to policymakers. There was general agreement that a conversation about evidence practices and processes is critical to strengthening evidence use in policy implementation and that there are many lessons to learn from different policy areas, but workshop participants also indicated that working groups with a thematic or sector focus could help provide deeper, richer insights and value to support their work. For some, this means a focus on themes like data collection, evaluation capacity, and statistical systems, while others felt that sectoral working groups could be helpful in addressing contextual factors that are specific to sectors. As a follow-up to the workshop we will be conducting a series of short surveys to engage further on the issues or themes that would be most helpful for policymakers to engage in through a network, and we will continue to shape a network strategy in collaboration with partners.

“Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” was an opportunity for a global community of committed evidence champions to share real-world experiences in implementation, discuss common challenges, and collectively shine a spotlight on good practices for using evidence to improve the implementation of policies and generate results for their populations. We heard from participants and partners that the workshop was a resounding success, giving all of us new ideas and questions to take back to our work, connections and partnerships to continue to grow, and the inspiration to continue advancing the use of evidence to get results in government. We know that here at Results for All, the workshop gave us a lot to think about, and we’ll be sharing more insights, reflections, and takeaways in our forthcoming report later this month.



Evidence-Informed Policymaking Reading List | July 2018 — July 3, 2018


Our Network Mapping Report is Here!
Peer learning networks are powerful platforms for bringing countries together to exchange information, learn, and collaborate. Results for All conducted research to assess whether there is demand in government for a peer learning network that could help policymakers share experiences and amplify solutions to advance the use of evidence in policy, to achieve better outcomes for citizens. To inform a network strategy and avoid potential overlap with existing initiatives, we engaged in a comprehensive process of identifying, researching, and mapping active networks that have similar missions. The resulting report identifies and classifies 50+ networks for peer learning and evidence use in government; describes 8 modes of engaging network members; and synthesizes research and key informant interviews into 13 lessons on network organization, engagement, and measurement. It then matches select networks against 5 criteria and concludes that a new network premised on these criteria could support evidence-informed policymaking and add value to current initiatives.

We hope this report will be useful for a variety of actors seeking to support evidence-informed policymaking, and identify opportunities to enhance collaboration and fill gaps in this important field.

Results for All Network Mapping Report

What to Read this Month

“An accurate, concise and unbiased synthesis of the available evidence is arguably one of the most valuable contributions a research community can offer decision-makers.”
The article identifies four principles that can make it easier for evidence producers and users to commission, appraise, share, and use evidence in policy. These four principles – inclusive, rigorous, transparent, and accessible – should apply to every evidence synthesis, which, if done well, becomes a global public good.

“The innovative contribution of the current study is the development and validation of a set of measurable indicators specifically devoted to assess and support EIPM in the field of public health, intended to be jointly used by governmental policy-makers and researchers, but also by other stakeholders involved in various stages of the policy-making cycle.”
The article describes a Delphi study that led to the development of indicators that can be used to assess the extent to which policies are informed by evidence. The indicators cover issues related to the skill and experience of staff working on policies, documentation of evidence in policy documents, communication and participation with key stakeholders, and procedures for monitoring and evaluating evidence use in policy. The indicators could also help to encourage establishment of routine processes for advancing evidence use in policy.

“There’s a limited evidence base about knowledge brokers, but preliminary findings suggest that they do have the potential to improve the uptake of evidence.”
Insights from the Wales Center for Public Policy on the role it plays as a knowledge broker in evidence-informed policymaking – helping to build an understanding of evidence needs and questions, improve access to evidence, promote interaction between evidence users and producers, and strengthen capacity to engage with research. The Center is refining its theory of change to better understand approaches that work, and to take a more systematic approach to facilitating evidence use in policymaking.

“Just as a journalist is trained to tell a compelling story so that an audience’s attention is captured and held so that facts of a story can be relayed to a reader or viewer, so too do scientists or policy experts need to capture attention and communicate both the importance and complexity of issues to their audiences.”
A useful article describing how storytelling influences the policy process and offering key steps to help policy actors build a better narrative.

“Guerrero, a Harvard-educated surgeon-turned-epidemiologist, understood violence as an epidemic transmitted from person to person. As with any epidemic, he tried to map the outbreak and understand its transmission. Data came first.”
A great story about data-driven policing and violence prevention in Colombian cities.

What We’re Working On

Later this month, we’re convening teams of government policymakers from nine countries to share experiences, challenges, and lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” will take place from July 23-25 in Nairobi, Kenya, in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight. We’ll share profiles of the participating teams and the policies they are working on in the coming weeks on Twitter (@resultsforall) so be sure to follow us there!
Evidence-Informed Policymaking Reading List | June 2018 — June 1, 2018


What to Read this Month


“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway from the paper is that the more community members were involved in generating and presenting evidence, the greater the likelihood that it would be used to address service delivery challenges.


“Our argument is that to improve policy execution we must go one step further and consider how policies can be more effectively designed by connecting actors vertically and horizontally in a process of collaboration and joint deliberation.”
This article describes how collaboration in policy design can facilitate adaptive implementation of policy solutions. For example, collaboration with frontline staff can help government leaders better understand challenges faced at the service delivery level and propose context-specific solutions, while collaboration with citizens can generate constructive feedback that stimulates learning and incremental adjustments to a policy solution. The article also discusses how open and democratic political contexts enable collaborative policymaking.


“Neither country policy makers nor the global development community are best served by a global flood of health estimates derived from complex models as investments in country data collection, analytical capacity, and use are lagging.”
A brief commentary illustrating how the development community’s emphasis on global health predictions and estimates can contribute to a false level of certainty about health status and trends, and importantly, detract from needed investments in improving data collection and analytical capacity in countries.


“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.


“Within a year of the research getting published, and publicised by media and labour unions, the Maharashtra government raised the minimum wage to INR 12. This benefited 6 million labourers in the state – many, many times more than what we could have hoped to achieve if we had not adopted a research-based approach to the problem.”
Reflecting on his experiences as a medical physician and researcher, Dr. Bang discusses the importance of research for development, and the difference between research on the people and research for and with the people.


“Even the simplest intervention is context dependent in countless, subtle ways – it’s impossible to say with certainty how it will fare in another place. However, as presented here, there’s a four-step framework that can disqualify many mistakes before they happen, and improve the odds of replications you pursue.”
Useful insights, examples, and a framework to determine when an intervention can be replicated in a new context: assess the evidence, understand what conditions made the intervention work in the original context, ensure that the new context has those same essential conditions, and adapt the intervention as necessary.


African Research Organizations: Submit your Proposals by June 15


The Hewlett Foundation call for proposals aims to advance government use of evidence by supporting East and West African policy research organizations. They can apply alone or lead a project with 1-2 partner organizations. See more here.


What We’re Working On


We’re busy organizing “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers,” which we are hosting in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight from July 23-25 in Nairobi, Kenya.


The workshop will provide a forum for government teams from 9 countries to learn about and share experiences, challenges, good practices, and key lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. We selected 10 great teams from over 55 applications. A big thank you to everyone who applied and helped to spread the word! More info coming soon.


Evidence-Informed Policymaking Reading List | May 2018


New Opportunity for African Research Organizations

On May 1, the Hewlett Foundation launched a call for proposals for African Policy Research Institutions to Advance Government Use of Evidence. The call is meant to support policy research organizations in East and West Africa in their efforts to advance evidence-informed policymaking, with a focus on strengthening policymakers’ capacity, motivation, processes, and relationships to enhance evidence use. African policy research organizations can apply alone or lead a project with 1-2 partner organizations. Submit clarifying questions about the call for proposals to EIPAfrica@hewlett.org by May 8. The deadline for first-round proposals is June 15. Learn more at https://www.hewlett.org/eipafrica/.
May EIP Reading List Photo

How can you ensure that evidence used to inform policy is appropriate, credible, and transparent? What types of evidence are appropriate for each stage of the policy cycle, including design and implementation? New Zealand’s Social Policy Evaluation and Research Unit, Superu, answers these questions and more in their latest guide.

“We should be more ambitious than just making ourselves (and the experts we work with) useful to the various groups we hope might use the research and policy analysis our institutions produce.”
It’s important for researchers to package and communicate information in a way that is accessible to policymakers, but it’s even more critical to invest in longer-term partnerships with policymakers to truly influence social change.

“Both making evidence accessible and facilitating processes for deliberating evidence were essential in supporting evidence users to understand the extent and usefulness of evidence and identify implications for policy, practice and services.”
This paper describes a process developed by the Centre for Research on Families and Relationships in the UK to facilitate the use of evidence in practice. The process helped practitioners identify gaps in knowledge; facilitated an evidence review to refine the research question and to find, synthesize, and report on evidence using tools and templates; and concluded with a discussion to plan for the use of evidence.

“Governments have used the phrase ‘let a thousand flowers bloom’ to symbolise a desire to entertain many models of policy-making.”
Evidence-based decision making can take different approaches, ranging from implementation science driven by RCTs, to storytelling that draws on practitioner and user feedback, to improvement methods that draw from a combination of experience and operational evidence. The challenge is knowing what type of evidence to use given the complexity and politics of the policymaking process.

“The establishment of Science and Technology (S&T) fellowship programs in other states could greatly increase evidence-based policy-making and not only benefit state policy-makers but also help to inform national policy-making and society as a whole.”
The editorial reflects on the success of California’s S&T Policy Fellows Program, which selects and trains individuals with a science or technology background to serve as full-time legislative staff for a year. The Fellows are in high demand, and as a result of the program the use of evidence in state policymaking has improved dramatically.

What We’re Working On

We’re partnering with the African Institute for Development Policy (AFIDEP) to host a peer learning workshop on Using Evidence for Policy Implementation from July 23-25 in Nairobi, Kenya. Teams of 3-4 policymakers from governments around the world will be fully funded to participate in the workshop. Participants will work in country teams to discuss and diagnose the root causes of policy implementation challenges, create solution-based roadmaps, and exchange experiences and lessons on evidence use in policy implementation across different contexts.

There is only 1 week left to apply! Please find more details and application instructions here.


Open Call for Applications: Peer Learning Workshop on Policy Implementation — April 24, 2018


Workshop Banner

Applications are due by May 14, 2018 via email to info@results4all.org.

Download the application here.

The Workshop:

Results for All and the African Institute for Development Policy (AFIDEP) will host a two-and-a-half-day workshop for policymakers from around the world to:

  • Discuss the challenges governments face in effectively implementing policies; and
  • Share experiences and strategies for using evidence to improve policy implementation.

The workshop will:

  • Facilitate dialogue, exchange, and active engagement among participants, to more deeply understand policy implementation challenges and lessons learned in different contexts; and
  • Introduce tools and approaches for improving implementation using various types of evidence.

During the workshop, participants will seek to answer questions such as: What are the most common barriers to effective policy implementation in different government contexts? What type of evidence is needed to unlock implementation, and how and when should it be considered? What strategies and mechanisms are governments in different countries introducing to improve and integrate evidence use in policy design and implementation? And how can we learn from their experiences?

Workshop Outcomes:

  • Participants will learn from and interact with peers leading policy implementation activities from 7-8 national governments.
  • Participants will work in country teams to diagnose root causes of policy implementation challenges and create solution-based roadmaps.
  • Participants will provide feedback and shape future collaboration, including a potential global network for government leaders to advance the use of evidence in public policy.

Who Should Participate?

Results for All and AFIDEP invite public officials and policymakers to form a team of three or four individuals who are working together to implement a specific policy, in any sector, and who want to learn how and when to use evidence to overcome policy implementation challenges. A team must include members from at least two government ministries / departments / agencies, and be approved by senior leadership via the signature at the end of this application. Teams from seven to eight countries will be selected for participation in the workshop.

Teams are encouraged to include:

  • A program manager or director in a ministry / department / agency, who oversees the implementation of the policy in question.
  • A public official or practitioner at the national or subnational level, who has a role in operationalizing the policy, or collecting operational data and evidence.
  • An analyst, manager, or director from a national finance or planning ministry / department, who has a coordinating role in managing or evaluating policy.
  • A technical expert from a research or evaluation unit or statistical office, who has a role in producing or sourcing evidence to inform policy options or implementation strategies.

Teams will be expected to develop a PowerPoint presentation outlining a policy implementation challenge in advance of the workshop, and to engage in follow-up activities to put roadmaps into practice.

Teams from Brazil, Colombia, Ghana, India, Indonesia, Kenya, Malawi, Mexico, Nigeria, Philippines, Rwanda, South Africa, Tanzania, Uganda, and Zambia are especially encouraged to apply.

Download the application and apply here.

Evidence-Informed Policymaking Reading List | April 2018 — April 2, 2018

Evidence-Informed Policymaking Reading List | April 2018

What to Read this Month

Advancing Evidence-Informed Policymaking: What’s Culture Got to Do With It? | Abeba Taddese, Results for All
“It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work.”
We often hear from policymakers and partners about the importance of building or shifting the culture of evidence use, but what does that mean? Our latest blog explores the notion of an evidence culture and how to support it.


Ian Goldman is Acting Deputy Director General and Head of Evaluation and Research in South Africa’s Department of Planning, Monitoring and Evaluation (DPME). His presentation, made at the Africa Evidence Forum in Nairobi last month, shares challenges and lessons for evidence use in the policy process. See his plea to researchers on the last slide!


“Because knowledge mobilisation means different things to different people it can also be difficult for new knowledge mobilisers to identify and clarify their role and communicate this effectively. This increases the risk of misunderstandings and misalignment between knowledge mobilisers and those they are working with.”
This article offers a practical framework based on a series of questions that can help knowledge mobilizers better understand their role: Why is knowledge being mobilized? Whose knowledge is being mobilized? What type of knowledge is being mobilized? How is knowledge being mobilized?


“Often, innovation starts in the field, and practice gradually influences policy over time. That means it is as important for research uptake to get communities of practice to engage with research.”
Lessons from the Secure Livelihoods Research Consortium’s work in DRC emphasize the need to focus not only on decision-makers in the research uptake equation, but also to consider the role of researchers. The Consortium has learned that to improve research uptake, it is important to strengthen institutional research capabilities as well as individual research skills, and to emphasize the value of research early, for example in higher education curricula, while also building relationships with policymakers.


“Rather than making large-scale decisions in the absence of fully conclusive or satisfactory evidence, experimentation can help to “de-risk” the decision-making process at this step of the policy cycle.”
This article encourages governments to consider small-scale experiments as a means to test new approaches and move incrementally toward larger evidence-informed policies and programs. It highlights what the government of Canada is doing in this regard, and also includes a nice description of how experimentation could be incorporated in each step of the policy cycle.


“Data and evidence can overcome such myths, but try and use government data – policy makers are more comfortable when you use data they know and trust.”
The Private Secretary to former President Kikwete in Tanzania shares insights on who to target, when to act, and how to communicate the message to improve your policy advocacy.


What We’re Working On


We’re putting the finishing touches on our network mapping report, in which we assess the demand for a global evidence network, and review 50+ existing networks for peer learning and evidence use in government. We also synthesize 13 lessons on network organization, engagement, and measurement. Thank you to everyone who shared insights and feedback! We look forward to publishing the report in the next few weeks.
Advancing evidence-informed policymaking: What’s culture got to do with it? — March 29, 2018

Advancing evidence-informed policymaking: What’s culture got to do with it?

Over the last few months my team at Results for All has been engaged in consultations to assess the demand for a new global evidence network that could bring government policymakers together to exchange innovative ideas and learn from each other to advance evidence use in policymaking.

We have spoken to policymakers in government, members of the research and academic community, as well as non-governmental partners and initiatives in countries including Colombia, Chile, Finland, Nigeria, South Africa, Tanzania, Uganda, and Zimbabwe, among many others. In every conversation, we heard about the importance of building or shifting the culture of evidence use. While we expect and assume that organizational culture will be different in varied contexts, we observed an interesting tendency in the policymaking community to speak about culture and evidence use in a way that suggested some universality across policy areas and levels of government. We noted further that in the context of evidence use, culture was often spoken of in broad and vague terms, such as “the culture is not developed enough,” “there is no culture of producing data,” or “mid-level technocrats have a lot of influence, and the ability to shift government culture.”

We are curious about the notion of an evidence use culture in government, and believe it is essential to better understand this culture so we can identify strategies to help strengthen evidence use in government. 

What is culture?

The challenge in understanding what a culture of evidence use in government looks like begins with the definition of culture itself, a term with many meanings. The first of Merriam-Webster’s six definitions for culture describes it as a set of attitudes, values, goals, and practices shared across an institution or organization. Matsumoto et al. suggest that while attitudes, values, and goals can be shared by a group, they can also be differentiated at an individual level.

This practical guide on changing culture developed by Bloomberg Philanthropies’ What Works Cities initiative offers a definition of culture that gets at norms: “culture is the difference between what you tolerate and don’t tolerate.” According to the guide, culture embodies interactions between the different elements of a system such as people, beliefs, values, and attitudes. It is both causal and dependent on an organization’s knowledge, processes, and systems. It is not a singular thing – an individual or organization can be defined by multiple cultures. And it is both learned and a legacy that can be shaped over time. These conflicting and dynamic elements are what make culture hard to define.

Levels of culture

To understand culture as it relates to evidence use in government, it is helpful to explore the different levels at which culture presents itself in an organization. These include artifacts, values, and assumptions, captured in a helpful visual here.

The visible and tangible elements of an organization are its artifacts. They are what you see when you walk into an office – desks, chairs, computers, plants, and filing systems. Reports, briefs, databases, and knowledge management systems are also types of artifacts. Artifacts can give a sense of office culture – we might, for example, assume that a brightly colored office with an open floor plan has a creative mission, and sense entrenched bureaucracy in a dark, traditionally furnished office. Or we might expect an office with the technology for collecting and storing data to routinely use evidence to inform policy and programs.

Yet these visual cues about an office’s culture may be misleading if we do not understand the organization’s values and the underlying assumptions that drive the daily work of its leaders and employees. For example, a government office may have the relevant evidence artifacts such as a knowledge management system or evaluations, but lack shared values to guide and encourage evidence use in decision making. But even when there are tangible artifacts, and a government office publicly articulates the value of using evidence in policymaking, if the underlying assumption is that using evidence is too costly or time consuming, the office is unlikely to translate its artifacts and values to systematic use of evidence in policy decisions. The challenge is that it can be hard to uncover hidden assumptions – feelings, perceptions, thoughts, or beliefs – that shape an organization’s visible artifacts and values. Artifacts and values can also be disconnected and even contradictory, most noticeably in government when financial commitments needed to support desired policies or policymaker behavior do not line up with a government’s stated values.

In the context of evidence-informed policymaking, it is important to build artifacts – the systems and processes governments need to ensure evidence is appropriately sourced and used to inform strategic thinking, policy development, implementation of policy options, and monitoring and evaluation. It is also critical to build and instill a shared and publicly expressed value in using evidence. But to influence behavior change and shift attitudes about evidence use, it is imperative that we consider the basic assumptions that guide how work is done and decisions are made. When what we say (reflecting values) does not align with how we behave (building and using artifacts), it is a sign that we need to dig deeper to understand the assumptions that govern our behavior.

What should governments do to strengthen underlying assumptions and shift the culture toward evidence use?

    1. Take time to know the office – For many government offices, a conversation to understand barriers and challenges that inhibit evidence use, and clarify performance expectations and intended outcomes of policies, is a good starting point for those who would like to see greater use of evidence in policymaking. Build the communications skills to hold these conversations. A needs assessment can help to diagnose the gaps in knowledge, awareness, and capacity that can influence assumptions around what it takes to find, understand, and use evidence.
    2. Invest in leaders and champions – Strong role models who demonstrate the importance of using evidence through their actions can inspire others and help to change behavior patterns. Highlighting respected leaders who support innovation, change, and learning can positively influence other public officials’ assumptions and attitudes toward evidence use.
    3. Build knowledge and awareness – Policymakers who are confident in their ability to find, appraise, and synthesize evidence, and who understand the complexities of the policymaking process, are more likely to use evidence in their decision making process. Training courses or events such as dedicated research weeks can raise awareness about the value of using evidence and change assumptions that using evidence is too intimidating or complex.
    4. Create a compelling narrative – Ruth Levine gets at a moral argument for evidence-informed policymaking here and here. Moving from a compliance and monitoring mindset to a compelling narrative that points to failed outcomes for citizens when we do not use evidence can be a way to shift attitudes and behavior toward evidence use. Make responsible allocation and spending of limited government resources about doing right by citizens – achieving healthier populations, delivering quality education for all, accelerating financial empowerment for women.
    5. Promote networks and relationships – Whether formal or informal, peer interactions can help policymakers strengthen technical skills and shift attitudes and assumptions by exposing them to new ideas. As an organization, this could mean giving staff the time and space to connect with each other to share information, lessons, and experiences.
    6. Recognize and reward desired behavior – Different strategies can be used to motivate policymakers to use evidence in decision making, ranging from financial performance incentives to less resource-intensive award and recognition programs. Governments can use these strategies to promote and reward desired behavior, nudging policymakers to shift their assumptions and actions to align with organizational values.

It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work. These assumptions determine whether values become empty statements and artifacts gather dust or, ultimately, whether evidence use becomes a cultural norm.

Abeba Taddese is the Executive Director of Results for All, a global initiative of Results for America.

Evidence-Informed Policymaking Reading List | March 2018 — March 5, 2018

Evidence-Informed Policymaking Reading List | March 2018

What to Read this Month

“Although the awareness and interest in evidence-informed policymaking has gained momentum in Nigeria, meeting points, such as policymakers’ engagement events to consider issues around the research-policy interface related to MNCH, are essentially lacking.”
This article summarizes the barriers to and facilitators of evidence-informed health policymaking in Nigeria and highlights a need to strengthen capacity at the individual and organizational levels (skills, systems, and processes) to facilitate the evidence to policy process.
“However, while there is a problem when civil servants don’t understand statistics or how to weigh up which evidence sources are reliable, fixing these problems won’t help improve policy if there is no political space to bring evidence into decision making, and no incentives for senior decision makers to care about evidence.”
The final, independent evaluation of the DFID Building Capacity to Use Research Evidence (BCURE) program found that partners were more successful when they went beyond building individual skills and capacities, and engaged with politics and incentives to create changes in the organization and wider environment.
“Many governments collect vast amounts of data, but need support to organize it in a usable format and identify use cases to improve program management.”
Through its government partnerships in Brazil, Chile, Colombia, and Peru, J-PAL has found that in addition to great leadership, the following strategies help support evidence-informed policymaking in government: making it someone’s job to use evidence, helping governments make better use of the data they already collect, investing in long-term partnerships, and taking on quick wins that can build trust and demand for evidence.
“Policy narratives are ‘the communication vehicles that are used for conveying and organizing policy information’, and the development of policy can be understood as a ‘narrative-making’ process.”
This article describes how education policymakers in Australia commonly referenced policy narratives when describing evidence use in their policy development process, and how these narratives in turn helped the research team better make sense of the different ways in which policymakers were using evidence. The article suggests that examining how evidence informs the construction, testing, and communication of policy stories or narratives may be a promising, less abstract approach to understanding evidence use in policy development, given how prominently narratives and stories feature in policymaking.
“There is an increasing need to develop faster and more reliable learning processes to solve problems and, at the same time, strengthen the trust between citizen and policy institutions. Policy experiments are a great way to do this.”
In this interview, Mikko Annala, Head of Governance Innovation at Demos Helsinki, a Finnish independent think tank, shares three elements that are key to building a culture of policy experimentation: incentives that mandate and support evidence production, experimentation, and failure; committed leadership; and a focus on getting results.
And the results are in! Last month, we used this A/B testing tool from ideas42 to test two different subject lines for this monthly email to see which one got a higher open rate (the percentage of people receiving the email who actually opened it).
  • Version A: Evidence-Informed Policymaking Reading List | February 2018
  • Version B: Which program works best? How do evidence maps inform policy? Is results-based aid more effective?
The Version A and B email blasts had open rates of 25.2% and 20.1%, respectively. The good news is that both rates are higher than the industry average for nongovernmental organizations! However, although Version A looks more effective than Version B, with a p-value of 0.156, the results were not significant at the 10% level. To yield a statistically significant result (to be more confident about which subject line attracts more interest), we could try this test again with a larger sample size. That means we need to send our reading list to more people! If you know anyone who would like to receive this monthly email on evidence-informed policymaking news and research, please forward this message and encourage them to sign up here: https://results4america.org/results-for-all-reading-list/
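For readers curious how a p-value like this is computed, below is a minimal sketch of a two-sided two-proportion z-test. The counts are hypothetical (chosen to roughly match the open rates above), not our actual list size:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for the difference between two open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 63/250 opens (25.2%) for Version A vs. 50/250 (20.0%) for B
z, p = two_proportion_z_test(63, 250, 50, 250)
```

With these made-up counts the p-value comes out around 0.16, illustrating why a difference of a few percentage points in samples of a few hundred can plausibly be noise, and why a larger list would sharpen the test.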


Thanks for reading!
Evidence-Informed Policymaking Reading List | February 2018 — February 7, 2018

Evidence-Informed Policymaking Reading List | February 2018

What to Read this Month

Enhancing Evidence-Informed Decision Making: Strategies for Engagement Between Public Health Faculty and Policymakers in Kenya | Nasreen Jessani et al, Evidence & Policy
“It would behove policymakers in the Kenyan government to lead the change with respect to outreach and inclusion of academic researchers in policy deliberation.”
This article explores the interactions between academic knowledge brokers and health policymakers, and concludes that a combination of personal relationships and institutional partnerships are needed to facilitate engagement.

Is Results-Based Aid More Effective than Conventional Aid? Evidence from the Health Sector in El Salvador | Pedro Bernal et al, Inter-American Development Bank
“Using a difference-in-difference approach and national health systems data we find that preventive health services increased by 19.8% in conventional aid municipalities and by 42% in RBA [results-based aid] municipalities compared to national funds, suggesting that the results-based conditionality roughly doubled aid effectiveness.”
This study is one of the first to measure the effects of results-based financing on the quality of public services provided by government municipalities.
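The difference-in-difference logic in the quote above can be sketched in a few lines; the coverage numbers below are invented for illustration and are not the study’s data:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Two-period difference-in-differences estimate: the treated group's
    change minus the control group's change, netting out shared time trends."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Illustrative service-coverage rates (%): RBA municipalities vs. conventional-aid ones
effect = diff_in_diff(treated_pre=50, treated_post=71,
                      control_pre=50, control_post=60)
# effect = 11: the extra gain attributed to the results-based conditionality
```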

Policy Relevant Evidence Maps: A Method to Inform Decision Making in the Public Sector (Webinar) | Carin van Zyl & Laurenz Langer, GESI
“DPME’s policy-relevant evidence maps follow a rigorous and transparent research process.”
This webinar outlines the steps used in South Africa’s Department of Planning, Monitoring, and Evaluation (DPME) to construct an evidence map to inform decision making in the area of human settlements, and how the process can be adapted elsewhere.

The What Works Network Five Years On | UK Government
“We have hugely talented public sector leaders, but we can still do more to make the best evidence available to them, and to ensure that the time and money invested in our public services are used to the best possible effect.”
In the last five years, the ten What Works Centers in the UK have produced or commissioned 288 evidence reviews used to improve public services; other activities highlighted in this report include publicizing evidence gaps, creating evidence comparison toolkits, and conducting evidence use audits of government departments.

Increasing the Use of Data and Evidence in Real-World Policy: Stories from J-PAL’s Government Partnership Initiative | Samantha Carter & Claire Walsh, J-PAL
“When governments decide to use data and evidence to improve policy, the results can be powerful.”
During its first two years, J-PAL’s Government Partnership Initiative (GPI) has supported 28 partnerships in 15 countries, helping to scale up effective programs and improve systems for data and evidence use.

Unsure which version of a program works best? Check out this A/B testing tool from ideas42
“A/B testing works by using an experimental procedure that provides different versions of parts of a program – such as a letter, a web site, or step in the process – to people at random. Statistical analysis can confirm which version is working better and by how much.”
Businesses constantly experiment with A/B testing to refine products and services and how they are marketed, but increasingly, governments are using the strategy to identify which of two service options citizens prefer, and how to communicate messages and create behavior change. This tool can help you prepare two versions to test and what to measure, randomly assign the versions between two groups, and analyze the results to determine which version works best.
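As a sketch of the random-assignment step such a tool performs (the function and names here are ours for illustration, not the ideas42 tool’s API):

```python
import random

def assign_versions(recipients, seed=2018):
    """Shuffle the recipient list and split it into two equal-sized arms."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    pool = list(recipients)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"A": pool[:half], "B": pool[half:]}

groups = assign_versions([f"user{i}@example.org" for i in range(100)])
# each arm gets 50 recipients; send Version A to one arm and Version B to the other
```

Because assignment is random rather than self-selected, any difference in outcomes between the two arms can be attributed to the version received rather than to who happened to be in each group.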