Open Call for Applications: Peer Learning Workshop on Policy Implementation — April 24, 2018

Applications are due by May 14, 2018 via email to

Download the application here.

The Workshop:

Results for All and the African Institute for Development Policy (AFIDEP) will host a two-and-a-half-day workshop for policymakers from around the world to:

  • Discuss the challenges governments face in effectively implementing policies; and
  • Share experiences and strategies for using evidence to improve policy implementation.

The workshop will:

  • Facilitate dialogue, exchange, and active engagement among participants, to more deeply understand policy implementation challenges and lessons learned in different contexts; and
  • Introduce tools and approaches for improving implementation using various types of evidence.

During the workshop, participants will seek to answer questions such as: What are the most common barriers to effective policy implementation in different government office contexts? What type of evidence is needed to unlock implementation, and how and when should it be considered? What strategies and mechanisms are governments in different countries introducing to improve and integrate evidence use in policy design and implementation? How can we learn from their experiences?

Workshop Outcomes:

  • Participants will learn from and interact with peers leading policy implementation activities from 7-8 national governments.
  • Participants will work in country teams to diagnose root causes of policy implementation challenges and create solution-based roadmaps.
  • Participants will provide feedback and shape future collaboration, including a potential global network for government leaders to advance the use of evidence in public policy.

Who Should Participate?

Results for All and AFIDEP invite public officials and policymakers to form a team of three or four individuals who are working together to implement a specific policy, in any sector, and who want to learn how and when to use evidence to overcome policy implementation challenges. A team must include members from at least two government ministries / departments / agencies, and be approved by senior leadership via the signature at the end of this application. Teams from seven to eight countries will be selected for participation in the workshop.

Teams are encouraged to include:

  • A program manager or director in a ministry / department / agency, who oversees the implementation of the policy in question.
  • A public official or practitioner at the national or subnational level, who has a role in operationalizing the policy, or collecting operational data and evidence.
  • An analyst, manager, or director from a national finance or planning ministry / department, who has a coordinating role in managing or evaluating policy.
  • A technical expert from a research or evaluation unit or statistical office, who has a role in producing or sourcing evidence to inform policy options or implementation strategies.

Teams will be expected to develop a PowerPoint presentation outlining a policy implementation challenge in advance of the workshop, and to engage in follow-up activities to put their roadmaps into practice.

Teams from Brazil, Colombia, Ghana, India, Indonesia, Kenya, Malawi, Mexico, Nigeria, Philippines, Rwanda, South Africa, Tanzania, Uganda, and Zambia are especially encouraged to apply.

Download the application and apply here.

Evidence-Informed Policymaking Reading List | April 2018 — April 2, 2018

What to Read this Month

Advancing Evidence-Informed Policymaking: What’s Culture Got to Do With It? | Abeba Taddese, Results for All
“It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work.”
We often hear from policymakers and partners about the importance of building or shifting the culture of evidence use, but what does that mean? Our latest blog explores the notion of an evidence culture and how to support it.


Ian Goldman is Acting Deputy Director General and Head of Evaluation and Research in South Africa’s Department of Planning, Monitoring and Evaluation (DPME). His presentation, made at the Africa Evidence Forum in Nairobi last month, shares challenges and lessons for evidence use in the policy process. See his plea to researchers on the last slide!


“Because knowledge mobilisation means different things to different people it can also be difficult for new knowledge mobilisers to identify and clarify their role and communicate this effectively. This increases the risk of misunderstandings and misalignment between knowledge mobilisers and those they are working with.”
This article offers a practical framework based on a series of questions that can help knowledge mobilizers better understand their role: Why is knowledge being mobilized? Whose knowledge is being mobilized? What type of knowledge is being mobilized? How is knowledge being mobilized?


“Often, innovation starts in the field, and practice gradually influences policy over time. That means it is as important for research uptake to get communities of practice to engage with research.”
Learnings from the Secure Livelihoods Research Consortium’s work in DRC emphasize the need to focus not only on decision-makers in the research uptake equation, but also to consider the role of researchers. The Consortium has learned that to improve research uptake, it is important to strengthen institutional research capabilities as well as individual research skills, and to emphasize the value of research early, for example in higher education curricula, while also building relationships with policymakers.


“Rather than making large-scale decisions in the absence of fully conclusive or satisfactory evidence, experimentation can help to “de-risk” the decision-making process at this step of the policy cycle.”
This article encourages governments to consider small-scale experiments as a means to test new approaches and move incrementally toward larger evidence-informed policies and programs. It highlights what the government of Canada is doing in this regard, and also includes a nice description of how experimentation could be incorporated in each step of the policy cycle.


“Data and evidence can overcome such myths, but try and use government data – policy makers are more comfortable when you use data they know and trust.”
The Private Secretary to former President Kikwete in Tanzania shares insights on who to target, when to act, and how to communicate the message to improve your policy advocacy.


What We’re Working On


We’re putting the finishing touches on our network mapping report, in which we assess the demand for a global evidence network, and review 50+ existing networks for peer learning and evidence use in government. We also synthesize 13 lessons on network organization, engagement, and measurement. Thank you to everyone who shared insights and feedback! We look forward to publishing the report in the next few weeks.
Advancing evidence-informed policymaking: What’s culture got to do with it? — March 29, 2018

Over the last few months my team at Results for All has been engaged in consultations to assess the demand for a new global evidence network that could bring government policymakers together to exchange innovative ideas and learn from each other to advance evidence use in policymaking.

We have spoken to policymakers in government, members of the research and academic community, as well as non-governmental partners and initiatives in countries including Colombia, Chile, Finland, Nigeria, South Africa, Tanzania, Uganda, and Zimbabwe, among many others. In every conversation, we heard about the importance of building or shifting the culture of evidence use. While we expect and assume that organizational culture will be different in varied contexts, we observed an interesting tendency in the policymaking community to speak about culture and evidence use in a way that suggested some universality across policy areas and levels of government. We noted further that in the context of evidence use, culture was often spoken of in broad and vague terms, such as “the culture is not developed enough,” “there is no culture of producing data,” or “mid-level technocrats have a lot of influence, and the ability to shift government culture.”

We are curious about the notion of an evidence use culture in government, and believe it is essential to better understand this culture so we can identify strategies to help strengthen evidence use in government. 

What is culture?

The challenge in understanding what a culture of evidence use in government looks like begins with the definition of culture itself, a term with many meanings. The first of Merriam-Webster’s six definitions for culture describes it as a set of attitudes, values, goals, and practices shared across an institution or organization. Matsumoto et al. suggest that while attitudes, values, and goals can be shared by a group, they can also be differentiated at an individual level.

This practical guide on changing culture developed by Bloomberg Philanthropies’ What Works Cities initiative offers a definition of culture that gets at norms: “culture is the difference between what you tolerate and don’t tolerate.” According to the guide, culture embodies interactions between the different elements of a system such as people, beliefs, values, and attitudes. It is both causal and dependent on an organization’s knowledge, processes, and systems. It is not a singular thing – an individual or organization can be defined by multiple cultures. And it is both learned and a legacy that can be shaped over time. These conflicting and dynamic elements are what make culture hard to define.

Levels of culture

To understand culture as it relates to evidence use in government, it is helpful to explore the different levels at which culture presents itself in an organization. These include artifacts, values, and assumptions, captured in a helpful visual here.

The visible and tangible elements of an organization are its artifacts. They are what you see when you walk into an office – desks, chairs, computers, plants, and filing systems. Reports, briefs, databases, and knowledge management systems are also types of artifacts. Artifacts can give a sense of office culture – we might, for example, assume that a brightly colored office with an open floor plan has a creative mission, and sense entrenched bureaucracy in a dark, traditionally furnished office. Or we might expect an office with the technology for collecting and storing data to routinely use evidence to inform policy and programs.

Yet these visual cues about an office’s culture may be misleading if we do not understand the organization’s values and the underlying assumptions that drive the daily work of its leaders and employees. For example, a government office may have the relevant evidence artifacts such as a knowledge management system or evaluations, but lack shared values to guide and encourage evidence use in decision making. But even when there are tangible artifacts, and a government office publicly articulates the value of using evidence in policymaking, if the underlying assumption is that using evidence is too costly or time consuming, the office is unlikely to translate its artifacts and values to systematic use of evidence in policy decisions. The challenge is that it can be hard to uncover hidden assumptions – feelings, perceptions, thoughts, or beliefs – that shape an organization’s visible artifacts and values. Artifacts and values can also be disconnected and even contradictory, most noticeably in government when financial commitments needed to support desired policies or policymaker behavior do not line up with a government’s stated values.

In the context of evidence-informed policymaking, it is important to build artifacts – the systems and processes governments need to ensure evidence is appropriately sourced and used to inform strategic thinking, policy development, implementation of policy options, and monitoring and evaluation. It is also critical to build and instill a shared and publicly expressed value in using evidence. But to influence behavior change and shift attitudes about evidence use, it is imperative that we consider the basic assumptions that guide how work is done and decisions are made. When what we say (reflecting values) does not align with how we behave (building and using artifacts), it is a sign that we need to dig deeper to understand the assumptions that govern our behavior.

What should governments do to strengthen underlying assumptions and shift the culture toward evidence use?

    1. Take time to know the office – For many government offices, a conversation to understand barriers and challenges that inhibit evidence use, and clarify performance expectations and intended outcomes of policies, is a good starting point for those who would like to see greater use of evidence in policymaking. Build the communications skills to hold these conversations. A needs assessment can help to diagnose the gaps in knowledge, awareness, and capacity that can influence assumptions around what it takes to find, understand, and use evidence.
    2. Invest in leaders and champions – Strong role models who demonstrate the importance of using evidence through their actions can inspire others and help to change behavior patterns. Highlighting respected leaders who support innovation, change, and learning can positively influence other public officials’ assumptions and attitudes toward evidence use.
    3. Build knowledge and awareness – Policymakers who are confident in their ability to find, appraise, and synthesize evidence, and who understand the complexities of the policymaking process, are more likely to use evidence in their decision making process. Training courses or events such as dedicated research weeks can raise awareness about the value of using evidence and change assumptions that using evidence is too intimidating or complex.
    4. Create a compelling narrative – Ruth Levine gets at a moral argument for evidence-informed policymaking here and here. Moving from a compliance and monitoring mindset to a compelling narrative that points to failed outcomes for citizens when we do not use evidence can be a way to shift attitudes and behavior toward evidence use. Make responsible allocation and spending of limited government resources about doing right by citizens – achieving healthier populations, delivering quality education for all, accelerating financial empowerment for women.
    5. Promote networks and relationships – Whether formal or informal, peer interactions can help policymakers strengthen technical skills and shift attitudes and assumptions by exposing them to new ideas. As an organization, this could mean giving staff the time and space to connect with each other to share information, lessons, and experiences.
    6. Recognize and reward desired behavior – Different strategies can be used to motivate policymakers to use evidence in decision making, ranging from financial performance incentives to less resource-intensive award and recognition programs. Governments can use these strategies to promote and reward desired behavior, nudging policymakers to shift their assumptions and actions to align with organizational values.

It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work. These assumptions determine whether values become empty statements and artifacts gather dust or, ultimately, whether evidence use becomes a cultural norm.

Abeba Taddese is the Executive Director of Results for All, a global initiative of Results for America.

Evidence-Informed Policymaking Reading List | March 2018 — March 5, 2018

What to Read this Month

“Although the awareness and interest in evidence-informed policymaking has gained momentum in Nigeria, meeting points, such as policymakers’ engagement events to consider issues around the research-policy interface related to MNCH, are essentially lacking.”
This article summarizes the barriers to and facilitators of evidence-informed health policymaking in Nigeria and highlights a need to strengthen capacity at the individual and organizational levels (skills, systems, and processes) to facilitate the evidence to policy process.
“However, while there is a problem when civil servants don’t understand statistics or how to weigh up which evidence sources are reliable, fixing these problems won’t help improve policy if there is no political space to bring evidence into decision making, and no incentives for senior decision makers to care about evidence.”
The final, independent evaluation of the DFID Building Capacity to Use Research Evidence (BCURE) program found that partners were more successful when they went beyond building individual skills and capacities, and engaged with politics and incentives to create changes in the organization and wider environment.
“Many governments collect vast amounts of data, but need support to organize it in a usable format and identify use cases to improve program management.”
Through its government partnerships in Brazil, Chile, Colombia, and Peru, J-PAL has found that in addition to great leadership, the following strategies help support evidence-informed policymaking in government: making it someone’s job to use evidence, helping governments make better use of the data they already collect, investing in long-term partnerships, and taking on quick wins that can build trust and demand for evidence.
“Policy narratives are ‘the communication vehicles that are used for conveying and organizing policy information’, and the development of policy can be understood as a ‘narrative-making’ process.”
This article describes how education policymakers in Australia commonly referenced policy narratives when describing evidence use in their policy development process, and how these narratives in turn helped the research team better make sense of the different ways in which policymakers were using evidence. The article suggests that examining how evidence informs construction, testing, and communication of policy stories or narratives may be a promising and less abstract approach for understanding evidence use in policy development given how prominently narratives and stories feature in policymaking.
“There is an increasing need to develop faster and more reliable learning processes to solve problems and, at the same time, strengthen the trust between citizen and policy institutions. Policy experiments are a great way to do this.”
In this interview, Mikko Annala, Head of Governance Innovation at Demos Helsinki, a Finnish independent think tank, shares three elements that are key to building a culture of policy experimentation: incentives that mandate and support evidence production, experimentation, and failure; committed leadership; and a focus on getting results.
And the results are in! Last month, we used this A/B testing tool from ideas42 to test two different subject lines for this monthly email to see which one got a higher open rate (the percentage of people receiving the email who actually opened it).
  • Version A: Evidence-Informed Policymaking Reading List | February 2018
  • Version B: Which program works best? How do evidence maps inform policy? Is results-based aid more effective?
The Version A and B email blasts had open rates of 25.2% and 20.1%, respectively. The good news is that both rates are higher than the industry average for nongovernmental organizations! However, although Version A looks more effective than Version B, with a p-value of 0.156, the results were not significant at the 10% level. To yield a statistically significant result (to be more confident about which subject line attracts more interest) we could try this test again with a larger sample size. That means we need to send our reading list to more people! If you know anyone who would like to receive this monthly email on evidence-informed policymaking news and research, please forward this message and encourage them to sign up here:
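The comparison above is a standard two-proportion z-test. A minimal sketch of how such a test works is below; the recipient counts are hypothetical (the actual list size is not reported here), so the computed p-value illustrates the method rather than reproducing the exact 0.156 result.

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions (A/B test)."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pool both groups to estimate the common proportion under the null
    # hypothesis that the two versions perform the same
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical example: 250 recipients per version, with open rates close to
# the 25.2% and 20.1% reported above (counts are illustrative only)
z, p = two_proportion_z_test(63, 250, 50, 250)
print(f"z = {z:.2f}, p = {p:.3f}")  # not significant at the 10% level here
```

With these sample sizes the difference is not statistically significant, which is why a larger list would be needed to distinguish the two subject lines with confidence.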


Thanks for reading!
Evidence-Informed Policymaking Reading List | February 2018 — February 7, 2018

What to Read this Month

Enhancing Evidence-Informed Decision Making: Strategies for Engagement Between Public Health Faculty and Policymakers in Kenya | Nasreen Jessani et al, Evidence & Policy
“It would behove policymakers in the Kenyan government to lead the change with respect to outreach and inclusion of academic researchers in policy deliberation.”
This article explores the interactions between academic knowledge brokers and health policymakers, and concludes that a combination of personal relationships and institutional partnerships are needed to facilitate engagement.

Is Results-Based Aid More Effective than Conventional Aid? Evidence from the Health Sector in El Salvador | Pedro Bernal et al, Inter-American Development Bank
“Using a difference-in-difference approach and national health systems data we find that preventive health services increased by 19.8% in conventional aid municipalities and by 42% in RBA [results-based aid] municipalities compared to national funds, suggesting that the results-based conditionality roughly doubled aid effectiveness.”
This study is one of the first to measure the effects of results-based financing on the quality of public services provided by government municipalities.

Policy Relevant Evidence Maps: A Method to Inform Decision Making in the Public Sector (Webinar) | Carin van Zyl & Laurenz Langer, GESI
“DPME’s policy-relevant evidence maps follow a rigorous and transparent research process.”
This webinar outlines the steps used in South Africa’s Department of Planning, Monitoring, and Evaluation (DPME) to construct an evidence map to inform decision making in the area of human settlements, and how the process can be adapted elsewhere.

The What Works Network Five Years On | UK Government
“We have hugely talented public sector leaders, but we can still do more to make the best evidence available to them, and to ensure that the time and money invested in our public services are used to the best possible effect.”
In the last five years, the ten What Works Centers in the UK have produced or commissioned 288 evidence reviews used to improve public services; other activities highlighted in this report include publicizing evidence gaps, creating evidence comparison toolkits, and conducting evidence use audits of government departments.

Increasing the Use of Data and Evidence in Real-World Policy: Stories from J-PAL’s Government Partnership Initiative | Samantha Carter & Claire Walsh, J-PAL
“When governments decide to use data and evidence to improve policy, the results can be powerful.”
During its first two years, J-PAL’s Government Partnership Initiative (GPI) has supported 28 partnerships in 15 countries, helping to scale up effective programs and improve systems for data and evidence use.

Unsure which version of a program works best? Check out this A/B testing tool from ideas42
“A/B testing works by using an experimental procedure that provides different versions of parts of a program – such as a letter, a web site, or step in the process – to people at random. Statistical analysis can confirm which version is working better and by how much.”
Businesses constantly experiment with A/B testing to refine products and services and how they are marketed, but increasingly, governments are using the strategy to identify which of two service options citizens prefer, and how to communicate messages and create behavior change. This tool can help you prepare two versions to test and what to measure, randomly assign the versions between two groups, and analyze the results to determine which version works best.

Evidence-Informed Policymaking Reading List | January 2018 — January 8, 2018

What to Read this Month


“Delegating brokerage to specially designated individuals makes mobilization of knowledge into action highly contingent on their individual preferences, connections and skills.” 
Knowledge brokering is a multidimensional process that should be considered a core function of an organization and carried out by multiprofessional teams, including academics, policymakers, users, managers, and practitioners, rather than expert individuals or intermediary organizations.


“Better policy requires being both honest about our goals and clear-eyed about the evidence.”
To be evidence-based, health policies must 1) be well-specified, 2) be clear about which goals they are meant to achieve, and 3) demonstrate empirical evidence of the magnitude of their effects in furthering those goals.


“Overall, leaders use data or analysis more to conduct retrospective assessments of past performance than inform future policy and programs.”
In a 2017 survey, 3,500 public officials and development practitioners from 126 low- and middle-income countries tended to favor data that 1) reflected the local context, 2) drew upon data or analysis produced by the government, and 3) went beyond diagnosing problems to provide concrete policy recommendations.


“It is difficult to justify the resources, risks, and opportunity costs of PFS initiatives when an intervention has no evidence base or when existing evidence raises red flags about the impact of a program.”
The article stresses that Pay-for-Success – in which a government repays independent investors if the initiative they financed achieves pre-determined, socially desirable outcomes – will only be effective if used to finance interventions that meet 7 criteria, including a strong evidence base, cost savings for the public sector, clearly defined metrics, and a reasonable time frame.


“The evaluation findings reinforce the wider understanding that the demand for evidence varies substantially depending on individual policymaker attitudes, perceptions about the usefulness of evaluation evidence and credibility of the evaluator, awareness of evaluation benefits, technical skill in evaluation methods and the nature of the political system.”
Lessons learned from the Demand-Driven Evaluations for Decisions (3DE) pilot program in Uganda and Zambia note the program’s limited contribution to evidence-based policymaking capacity and behavior in both countries, and highlight the importance of strengthening capacity in evidence-based decision making within government and of considering the wider political economy in program design.


“Of course, every civil servant need not be a data scientist – but they should appreciate its potential to improve the lives of citizens, help services function more efficiently, and cut costs.”
A helpful compilation of examples and resources to learn more about data literacy, artificial intelligence, design thinking, and behavioral insights – and use them in government.


What We’re Working On


Results for All continues to work with government policymakers and partners to explore the role that a global evidence network could play in helping champion evidence use across public policies and sectors. We have completed our first phase of research and interviewed nearly 50 stakeholders to date – ranging from government policymakers, to NGOs leading evaluation activities, to existing networks where there may be opportunities to collaborate – and will summarize and share what we have learned in the coming months.
Evidence-Informed Policymaking Reading List | December 2017 — December 5, 2017

What to Read this Month


“Estonia is the most advanced country with regard to seamless data exchange. Its State Information Agency has mapped all data owned by the national government and provides a standardized technical environment, called the X-Road platform, for secure information sharing with all users in the public and private sectors.”
Offering public services via digital platforms can help governments increase productivity and decrease spending; doing so requires a government-wide digital strategy, an IT platform shared across government departments, and rules governing the use of data accessible through the platform.


“While most purveyors are working to ensure their EBPs are effective and replicable, most are not working to expand their reach.”
This research is focused on purveyors – organizations that take on the job of spreading evidence-based programs. It identifies a lack of resources, expertise, and incentives as key barriers to the spread of evidence-based programs. Successful expansion of programs is often due to external forces, such as foundation and government investments, public systems change, and field building, that help to create demand for services.


“Despite several decades of work on evidence informed policy, the goals to improve evidence uptake and promote greater use of evidence within policy making are still elusive.”  
This paper identifies organizations, systems and infrastructure, access to and availability of evidence, and the interactions between researchers and policymakers as key determinants of evidence use. It recommends strengthening networks and relationships to more optimally inform health policy.

“Policymakers showed great sensitivity to the approach of individuals who present research results. National or regional presenters were generally preferred over international presenters; several interviewees pointed to the importance of peer learning and the influence of regional ‘champions’ for certain issues.”
An INASP investigation found that policymakers in eastern and southern Africa prefer when HIV prevention evidence is presented in clear and brief memos or PowerPoint presentations, provided alongside a series of face-to-face interactions throughout the research process.


“So many important decisions and policies around the world are based on instinct, inertia, ideology or ignorance (four Is) rather than data or rigorous evidence.”
The second part of a three-part series describing the strong partnerships J-PAL has built with governments and policymakers in India, to advance the use of evidence in policymaking.


“The Lab’s mission is to embed the scientific method into the heart of city operations to provide decision-makers with high-quality evidence that they can use to achieve better outcomes for D.C. residents.”
A quick read on how policy labs embedded within government, like the Lab in Washington D.C., can conduct low-cost interventions that help empower policymakers to use data and evidence to improve outcomes for residents.


“We call this ‘zero-credit’ politics: a policy problem can persist because politicians are unable to claim credit from working to solve it.”
The public trusts doctors more than it trusts politicians, who as a result don’t ask too many questions and focus instead on preserving their reputations. Partisan competition in the U.S. is another barrier to evidence-based medicine.
“The obsession with the phrase: ‘bridging research and policy’ can be misleading and distracting. There are other relationships that should be strengthened if policy is ever going to be better informed.”
Lessons from the 2017 Latin American Evidence Week include:

  • Overly focusing on new innovations can disincentivize evaluating and tweaking older programs that may in fact work better;
  • Policymakers need large-scale impact evaluations of which interventions work less than they need other forms of research on why interventions work or how to improve them;
  • A broader definition of evidence that includes citizen input can empower vulnerable populations; and
  • Formal mechanisms and expectations for dialogue are key: between sectors, between evidence users and producers, and between program designers and implementers.


“But Oxfam’s experience shows that it’s wrong to think that emotion and evidence are opposing choices.”
Oxfam researchers are using facts that stir up emotions, combining personal stories with policy recommendations, and experimenting with other approaches that use evidence to influence attitudes and policies.


What We’re Working On


Results for All is currently working with government policymakers and partners to explore the role that a global evidence network or platform could play in helping government policymakers to address the challenges they face in advancing evidence use across policies and sectors. Reply to this email if you would like to talk to us about your ideas or how to get involved.
Welcome to the first edition of our reading list on evidence-informed policymaking! — November 6, 2017

Welcome to the first edition of our reading list on evidence-informed policymaking!

On the first Monday of every month, we’ll be posting a small selection from our favorite readings, each with a quote and short summary. We’ve noticed a growing body of work on the use of evidence in policy and practice, and hope you will find this compilation a useful way to follow the research, discussions, and insights.
Happy reading!

What to Read this Month


“One of the key lessons is that we need to better understand politics and incentive structures in government organisations when promoting evidence uptake.”
A recent DFID program distinguishes three types of evidence use by policymakers: Transparent Use, Embedded Use, and Instrumental Use. See the table for definitions and examples.


“When we asked survey respondents what barriers they faced in relying on evidence to make decisions, they answered that the most serious barriers were a lack of training in both data analysis and in how to evaluate and apply research findings.”
Civil servants in Pakistan and India also reported that they did not have enough time or incentives to consult evidence before making decisions.


“As well as the time, money and conservation opportunities wasted on ineffective projects, we must consider the possibility that conservation as a whole will be seen as unjustifiable if money is regularly spent poorly because of a lack of evidence use.”
Factors contributing to evidence complacency include a lack of time or training needed to consult scientific evidence, a feeling that relying on evidence reduces professional autonomy to make decisions, and a view that people are more accessible sources of information compared to scientific resources.


“At the same time, because the issues of adolescents are multiple – including education, economic, health and security, to name a few – a cross-sector approach is needed in conducting research and gathering data.”
India is home to 253 million adolescents, or 21% of the global adolescent population. Addressing the needs of such a large population requires substantial data and evidence, and a sustained commitment to them, across a variety of social sectors.


“Policy making is not a series of decision nodes into which evidence, however robust, can be ‘fed,’ but the messy unfolding of collective action, achieved mostly through dialogue, argument, influence, conflict and retrospectively made sense of through the telling of stories…”
A good reminder of the complexity, messiness and unpredictability that characterizes the policymaking process.


“The data that a policy index reveals are always in the past, but the impact of a policy index is in the conversations that it informs and the issues that it helps to advance in the future.”
The National Arts Index (NAI) developed by the nonprofit organization Americans for the Arts was the first of its kind and an inspiration for other index projects in the arts internationally. Lessons learned highlight the need for a targeted communications strategy to build awareness about the value of a policy index and the large amounts of time and resources required to build and maintain one.


Looking for even more? See Results for America’s 2017 #WhatWorks Reading List, featuring numerous articles on what US cities, states, and federal agencies are doing to use more evidence to get better results.


What We’re Working On


Results for All is currently working with government policymakers and partners to explore the role that a global evidence network or platform could play in helping government policymakers to address the challenges they face in advancing evidence use across policies and sectors. Reply to this email if you would like to talk to us about your ideas or how to get involved.


For the latest updates, you can follow us on Twitter here. And don’t miss Results for America’s recently released 2017 Federal Invest in What Works Index, which highlights the extent to which 8 US federal departments and agencies have built the infrastructure necessary to use evidence when making budget, policy, and management decisions.



We’ve noticed a growing body of work on the use of evidence in policy and practice. To help you follow the research, discussions, and insights, we’ll send you a small selection of our favorite readings on the first Monday of every month. The reading list will also include our own new blog posts, and highlight what we’re working on. In addition to following us on Twitter, signing up for the reading list is the best way to follow what we’re doing.

Sign up for our Monthly Reading List today here!

How do governments use evidence for policy? 100+ mechanisms and a short survey — October 18, 2017

How do governments use evidence for policy? 100+ mechanisms and a short survey

Results for All is currently assessing whether a global evidence network that facilitates collaboration and an exchange of experiences between policymakers could help to advance and institutionalize the use of evidence in government. We invite you to participate by taking our short survey, here.

The survey will take less than 10 minutes, and will close on October 31.

If you are not a government policymaker, you can still click on the link above and provide input in the space provided. Additionally, we encourage you to contact us at any time to learn more about our work.

We appreciate your support in forwarding the survey to other government policymakers who can help us to assess the demand for a global evidence network.

The Global Landscape Review is here! — July 25, 2017

The Global Landscape Review is here!

Results for All’s just-released “100+ Government Mechanisms to Advance the Use of Data and Evidence in Policymaking: A Landscape Review” and case studies on Ghana, Kenya, and Canada can be downloaded here.

By Abeba Taddese, Executive Director, Results for All

For the last 18 months, Results for America’s global Results for All initiative has been engaged in a landscape review to understand the different approaches governments are taking to create formal strategies and mechanisms – specifically, policies, programs, platforms, systems and operational practices – to advance and institutionalize the use of data and evidence in decision making.

We’ve had a fulfilling year of learning from government leaders, experts and citizens around the world, and we are eager to share some of our insights here:

  • The last 5 to 7 years have been a busy time for governments. Outside of the long-established evaluation systems in countries like Mexico, Colombia and Chile, we observe that many of the formal structures governments are putting in place to support evidence-informed policymaking (EIP) are quite recent. Separately, we note a growing body of literature on evidence-informed policymaking, notably exploring constraints or barriers to EIP, and factors that enable data- and evidence-driven decision making.                                                                                                                           
  • While institutional strategies and mechanisms are necessary and often a precondition for routine and consistent use of data and evidence in policy and programs, they aren’t enough on their own. There is widespread agreement among policymakers and evidence producers alike that policymaking is complex, multi-dimensional, and influenced by many factors. It is far from a linear “evidence in, policy out” process. Contextual factors ranging from leadership, commitment, and allocation of resources to political climate, values, and belief systems are critical influences in any policy process.
  • Governments are taking different, context-specific approaches to creating formal strategies and mechanisms. And they are sharing information about their processes and learning from each other. The study tours to Mexico, Colombia and the United States that helped to inform South Africa’s monitoring and evaluation system, the data-driven community safety approach in Saskatchewan, Canada (Hub) adapted from Scotland’s Violence Reduction Unit model, and the collaboration between the Office of the Prime Minister in Uganda and Malaysia’s Performance Management and Delivery Unit (PEMANDU) are a few examples that stand out.                                                                                                                                        
  • Ultimately, EIP isn’t about a specific approach or type of evidence, but rather about finding context-appropriate ways to make better use of data and evidence in real-life policy and program decisions. This last point is worth underscoring, in the spirit of ensuring that we don’t end up with a jargon-laden theoretical field that distracts the EIP community – whether government actors, nongovernmental organization (NGO) partners, or the philanthropic community – from the end goal of achieving better outcomes for populations.
  • There appears to be an emphasis in government on creating structures and systems to improve access to data and evidence, while NGOs are playing a more central role in facilitating partnerships between policymakers and evidence producers, as well as in building the individual capacity of policymakers. Our review is not exhaustive or definitive, so we can’t say for certain why this might be the case. But we surmise that there may be a “build it and they will come” approach to the use of data and evidence in government policymaking, and that governments may prioritize spending finite resources on tangible infrastructure. For NGOs, partnership building and training activities often offer less bureaucratic and politicized entry points for supporting government efforts to advance EIP.

The landscape review is accompanied by resource tables and a series of case studies on evidence-informed policymaking training in Ghana, demographic dividend policies in Kenya, and a community safety strategy in Canada. Our goal for this body of work, which identifies more than 100 strategies and mechanisms for advancing the use of data and evidence in government policy and practice, is to promote the sharing of experiences and lessons learned among leaders in government, NGOs, and other partners.

We’ll be building on this work in the months ahead, and close with a few questions we hope to explore further:

  • How effective are government strategies and mechanisms in promoting the use of data and evidence? Are there approaches that are more effective than others in improving the use of evidence, and that ultimately have the greatest impact in achieving development objectives?                                                                        
  • How can governments be best supported in their efforts to institutionalize the use of data and evidence? Could structured joint learning and networking approaches help to accelerate the adoption of strategies and mechanisms for advancing the use of data and evidence?

We are grateful to the experts interviewed for this review, who generously contributed their time and input (you can find many of them listed in Appendix 2 of the report), and to the William and Flora Hewlett Foundation for its generous support of this work.

We encourage you to continue visiting the Evidence in Action blog for updates. If you have questions or would like more information, please contact me at Abeba@results4all. And please share your feedback with us by tweeting at @resultsforall with the hashtag #GlobalLandscapeReview.

Thanks for reading!