Evidence in Action

The Results for All Blog

Evidence-Informed Policymaking Reading List | July 2018 — July 3, 2018


Our Network Mapping Report is Here!
 
Peer learning networks are powerful platforms for bringing countries together to exchange information, learn, and collaborate. Results for All conducted research to assess whether there is demand in government for a peer learning network that could help policymakers share experiences and amplify solutions to advance the use of evidence in policy, to achieve better outcomes for citizens. To inform a network strategy and avoid potential overlap with existing initiatives, we engaged in a comprehensive process of identifying, researching, and mapping active networks with similar missions. The resulting report identifies and classifies 50+ networks for peer learning and evidence use in government; describes 8 modes of engaging network members; and synthesizes research and key informant interviews into 13 lessons on network organization, engagement, and measurement. It then matches select networks against 5 criteria and concludes that a new network premised on these criteria could support evidence-informed policymaking and add value to current initiatives.

We hope this report will be useful for a variety of actors seeking to support evidence-informed policymaking and to identify opportunities to enhance collaboration and fill gaps in this important field.

Results for All Network Mapping Report


What to Read this Month

“An accurate, concise and unbiased synthesis of the available evidence is arguably one of the most valuable contributions a research community can offer decision-makers.”
The article identifies four principles that can help make it easier for evidence producers and users to commission, appraise, share, and use evidence in policy. These four principles – inclusive, rigorous, transparent, and accessible – should apply to every evidence synthesis, which, if done well, becomes a global public good.

“The innovative contribution of the current study is the development and validation of a set of measurable indicators specifically devoted to assess and support EIPM in the field of public health, intended to be jointly used by governmental policy-makers and researchers, but also by other stakeholders involved in various stages of the policy-making cycle.”
The article describes a Delphi study that led to the development of indicators that can be used to assess the extent to which policies are informed by evidence. The indicators cover issues related to the skill and experience of staff working on policies, documentation of evidence in policy documents, communication and participation with key stakeholders, and procedures for monitoring and evaluating evidence use in policy. The indicators could also help to encourage establishment of routine processes for advancing evidence use in policy.

“There’s a limited evidence base about knowledge brokers, but preliminary findings suggest that they do have the potential to improve the uptake of evidence.”
Insights from the Wales Center for Public Policy on the role it plays as a knowledge broker in evidence-informed policymaking – helping to build an understanding of evidence needs and questions, improve access to evidence, promote interaction between evidence users and producers, and strengthen capacity to engage with research. The Center is refining its theory of change to better understand approaches that work, and to take a more systematic approach to facilitating evidence use in policymaking.

“Just as a journalist is trained to tell a compelling story so that an audience’s attention is captured and held so that facts of a story can be relayed to a reader or viewer, so too do scientists or policy experts need to capture attention and communicate both the importance and complexity of issues to their audiences.”
A useful article describing how storytelling influences the policy process and offering key steps to help policy actors build a better narrative.

“Guerrero, a Harvard-educated surgeon-turned-epidemiologist, understood violence as an epidemic transmitted from person to person. As with any epidemic, he tried to map the outbreak and understand its transmission. Data came first.”
A great story about data-driven policing and violence prevention in Colombian cities.

What We’re Working On

Later this month, we’re convening teams of government policymakers from nine countries to share experiences, challenges, and lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” will take place from July 23-25 in Nairobi, Kenya, in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight. We’ll share profiles of the participating teams and the policies they are working on in the coming weeks on Twitter (@resultsforall), so be sure to follow us there!
Evidence-Informed Policymaking Reading List | June 2018 — June 1, 2018


What to Read this Month

 

“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway from the paper is that the more community members were involved in generating and presenting evidence, the greater the likelihood that it would be used to address service delivery challenges.

 

“Our argument is that to improve policy execution we must go one step further and consider how policies can be more effectively designed by connecting actors vertically and horizontally in a process of collaboration and joint deliberation.”
This article describes how collaboration in policy design can facilitate adaptive implementation of policy solutions. For example, collaboration with front-line staff can help government leaders better understand challenges faced at the service delivery level and propose context-specific solutions, while collaboration with citizens can generate constructive feedback that stimulates learning and incremental adjustments to a policy solution. The article also discusses how open and democratic political contexts enable collaborative policymaking.

 

“Neither country policy makers nor the global development community are best served by a global flood of health estimates derived from complex models as investments in country data collection, analytical capacity, and use are lagging.”
A brief commentary illustrating how the development community’s emphasis on global health predictions and estimates can contribute to a false level of certainty about health status and trends, and importantly, detract from needed investments in improving data collection and analytical capacity in countries.

 

“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.

 

“Within a year of the research getting published, and publicised by media and labour unions, the Maharashtra government raised the minimum wage to INR 12. This benefited 6 million labourers in the state – many, many times more than what we could have hoped to achieve if we had not adopted a research-based approach to the problem.”
Reflecting on his experiences as a medical physician and researcher, Dr. Bang discusses the importance of research for development, and the difference between research on the people and research for and with the people.

 

“Even the simplest intervention is context dependent in countless, subtle ways – it’s impossible to say with certainty how it will fare in another place. However, as presented here, there’s a four-step framework that can disqualify many mistakes before they happen, and improve the odds of replications you pursue.”
Useful insights, examples, and a framework to determine when an intervention can be replicated in a new context: assess the evidence, understand what conditions made the intervention work in the original context, ensure that the new context has those same essential conditions, and adapt the intervention as necessary.

 

African Research Organizations: Submit your Proposals by June 15

 

The Hewlett Foundation call for proposals aims to advance government use of evidence by supporting East and West African policy research organizations. They can apply alone or lead a project with 1-2 partner organizations. See more here.

 

What We’re Working On

 

We’re busy organizing “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers,” which we are hosting in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight from July 23-25 in Nairobi, Kenya.

 

The workshop will provide a forum for government teams from 9 countries to learn about and share experiences, challenges, good practices, and key lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. We selected 10 great teams from over 55 applications. A big thank you to everyone who applied and helped to spread the word! More info coming soon.

 

Evidence-Informed Policymaking Reading List | May 2018 —


New Opportunity for African Research Organizations

On May 1, the Hewlett Foundation launched a call for proposals for African Policy Research Institutions to Advance Government Use of Evidence. The call is meant to support policy research organizations in East and West Africa in their efforts to advance evidence-informed policymaking, with a focus on strengthening policymakers’ capacity, motivation, processes, and relationships to enhance evidence use. African policy research organizations can apply alone or lead a project with 1-2 partner organizations. Submit clarifying questions about the call for proposals to EIPAfrica@hewlett.org by May 8. The deadline for first-round proposals is June 15. Learn more at https://www.hewlett.org/eipafrica/.

How can you ensure that evidence used to inform policy is appropriate, credible, and transparent? What types of evidence are appropriate for each stage of the policy cycle, including design and implementation? New Zealand’s Social Policy Evaluation and Research Unit, Superu, answers these questions and more in their latest guide.

“We should be more ambitious than just making ourselves (and the experts we work with) useful to the various groups we hope might use the research and policy analysis our institutions produce.”
It’s important for researchers to package and communicate information in a way that is accessible to policymakers, but it’s even more critical to invest in longer-term partnerships with policymakers to truly influence social change.

“Both making evidence accessible and facilitating processes for deliberating evidence were essential in supporting evidence users to understand the extent and usefulness of evidence and identify implications for policy, practice and services.”
This paper describes a process developed by the Centre for Research on Families and Relationships in the UK, to facilitate the use of evidence in practice. The process helped practitioners identify gaps in knowledge; facilitated an evidence review to refine the research question, find, synthesize, and report on evidence using tools and templates; and concluded with a discussion to plan for the use of evidence.

“Governments have used the phrase ‘let a thousand flowers bloom’ to symbolise a desire to entertain many models of policy-making.”
Evidence-based decision making can take different approaches, ranging from implementation science driven by RCTs, to storytelling that draws on practitioner and user feedback, to improvement methods that draw from a combination of experience and operational evidence. The challenge is knowing what type of evidence to use given the complexity and politics of the policymaking process.

“The establishment of Science and Technology (S&T) fellowship programs in other states could greatly increase evidence-based policy-making and not only benefit state policy-makers but also help to inform national policy-making and society as a whole.”
The editorial reflects on the success of California’s S&T Policy Fellows Program, which selects and trains individuals with a science or technology background to serve as full-time legislative staff for a year. The Fellows are in high demand, and as a result of the program the use of evidence in state policymaking has improved dramatically.

What We’re Working On

We’re partnering with the African Institute for Development Policy (AFIDEP) to host a peer learning workshop on Using Evidence for Policy Implementation from July 23-25 in Nairobi, Kenya. Teams of 3-4 policymakers from governments around the world will be fully-funded to participate in the workshop. Participants will work in country teams to discuss and diagnose the root causes of policy implementation challenges, create solution-based roadmaps, and exchange experiences and lessons on evidence use in policy implementation across different contexts.

There is only 1 week left to apply! Please find more details and application instructions here.

 

Open Call for Applications: Peer Learning Workshop on Policy Implementation — April 24, 2018



Applications are due by May 14, 2018 via email to info@results4all.org.

Download the application here.

The Workshop:

Results for All and the African Institute for Development Policy (AFIDEP) will host a two-and-a-half-day workshop for policymakers from around the world to:

  • Discuss the challenges governments face in effectively implementing policies; and
  • Share experiences and strategies for using evidence to improve policy implementation.

The workshop will:

  • Facilitate dialogue, exchange, and active engagement among participants, to more deeply understand policy implementation challenges and lessons learned in different contexts; and
  • Introduce tools and approaches for improving implementation using various types of evidence.

During the workshop, participants will seek to answer questions such as: What are the most common barriers to effective policy implementation in different government office contexts? What type of evidence is needed to unlock implementation? How and when should it be considered? What strategies and mechanisms are governments in different countries introducing to improve and integrate evidence use in policy design and implementation? How can we learn from their experiences?

Workshop Outcomes:

  • Participants will learn from and interact with peers leading policy implementation activities from 7-8 national governments.
  • Participants will work in country teams to diagnose root causes of policy implementation challenges and create solution-based roadmaps.
  • Participants will provide feedback and shape future collaboration, including a potential global network for government leaders to advance the use of evidence in public policy.

Who Should Participate?

Results for All and AFIDEP invite public officials and policymakers to form a team of three or four individuals who are working together to implement a specific policy, in any sector, and who want to learn how and when to use evidence to overcome policy implementation challenges. A team must include members from at least two government ministries / departments / agencies, and be approved by senior leadership via the signature at the end of this application. Teams from seven to eight countries will be selected for participation in the workshop.

Teams are encouraged to include:

  • A program manager or director in a ministry / department / agency, who oversees the implementation of the policy in question.
  • A public official or practitioner at the national or subnational level, who has a role in operationalizing the policy, or collecting operational data and evidence.
  • An analyst, manager, or director from a national finance or planning ministry / department, who has a coordinating role in managing or evaluating policy.
  • A technical expert from a research or evaluation unit or statistical office, who has a role in producing or sourcing evidence to inform policy options or implementation strategies.

Teams will be expected to develop a PowerPoint presentation outlining a policy implementation challenge in advance of the workshop, and to engage in follow-up activities to put roadmaps into practice.

Teams from Brazil, Colombia, Ghana, India, Indonesia, Kenya, Malawi, Mexico, Nigeria, Philippines, Rwanda, South Africa, Tanzania, Uganda, and Zambia are especially encouraged to apply.

Download the application and apply here.

Evidence-Informed Policymaking Reading List | April 2018 — April 2, 2018


What to Read this Month

Advancing Evidence-Informed Policymaking: What’s Culture Got to Do With It? | Abeba Taddese, Results for All
“It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work.”
We often hear from policymakers and partners about the importance of building or shifting the culture of evidence use, but what does that mean? Our latest blog explores the notion of an evidence culture and how to support it.

 

Ian Goldman is Acting Deputy Director General and Head of Evaluation and Research in South Africa’s Department of Planning, Monitoring and Evaluation (DPME). His presentation, made at the Africa Evidence Forum in Nairobi last month, shares challenges and lessons for evidence use in the policy process. See his plea to researchers on the last slide!

 

“Because knowledge mobilisation means different things to different people it can also be difficult for new knowledge mobilisers to identify and clarify their role and communicate this effectively. This increases the risk of misunderstandings and misalignment between knowledge mobilisers and those they are working with.”
This article offers a practical framework based on a series of questions that can help knowledge mobilizers better understand their role: Why is knowledge being mobilized? Whose knowledge is being mobilized? What type of knowledge is being mobilized? How is knowledge being mobilized?

 

“Often, innovation starts in the field, and practice gradually influences policy over time. That means it is as important for research uptake to get communities of practice to engage with research.”
Lessons from the Secure Livelihoods Research Consortium’s work in DRC emphasize the need to focus not only on decision-makers in the research uptake equation, but also to consider the role of researchers. The Consortium has learned that to improve research uptake, it is important to strengthen institutional research capabilities as well as individual research skills, and to emphasize the value of research early, for example in higher education curricula, while also building relationships with policymakers.

 

“Rather than making large-scale decisions in the absence of fully conclusive or satisfactory evidence, experimentation can help to “de-risk” the decision-making process at this step of the policy cycle.”
This article encourages governments to consider small-scale experiments as a means to test new approaches and move incrementally toward larger evidence-informed policies and programs. It highlights what the government of Canada is doing in this regard, and also includes a nice description of how experimentation could be incorporated in each step of the policy cycle.

 

“Data and evidence can overcome such myths, but try and use government data – policy makers are more comfortable when you use data they know and trust.”
The Private Secretary to former President Kikwete in Tanzania shares insights on who to target, when to act, and how to communicate the message to improve your policy advocacy.

 

What We’re Working On

 

We’re putting the finishing touches on our network mapping report, in which we assess the demand for a global evidence network, and review 50+ existing networks for peer learning and evidence use in government. We also synthesize 13 lessons on network organization, engagement, and measurement. Thank you to everyone who shared insights and feedback! We look forward to publishing the report in the next few weeks.
Advancing evidence-informed policymaking: What’s culture got to do with it? — March 29, 2018


Over the last few months my team at Results for All has been engaged in consultations to assess the demand for a new global evidence network that could bring government policymakers together to exchange innovative ideas and learn from each other to advance evidence use in policymaking.

We have spoken to policymakers in government, members of the research and academic community, as well as non-governmental partners and initiatives in countries including Colombia, Chile, Finland, Nigeria, South Africa, Tanzania, Uganda, and Zimbabwe, among many others. In every conversation, we heard about the importance of building or shifting the culture of evidence use. While we expect and assume that organizational culture will be different in varied contexts, we observed an interesting tendency in the policymaking community to speak about culture and evidence use in a way that suggested some universality across policy areas and levels of government. We noted further that in the context of evidence use, culture was often spoken of in broad and vague terms, such as “the culture is not developed enough,” “there is no culture of producing data,” or “mid-level technocrats have a lot of influence, and the ability to shift government culture.”

We are curious about the notion of an evidence use culture in government, and believe it is essential to better understand this culture so we can identify strategies to help strengthen evidence use in government. 

What is culture?

The challenge in understanding what a culture of evidence use in government looks like begins with the definition of culture itself, a term with many meanings. The first of Merriam-Webster’s six definitions for culture describes it as a set of attitudes, values, goals, and practices shared across an institution or organization. Matsumoto et al. suggest that while attitudes, values, and goals can be shared by a group, they can also be differentiated at an individual level.

This practical guide on changing culture developed by Bloomberg Philanthropies’ What Works Cities initiative offers a definition of culture that gets at norms: “culture is the difference between what you tolerate and don’t tolerate.” According to the guide, culture embodies interactions between the different elements of a system such as people, beliefs, values, and attitudes. It is both causal and dependent on an organization’s knowledge, processes, and systems. It is not a singular thing – an individual or organization can be defined by multiple cultures. And it is both learned and a legacy that can be shaped over time. These conflicting and dynamic elements are what make culture hard to define.

Levels of culture

To understand culture as it relates to evidence use in government, it is helpful to explore the different levels in which culture presents itself in an organization. This includes artifacts, values, and assumptions, captured in a helpful visual here.

The visible and tangible elements of an organization are its artifacts. They are what you see when you walk into an office – desks, chairs, computers, plants, and filing systems. Reports, briefs, databases, and knowledge management systems are also types of artifacts. Artifacts can give a sense of office culture – we might, for example, assume that a brightly colored office with an open floor plan has a creative mission, and sense entrenched bureaucracy in a dark, traditionally furnished office. Or we might expect an office with the technology for collecting and storing data to routinely use evidence to inform policy and programs.

Yet these visual cues about an office’s culture may be misleading if we do not understand the organization’s values and the underlying assumptions that drive the daily work of its leaders and employees. For example, a government office may have the relevant evidence artifacts such as a knowledge management system or evaluations, but lack shared values to guide and encourage evidence use in decision making. But even when there are tangible artifacts, and a government office publicly articulates the value of using evidence in policymaking, if the underlying assumption is that using evidence is too costly or time consuming, the office is unlikely to translate its artifacts and values to systematic use of evidence in policy decisions. The challenge is that it can be hard to uncover hidden assumptions – feelings, perceptions, thoughts, or beliefs – that shape an organization’s visible artifacts and values. Artifacts and values can also be disconnected and even contradictory, most noticeably in government when financial commitments needed to support desired policies or policymaker behavior do not line up with a government’s stated values.

In the context of evidence-informed policymaking, it is important to build artifacts – the systems and processes governments need to ensure evidence is appropriately sourced and used to inform strategic thinking, policy development, implementation of policy options, and monitoring and evaluation. It is also critical to build and instill a shared and publicly expressed value in using evidence. But to influence behavior change and shift attitudes about evidence use, it is imperative that we consider the basic assumptions that guide how work is done and decisions are made. When what we say (reflecting values) does not align with how we behave (building and using artifacts), it is a sign that we need to dig deeper to understand the assumptions that govern our behavior.

What should governments do to strengthen underlying assumptions and shift the culture toward evidence use?

    1. Take time to know the office – For many government offices, a conversation to understand barriers and challenges that inhibit evidence use, and clarify performance expectations and intended outcomes of policies, is a good starting point for those who would like to see greater use of evidence in policymaking. Build the communications skills to hold these conversations. A needs assessment can help to diagnose the gaps in knowledge, awareness, and capacity that can influence assumptions around what it takes to find, understand, and use evidence.
    2. Invest in leaders and champions – Strong role models who demonstrate the importance of using evidence through their actions can inspire others and help to change behavior patterns. Highlighting respected leaders who support innovation, change, and learning can positively influence other public officials’ assumptions and attitudes toward evidence use.
    3. Build knowledge and awareness – Policymakers who are confident in their ability to find, appraise, and synthesize evidence, and who understand the complexities of the policymaking process, are more likely to use evidence in their decision making process. Training courses or events such as dedicated research weeks can raise awareness about the value of using evidence and change assumptions that using evidence is too intimidating or complex.
    4. Create a compelling narrative – Ruth Levine gets at a moral argument for evidence-informed policymaking here and here. Moving from a compliance and monitoring mindset to a compelling narrative that points to failed outcomes for citizens when we do not use evidence can be a way to shift attitudes and behavior toward evidence use. Make responsible allocation and spending of limited government resources about doing right by citizens – achieving healthier populations, delivering quality education for all, accelerating financial empowerment for women.
    5. Promote networks and relationships – Whether formal or informal, peer interactions can help policymakers strengthen technical skills and shift attitudes and assumptions by exposing them to new ideas. As an organization, this could mean giving staff the time and space to connect with each other to share information, lessons, and experiences.
    6. Recognize and reward desired behavior – Different strategies can be used to motivate policymakers to use evidence in decision making, ranging from financial performance incentives to less resource-intensive award and recognition programs. Governments can use these strategies to promote and reward desired behavior, nudging policymakers to shift their assumptions and actions to align with organizational values.

It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work. These assumptions determine whether values become empty statements and artifacts gather dust or, ultimately, whether evidence use becomes a cultural norm.

Abeba Taddese is the Executive Director of Results for All, a global initiative of Results for America.

Evidence-Informed Policymaking Reading List | March 2018 — March 5, 2018


What to Read this Month

“Although the awareness and interest in evidence-informed policymaking has gained momentum in Nigeria, meeting points, such as policymakers’ engagement events to consider issues around the research-policy interface related to MNCH, are essentially lacking.”
This article summarizes the barriers to and facilitators of evidence-informed health policymaking in Nigeria and highlights a need to strengthen capacity at the individual and organizational levels (skills, systems, and processes) to facilitate the evidence to policy process.
“However, while there is a problem when civil servants don’t understand statistics or how to weigh up which evidence sources are reliable, fixing these problems won’t help improve policy if there is no political space to bring evidence into decision making, and no incentives for senior decision makers to care about evidence.”
The final, independent evaluation of the DFID Building Capacity to Use Research Evidence (BCURE) program found that partners were more successful when they went beyond building individual skills and capacities, and engaged with politics and incentives to create changes in the organization and wider environment.
“Many governments collect vast amounts of data, but need support to organize it in a usable format and identify use cases to improve program management.”
Through its government partnerships in Brazil, Chile, Colombia, and Peru, J-PAL has found that in addition to great leadership, the following strategies help support evidence-informed policymaking in government: making it someone’s job to use evidence, helping governments make better use of the data they already collect, investing in long-term partnerships, and taking on quick wins that can build trust and demand for evidence.
“Policy narratives are ‘the communication vehicles that are used for conveying and organizing policy information’, and the development of policy can be understood as a ‘narrative-making’ process.”
This article describes how education policymakers in Australia commonly referenced policy narratives when describing evidence use in their policy development process, and how these narratives in turn helped the research team better make sense of the different ways in which policymakers were using evidence. The article suggests that examining how evidence informs construction, testing, and communication of policy stories or narratives may be a promising and less abstract approach for understanding evidence use in policy development given how prominently narratives and stories feature in policymaking.
“There is an increasing need to develop faster and more reliable learning processes to solve problems and, at the same time, strengthen the trust between citizen and policy institutions. Policy experiments are a great way to do this.”
In this interview, Mikko Annala, Head of Governance Innovation at Demos Helsinki, a Finnish independent think tank, shares three elements that are key to building a culture of policy experimentation: incentives that mandate and support evidence production, experimentation, and failure; committed leadership; and a focus on getting results.
And the results are in! Last month, we used this A/B testing tool from ideas42 to test two different subject lines for this monthly email to see which one got a higher open rate (the percentage of people receiving the email who actually opened it).
  • Version A: Evidence-Informed Policymaking Reading List | February 2018
  • Version B: Which program works best? How do evidence maps inform policy? Is results-based aid more effective?
The Version A and B email blasts had open rates of 25.2% and 20.1%, respectively. The good news is that both rates are higher than the industry average for nongovernmental organizations! However, although Version A looks more effective than Version B, with a p-value of 0.156, the results were not significant at the 10% level. To yield a statistically significant result (that is, to be more confident about which subject line attracts more interest), we could try this test again with a larger sample size. That means we need to send our reading list to more people! If you know anyone who would like to receive this monthly email on evidence-informed policymaking news and research, please forward this message and encourage them to sign up here: https://results4america.org/results-for-all-reading-list/
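For readers curious how that p-value is produced, comparing two open rates is a standard two-proportion z-test. The sketch below uses hypothetical list sizes, since the post does not report how many recipients received each version; the function itself is generic.

```python
import math

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test comparing two email open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided p-value
    return z, p_value

# Hypothetical sample sizes, for illustration only
z, p = two_proportion_z_test(opens_a=63, n_a=250, opens_b=50, n_b=249)
```

A larger sample shrinks the standard error, which is exactly why growing the mailing list makes it easier to detect a real difference between subject lines.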

 

Thanks for reading!
Evidence-Informed Policymaking Reading List | February 2018 — February 7, 2018

Evidence-Informed Policymaking Reading List | February 2018

What to Read this Month

Enhancing Evidence-Informed Decision Making: Strategies for Engagement Between Public Health Faculty and Policymakers in Kenya | Nasreen Jessani et al, Evidence & Policy
“It would behove policymakers in the Kenyan government to lead the change with respect to outreach and inclusion of academic researchers in policy deliberation.”
This article explores the interactions between academic knowledge brokers and health policymakers, and concludes that a combination of personal relationships and institutional partnerships is needed to facilitate engagement.

Is Results-Based Aid More Effective than Conventional Aid? Evidence from the Health Sector in El Salvador | Pedro Bernal et al, Inter-American Development Bank
“Using a difference-in-difference approach and national health systems data we find that preventive health services increased by 19.8% in conventional aid municipalities and by 42% in RBA [results-based aid] municipalities compared to national funds, suggesting that the results-based conditionality roughly doubled aid effectiveness.”
This study is one of the first to measure the effects of results-based financing on the quality of public services provided by government municipalities.
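The difference-in-difference logic behind the study can be shown with made-up numbers (not the paper's data): the estimated effect is the treated group's change over time minus the comparison group's change, which nets out trends common to both groups.

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences estimate: the treated group's change
    minus the comparison group's change over the same period."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Made-up service counts: both groups improve, but the treated group improves more
effect = diff_in_diff(treat_pre=100, treat_post=142,
                      control_pre=100, control_post=120)
# effect == 22: the extra improvement attributed to the program
```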

Policy Relevant Evidence Maps: A Method to Inform Decision Making in the Public Sector (Webinar) | Carin van Zyl & Laurenz Langer, GESI
“DPME’s policy-relevant evidence maps follow a rigorous and transparent research process.”
This webinar outlines the steps used in South Africa’s Department of Planning, Monitoring, and Evaluation (DPME) to construct an evidence map to inform decision making in the area of human settlements, and how the process can be adapted elsewhere.

The What Works Network Five Years On | UK Government
“We have hugely talented public sector leaders, but we can still do more to make the best evidence available to them, and to ensure that the time and money invested in our public services are used to the best possible effect.”
In the last five years, the ten What Works Centres in the UK have produced or commissioned 288 evidence reviews used to improve public services; other activities highlighted in this report include publicizing evidence gaps, creating evidence comparison toolkits, and conducting evidence use audits of government departments.

Increasing the Use of Data and Evidence in Real-World Policy: Stories from J-PAL’s Government Partnership Initiative | Samantha Carter & Claire Walsh, J-PAL
“When governments decide to use data and evidence to improve policy, the results can be powerful.”
During its first two years, J-PAL’s Government Partnership Initiative (GPI) has supported 28 partnerships in 15 countries, helping to scale up effective programs and improve systems for data and evidence use.

Unsure which version of a program works best? Check out this A/B testing tool from ideas42
“A/B testing works by using an experimental procedure that provides different versions of parts of a program – such as a letter, a web site, or step in the process – to people at random. Statistical analysis can confirm which version is working better and by how much.”
Businesses constantly experiment with A/B testing to refine products and services and how they are marketed, but increasingly, governments are using the strategy to identify which of two service options citizens prefer, and how to communicate messages and create behavior change. This tool can help you prepare two versions to test, decide what to measure, randomly assign the versions between two groups, and analyze the results to determine which version works best.
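The random-assignment step the quote describes can be as simple as shuffling the recipient list and splitting it in half. This is a generic sketch, not code from the ideas42 tool:

```python
import random

def assign_versions(recipients, seed=2018):
    """Shuffle recipients and split them into two equal-sized A/B groups."""
    rng = random.Random(seed)   # fixed seed makes the assignment reproducible
    pool = list(recipients)
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]

# Hypothetical recipient list
group_a, group_b = assign_versions(
    [f"user{i}@example.org" for i in range(100)]
)
```

Because every recipient has the same chance of landing in either group, any systematic difference in open rates can be attributed to the subject line rather than to who happened to receive it.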

Evidence-Informed Policymaking Reading List | January 2018 — January 8, 2018

Evidence-Informed Policymaking Reading List | January 2018

What to Read this Month

 

“Delegating brokerage to specially designated individuals makes mobilization of knowledge into action highly contingent on their individual preferences, connections and skills.” 
Knowledge brokering is a multidimensional process that should be considered a core function of an organization and carried out by multiprofessional teams, including academics, policymakers, users, managers, and practitioners, rather than expert individuals or intermediary organizations.

 

“Better policy requires being both honest about our goals and clear-eyed about the evidence.”
To be evidence-based, health policies must 1) be well-specified, 2) be clear about which goals they are meant to achieve, and 3) demonstrate empirical evidence of the magnitude of their effects in furthering those goals.

 

“Overall, leaders use data or analysis more to conduct retrospective assessments of past performance than inform future policy and programs.”
In a 2017 survey, 3,500 public officials and development practitioners from 126 low- and middle-income countries tended to favor data that 1) reflected the local context, 2) drew upon data or analysis produced by the government, and 3) went beyond diagnosing problems to provide concrete policy recommendations.

 

“It is difficult to justify the resources, risks, and opportunity costs of PFS initiatives when an intervention has no evidence base or when existing evidence raises red flags about the impact of a program.”
The article stresses that Pay-for-Success – in which a government repays independent investors if the initiative they financed achieves pre-determined, socially desirable outcomes – will only be effective if used to finance interventions that meet 7 criteria, including a strong evidence base, cost savings for the public sector, clearly defined metrics, and a reasonable time frame.

 

“The evaluation findings reinforce the wider understanding that the demand for evidence varies substantially depending on individual policymaker attitudes, perceptions about the usefulness of evaluation evidence and credibility of the evaluator, awareness of evaluation benefits, technical skill in evaluation methods and the nature of the political system.”
Lessons learned from the Demand-Driven Evaluations for Decisions (3DE) pilot program in Uganda and Zambia note the program’s limited contribution to evidence-based policymaking capacity and behavior in both countries, and highlight the importance of strengthening capacity in evidence-based decision making within government and of considering the wider political economy in program design.

 

“Of course, every civil servant need not be a data scientist – but they should appreciate its potential to improve the lives of citizens, help services function more efficiently, and cut costs.”
A helpful compilation of examples and resources to learn more about data literacy, artificial intelligence, design thinking, and behavioral insights – and use them in government.

 

What We’re Working On

 

Results for All continues to work with government policymakers and partners to explore the role that a global evidence network could play in helping champion evidence use across public policies and sectors. We have completed our first phase of research and interviewed nearly 50 stakeholders to date – ranging from government policymakers, to NGOs leading evaluation activities, to existing networks where there may be opportunities to collaborate – and will summarize and share what we have learned in the coming months.
Evidence-Informed Policymaking Reading List | December 2017 — December 5, 2017

Evidence-Informed Policymaking Reading List | December 2017

What to Read this Month

 

“Estonia is the most advanced country with regard to seamless data exchange. Its State Information Agency has mapped all data owned by the national government and provides a standardized technical environment, called the X-Road platform, for secure information sharing with all users in the public and private sectors.”
Offering public services via digital platforms can help governments increase productivity and decrease spending; doing so requires a government-wide digital strategy, an IT platform shared across government departments, and rules governing the use of data accessible through the platform.

 

“While most purveyors are working to ensure their EBPs are effective and replicable, most are not working to expand their reach.”
This research is focused on purveyors – organizations that take on the job of spreading evidence-based programs. It identifies a lack of resources, expertise, and incentives as key barriers to the spread of evidence-based programs. Successful expansion of programs is often due to external forces – such as foundation and government investments, public systems change, and field building – that help create demand for services.

 

“Despite several decades of work on evidence informed policy, the goals to improve evidence uptake and promote greater use of evidence within policy making are still elusive.”  
This paper identifies organizations, systems and infrastructure, access to and availability of evidence, and the interactions between researchers and policymakers as key determinants of evidence use. It recommends strengthening networks and relationships to more optimally inform health policy.

“Policymakers showed great sensitivity to the approach of individuals who present research results. National or regional presenters were generally preferred over international presenters – several interviewees pointed to the importance of peer learning and the influence of regional ‘champions’ for certain issues.”
An INASP investigation found that policymakers in eastern and southern Africa prefer when HIV prevention evidence is presented in clear and brief memos or PowerPoint presentations, provided alongside a series of face-to-face interactions throughout the research process.

 

“So many important decisions and policies around the world are based on instinct, inertia, ideology or ignorance (four Is) rather than data or rigorous evidence.”
The second part of a three-part series describing the strong partnerships J-PAL has built with governments and policymakers in India, to advance the use of evidence in policymaking.

 

“The Lab’s mission is to embed the scientific method into the heart of city operations to provide decision-makers with high-quality evidence that they can use to achieve better outcomes for D.C. residents.”
A quick read on how policy labs embedded within government, like the Lab in Washington D.C., can conduct low-cost interventions that help empower policymakers to use data and evidence to improve outcomes for residents.

 

“We call this ‘zero-credit’ politics: a policy problem can persist because politicians are unable to claim credit from working to solve it.”
Because the public trusts doctors more than politicians, politicians tend not to ask too many questions about medical practice and focus instead on preserving their reputations. Partisan competition in the U.S. is another barrier to evidence-based medicine.
“The obsession with the phrase: ‘bridging research and policy’ can be misleading and distracting. There are other relationships that should be strengthened if policy is ever going to be better informed.”
Lessons from the 2017 Latin American Evidence Week include:
  • Overly focusing on new innovations can disincentivize the evaluation and tweaking of older programs, which may in fact work better.
  • Policymakers do not need large-scale impact evaluations on what interventions work as much as they need other forms of research on why interventions work or how to improve them.
  • A broader definition of evidence that includes citizen input can empower vulnerable populations.
  • Formal mechanisms and expectations for dialogue are key – between sectors, between evidence users and producers, and between program designers and implementers.

 

“But Oxfam’s experience shows that it’s wrong to think that emotion and evidence are opposing choices.”
Oxfam researchers are using facts that stir up emotions, combining personal stories with policy recommendations, and experimenting with other approaches that use evidence to influence attitudes and policies.

 

What We’re Working On

 

Results for All is currently working with government policymakers and partners to explore the role that a global evidence network or platform could play in helping government policymakers to address the challenges they face in advancing evidence use across policies and sectors. Reply to this email if you would like to talk to us about your ideas or how to get involved.