Evidence-Informed Policymaking Reading List | November 2018 — November 6, 2018


What to Read this Month
“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogeneous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.

“Geospatial impact evaluations (GIEs) use precisely georeferenced intervention data and outcome data to establish a counterfactual retroactively, eliminating the need to randomly assign individuals, firms, or communities into treatment and control groups at the outset of a program.”
A short article describing how USAID used geospatial data to understand the impact of a $900 million investment in rural infrastructure in the West Bank/Gaza. The evaluation team found strong evidence that local economic output, as measured by remotely sensed nighttime light intensity, increased as a result of the rural infrastructure program. The authors highlight GIEs as a promising approach for rigorously evaluating programs in fragile states where the collection of baseline and endline data is challenging and costly.

“Nevertheless, the process has taken a relatively long time and has required a very substantial body of evidence generated from interventions reliably funded by donors motivated for change, the implementation of multiple knowledge transfers strategies, the efforts of collective and individual political entrepreneurs, effective and patient advocacy coalitions, aided by a major window of opportunity that they seized and used to good effect.”
The authors – a researcher and policymaker team – conducted a reflective analysis of a major change in health financing policy in Burkina Faso between 2008 and 2018, which they supported in their respective roles. They share practical lessons for strengthening evidence-informed decision making, including the importance of: persistent and consistent production of rigorous and useful knowledge; fostering early interaction and engagement between the research and decision making communities; understanding the political and socio-economic context in which decisions are made; and seizing windows of opportunity for change.

“Without these forms of documentation, population statistics – which inform a range of policy decisions – are incomplete at best, and wrong at worst. Many low-income countries base their poverty estimates on data that is more than a decade old.”
The authors make a compelling case for policymakers to invest in data that help inform critical decisions like where to build schools or direct medical resources, and to make that data publicly accessible so that communities and social entrepreneurs can help identify solutions that work. “Policymaking without high-quality public data is governing by guesswork,” they write.

“In the current context of low priority, and weak institutional support and technical capacity to enable evidence use in decision making and debate in African parliaments, the network’s activities respond to some of the key barriers hindering parliamentarians from using evidence in their work.”
The authors describe their research to understand the contribution of the Network of African Parliamentary Committees on Health (NEAPACOH) to the evidence ecosystem in African parliaments. Annual network meetings serve as a platform for sharing evidence, building demand and capacity for evidence use, strengthening partnerships between MPs and researchers, and creating a sense of accountability and competition for following through on commitments made each year. The authors also note the importance of creating a mechanism for sharing the commitments made at these annual regional workshops with national parliaments, so that better progress can be made in realizing them.


What We’re Working On


How are public sector officials incentivized to use evidence routinely in their work, whether to inform major policies and decisions, design or alter programs, or guide implementation? Our new series highlights strategies that government agencies around the world have used to create incentives for using evidence in decision making. Take a look at our first five case studies from Mexico, Sierra Leone, and South Africa!


Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.


Evidence-Informed Policymaking Reading List | October 2018


What to Read this Month

“To put it bluntly, decades of stunning progress in the fight against poverty and disease may be on the verge of stalling. This is because the poorest parts of the world are growing faster than everywhere else; more babies are being born in the places where it’s hardest to lead a healthy and productive life. If current trends continue, the number of poor people in the world will stop falling — and could even start to rise.”
A great report on the world’s progress and likelihood of achieving the Sustainable Development Goals, with easy-to-digest graphs on key indicators, and stories behind the data about family planning, HIV, education, and agriculture. Take a moment to test your own knowledge with the interactive, six-question “data check” quiz on global poverty and population trends.

“UIS estimates that solid data on learning — gauging whether policies and programs are working, or reforms are needed — could improve education spending efficiency by 5 percent, generating $30 million/year in savings in the average country, paying for the assessments hundreds of times over.”
The author stresses that three years after the SDGs were adopted, there are still 100 countries with no data on student learning, and that two international literacy and math tests offer good standards for measuring progress toward global education goals.

“The average cost of an ambulance ride is $500, and in 2017 the County was able to avoid approximately 1,300 unnecessary rides — a health care system cost-savings of approximately $260,000.”
This case study documents how health and human services officials teamed up with fire and rescue leaders to address the rising volume of 911 emergency medical services calls from a small number of frequent callers. By sharing data across agencies, partnering with local hospitals, and providing home visits to some high-frequency callers, the County has seen a more than 50 percent reduction in 911 calls from the residents engaged in the initiative, saving public resources while proactively providing residents with the services they need. To learn more about how US local governments are using data and evidence to improve government performance and outcomes for residents, see additional case studies here.

“Only by finding out what doesn’t work — and being transparent about it — can we identify where money can be saved and re-invested in effective interventions.”
The author shares five lessons from his work as policy advisor for the UK What Works Network: 1) RCTs are not the only way of assessing impact; 2) it can be socially acceptable to experiment on children (see the Education Endowment Foundation’s work); 3) it is equally important to learn from what does not work; 4) evidence use doesn’t happen on its own; and 5) short-term effects do not necessarily translate to long-term outcomes.

What to Watch

“The way an issue passes from a vague idea into a piece of legislation or concrete sort of proposal shapes the kind of research you do…You need to consider where you are in this policy funnel, and that should shape the research.” (See 15:00 to 17:30)
Duncan Green, Oxfam Strategic Advisor, discusses how to understand the power and change processes within the system you are trying to influence, how to think about framing and timing of the issue, and how to combine research with media campaigns and lobbying to have policy influence.

A short online course for public servants to help you understand how to craft a narrative about your policy idea; write a compelling policy article or blog; or build a great presentation about your policy or idea.

What We’re Working On

We recently returned from Pretoria, South Africa, where we attended the Evidence 2018 conference hosted by the Africa Evidence Network (AEN). You can read about the event on the AEN blog and see pictures here. During the conference, we led two sessions on the potential of networks and peer learning opportunities to support government policymakers in advancing the use of evidence in decision making. You can read some of our thinking on the subject in a recent Results for All blog post, here.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.
Evidence-Informed Policymaking Reading List | September 2018 — September 10, 2018


Our New Summary Report is Here!


What happens when teams of government policymakers from nine countries meet in Nairobi, Kenya to discuss using evidence to improve policy implementation?

The peer-learning exchange we hosted from July 23-25 featured sessions on the use of administrative data to inform policy implementation, how to incentivize evidence use and build an evaluative culture, and how to engage citizens in data collection, among others.

Curious about what happened in Nairobi, and what we learned? You can watch videos, download the summary report, and explore the insights here.


What to Read this Month

“When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles due to the culture and leadership of the institution.”

This two-part blog series follows the partnership between MCC and the Government of El Salvador. The first installment describes the evidence workshop for policymakers held in July 2016 to discuss and promote the use of findings from evaluations of previous MCC-funded programs in the country. The second installment highlights interviews conducted with the organizers and participants, considering the impact on Salvadoran policymakers, what has changed in the last two years, and remaining challenges to evidence-informed policymaking in the country. The blogs are also cross-posted on MCC’s website.

“So the dashboard contains data about every aspect of education other than whether it is accomplishing its purpose: to teach children the skills and competencies that prepare them to be successful adults.”
Using a metaphor about a 2009 plane crash, Pritchett examines educational performance in Indonesia and India and argues that dashboards with too much information – particularly if focused on inputs without clear connections to priority outcomes – can be worse than having no dashboard at all.

“It is common knowledge within academic circles, regularly re-echoed at EIPM forums, that research produced by such institutions mostly in the form of theses and dissertations do not in any way inform policy.”
The author shares his perspective on the state of evidence-informed policymaking in Ghana, including challenges and promising opportunities to improve access to policy-relevant research and data, and the role of knowledge brokers.

“Societal inequality is exacerbated when one in five children worldwide do not complete upper primary school; understanding their motivations to drop out is crucial.”

The article describes the findings of research conducted to understand the impact of an Indian government initiative to build latrines in schools. Latrine construction positively impacted educational outcomes such as enrolment, dropout rates, and the number of students who appeared for and passed exams, for both boys and girls. The author’s research also shows that school sanitation reduces gender disparities only when sex-specific latrines are constructed for older girls; unisex latrines are mostly sufficient at younger ages.

“In short, I feel that economists need to be cautious and modest when it comes to giving policy advice, let alone getting actively involved in ‘policy design.'”
The author notes that giving advice requires more than evidence, as public policy is a reflection of values and objectives. Further, the advice one gives depends on who one advises, with economists representing one of many potential views. Additionally, giving advice requires a familiarity with the implementation of policy, which most economists do not have, and should be regarded as a political act more than a scientific one, requiring collaboration and partnership with a broad range of stakeholders.

“Information from the government, distributed by the UN in the midst of the crisis, on the other hand, was ‘completely off’, Bengtsson said.”
After negotiating legal agreements with mobile phone operators to access location data, a Swedish NGO is able to determine where people go during a crisis, which can help governments and aid agencies prepare and respond.

What to Watch

Could using video help you tell your stories, convey emotion, reach a wider audience, and help turn your research into policy impact? This webinar discusses when to create videos, where to use them to reach the target audience, whose story to tell, and how to distribute them, and includes a case study video production and dissemination plan.

What We’re Working On

Just 13 days until we leave for Pretoria, South Africa, where we’ll attend the AEN Evidence conference and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government. If you’ll be in Pretoria between September 24 and 28, let us know! We’d love to see you there.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | August 2018 — August 7, 2018



Just Concluded: Peer Learning Workshop on Evidence Use

Last month we hosted a workshop for teams of government policymakers from nine countries, providing a peer learning forum to share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies. You can learn about the workshop here and read our initial insights and reflections in our latest blog post.



What to Read this Month

“Administrative data can do much more than help deliver services or provide inputs for monitoring. We can use administrative data also for learning and research in humanitarian emergencies if agencies make available their data for analysis as part of an ethical, secure and deliberate strategy.”
A look at the strengths, weaknesses, opportunities, and threats of using administrative data in the humanitarian field. The main strength of using administrative data is that it is available immediately at no cost and can be used for research and learning. A common challenge, on the other hand, is that it can be difficult to harmonize administrative data from different sources.

“Even a cursory look at the literature shows that evidence-informed policy making is about more than merely the availability of knowledge items. You have to go beyond uploading documents on a server, to also build and use high-trust relationships.”
In addition to sharing relevant, useful and reliable knowledge via its online platform, the new South African SDG Hub aims to strengthen partnerships between policy actors and researchers and support capacity building to improve the use of evidence for SDG-relevant policymaking.

“After more than a year of executing 10-week projects, they’re starting to identify city trends, and getting results: After analyzing data on the relationship between education and income, for example, they increased federal financial aid sign-ups in the city by 10 percent.”
Made up of city residents and government workers, the Urban Data Pioneers volunteer group helps collect and analyze data on the city’s population, housing, education, and infrastructure, to advance the Tulsa mayor’s campaign promise to use data to focus on results.

“Many RFPs include some type of evidence to explain the scope of the problem the solicitation seeks to address within the country context (e.g., statistics showing low school attendance, disease prevalence rates). That makes for helpful background reading, but it doesn’t get at the crux of the matter – whether the intervention USAID is soliciting will plausibly achieve the intended outcomes. Far fewer RFPs offered evidence for this.”
A thoughtful article on how USAID – and other major development donors – could restructure bidding and procurement practices to 1) incorporate evidence into its Requests for Proposals and suggested program interventions, and 2) prioritize awarding contracts to implementing partners that use evidence to inform their proposals and program designs, and demonstrate a commitment to building the body of evidence on what works and why in global development.

Our colleagues at Results for America recently published an index showcasing how state governments are using data and evidence in budget, policy, and management decisions to achieve better outcomes for their residents. Criteria include Data Policies / Agreements, Data Use, Evaluation Resources, Cost-Benefit Analysis, Use of Evidence in Grant Programs, and Contracting for Outcomes.


What We’re Working On

We’re still processing our learnings from the peer learning workshop we hosted last month on using evidence for policy implementation, and will share more insights, reflections, and takeaways in a final report in the weeks to come. We’re also preparing a satellite session at the AEN Evidence conference in Pretoria next month, where we’ll discuss our learnings from the workshop and other activities from the past year, and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | June 2018 — June 1, 2018


What to Read this Month


“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway from the paper is that the more community members were involved in generating and presenting evidence, the greater the likelihood that it would be used to address service delivery challenges.


“Our argument is that to improve policy execution we must go one step further and consider how policies can be more effectively designed by connecting actors vertically and horizontally in a process of collaboration and joint deliberation.”
This article describes how collaboration in policy design can facilitate adaptive implementation of policy solutions. For example, collaboration with front line staff can help government leaders better understand challenges faced at the service delivery level and propose context-specific solutions, while collaboration with citizens can generate constructive feedback that stimulates learning and incremental adjustments to a policy solution. The article also discusses how open and democratic political contexts enable collaborative policymaking.


“Neither country policy makers nor the global development community are best served by a global flood of health estimates derived from complex models as investments in country data collection, analytical capacity, and use are lagging.”
A brief commentary illustrating how the development community’s emphasis on global health predictions and estimates can contribute to a false level of certainty about health status and trends, and importantly, detract from needed investments in improving data collection and analytical capacity in countries.


“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.


“Within a year of the research getting published, and publicised by media and labour unions, the Maharashtra government raised the minimum wage to INR 12. This benefited 6 million labourers in the state – many, many times more than what we could have hoped to achieve if we had not adopted a research-based approach to the problem.”
Reflecting on his experiences as a medical physician and researcher, Dr. Bang discusses the importance of research for development, and the difference between research on the people and research for and with the people.


“Even the simplest intervention is context dependent in countless, subtle ways – it’s impossible to say with certainty how it will fare in another place. However, as presented here, there’s a four-step framework that can disqualify many mistakes before they happen, and improve the odds of replications you pursue.”
Useful insights, examples, and a framework to determine when an intervention can be replicated in a new context: assess the evidence, understand what conditions made the intervention work in the original context, ensure that the new context has those same essential conditions, and adapt the intervention as necessary.


African Research Organizations: Submit your Proposals by June 15


The Hewlett Foundation call for proposals aims to advance government use of evidence by supporting East and West African policy research organizations, which can apply alone or lead a project with 1-2 partner organizations. See more here.


What We’re Working On


We’re busy organizing “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers,” which we are hosting in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight from July 23-25 in Nairobi, Kenya.


The workshop will provide a forum for government teams from nine countries to learn about and share experiences, challenges, good practices, and key lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. We selected 10 great teams from over 55 applications. A big thank you to everyone who applied and helped to spread the word! More info coming soon.


Evidence-Informed Policymaking Reading List | May 2018


New Opportunity for African Research Organizations

On May 1, the Hewlett Foundation launched a call for proposals for African Policy Research Institutions to Advance Government Use of Evidence. The call is meant to support policy research organizations in East and West Africa in their efforts to advance evidence-informed policymaking, with a focus on strengthening policymakers’ capacity, motivation, processes, and relationships to enhance evidence use. African policy research organizations can apply alone or lead a project with 1-2 partner organizations. Submit clarifying questions about the call for proposals to EIPAfrica@hewlett.org by May 8. The deadline for first-round proposals is June 15. Learn more at https://www.hewlett.org/eipafrica/.

How can you ensure that evidence used to inform policy is appropriate, credible, and transparent? What types of evidence are appropriate for each stage of the policy cycle, including design and implementation? New Zealand’s Social Policy Evaluation and Research Unit, Superu, answers these questions and more in their latest guide.

“We should be more ambitious than just making ourselves (and the experts we work with) useful to the various groups we hope might use the research and policy analysis our institutions produce.”
It’s important for researchers to package and communicate information in a way that is accessible to policymakers, but it’s even more critical to invest in longer-term partnerships with policymakers to truly influence social change.

“Both making evidence accessible and facilitating processes for deliberating evidence were essential in supporting evidence users to understand the extent and usefulness of evidence and identify implications for policy, practice and services.”
This paper describes a process developed by the Centre for Research on Families and Relationships in the UK, to facilitate the use of evidence in practice. The process helped practitioners identify gaps in knowledge; facilitated an evidence review to refine the research question, find, synthesize, and report on evidence using tools and templates; and concluded with a discussion to plan for the use of evidence.

“Governments have used the phrase ‘let a thousand flowers bloom’ to symbolise a desire to entertain many models of policy-making.”
Evidence-based decision making can take different approaches, ranging from implementation science driven by RCTs, to storytelling that draws on practitioner and user feedback, to improvement methods that draw from a combination of experience and operational evidence. The challenge is knowing what type of evidence to use given the complexity and politics of the policymaking process.

“The establishment of Science and Technology (S&T) fellowship programs in other states could greatly increase evidence-based policy-making and not only benefit state policy-makers but also help to inform national policy-making and society as a whole.”
The editorial reflects on the success of California’s S&T Policy Fellows Program, which selects and trains individuals with a science or technology background to serve as full-time legislative staff for a year. The Fellows are in high demand, and as a result of the program the use of evidence in state policymaking has improved dramatically.

What We’re Working On

We’re partnering with the African Institute for Development Policy (AFIDEP) to host a peer learning workshop on Using Evidence for Policy Implementation from July 23-25 in Nairobi, Kenya. Teams of 3-4 policymakers from governments around the world will be fully funded to participate in the workshop. Participants will work in country teams to discuss and diagnose the root causes of policy implementation challenges, create solution-based roadmaps, and exchange experiences and lessons on evidence use in policy implementation across different contexts.

There is only 1 week left to apply! Please find more details and application instructions here.


Evidence-Informed Policymaking Reading List | April 2018 — April 2, 2018


What to Read this Month

Advancing Evidence-Informed Policymaking: What’s Culture Got to Do With It? | Abeba Taddese, Results for All
“It takes time and intentional effort to build or change the evidence culture in government. And to truly do so, we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work.”
We often hear from policymakers and partners about the importance of building or shifting the culture of evidence use, but what does that mean? Our latest blog explores the notion of an evidence culture and how to support it.


Ian Goldman is Acting Deputy Director General and Head of Evaluation and Research in South Africa’s Department of Planning, Monitoring and Evaluation (DPME). His presentation, made at the Africa Evidence Forum in Nairobi last month, shares challenges and lessons for evidence use in the policy process. See his plea to researchers on the last slide!


“Because knowledge mobilisation means different things to different people it can also be difficult for new knowledge mobilisers to identify and clarify their role and communicate this effectively. This increases the risk of misunderstandings and misalignment between knowledge mobilisers and those they are working with.”
This article offers a practical framework based on a series of questions that can help knowledge mobilizers better understand their role: Why is knowledge being mobilized? Whose knowledge is being mobilized? What type of knowledge is being mobilized? How is knowledge being mobilized?


“Often, innovation starts in the field, and practice gradually influences policy over time. That means it is as important for research uptake to get communities of practice to engage with research.”
Learnings from the Secure Livelihoods Research Consortium’s work in DRC emphasize the need to focus not only on decision-makers in the research uptake equation, but also to consider the role of researchers. The Consortium has learned that to improve research uptake, it is important to strengthen institutional research capabilities as well as individual research skills, and to emphasize the value of research early, for example in higher education curricula, while also building relationships with policymakers.


“Rather than making large-scale decisions in the absence of fully conclusive or satisfactory evidence, experimentation can help to “de-risk” the decision-making process at this step of the policy cycle.”
This article encourages governments to consider small-scale experiments as a means to test new approaches and move incrementally toward larger evidence-informed policies and programs. It highlights what the government of Canada is doing in this regard, and also includes a nice description of how experimentation could be incorporated in each step of the policy cycle.


“Data and evidence can overcome such myths, but try and use government data – policy makers are more comfortable when you use data they know and trust.”
The Private Secretary to former President Kikwete in Tanzania shares insights on who to target, when to act, and how to communicate the message to improve your policy advocacy.


What We’re Working On


We’re putting the finishing touches on our network mapping report, in which we assess the demand for a global evidence network, and review 50+ existing networks for peer learning and evidence use in government. We also synthesize 13 lessons on network organization, engagement, and measurement. Thank you to everyone who shared insights and feedback! We look forward to publishing the report in the next few weeks.

Evidence-Informed Policymaking Reading List | March 2018 — March 5, 2018


What to Read this Month

“Although the awareness and interest in evidence-informed policymaking has gained momentum in Nigeria, meeting points, such as policymakers’ engagement events to consider issues around the research-policy interface related to MNCH, are essentially lacking.”
This article summarizes the barriers to and facilitators of evidence-informed health policymaking in Nigeria and highlights a need to strengthen capacity at the individual and organizational levels (skills, systems, and processes) to facilitate the evidence-to-policy process.

“However, while there is a problem when civil servants don’t understand statistics or how to weigh up which evidence sources are reliable, fixing these problems won’t help improve policy if there is no political space to bring evidence into decision making, and no incentives for senior decision makers to care about evidence.”
The final, independent evaluation of the DFID Building Capacity to Use Research Evidence (BCURE) program found that partners were more successful when they went beyond building individual skills and capacities, and engaged with politics and incentives to create changes in the organization and wider environment.

“Many governments collect vast amounts of data, but need support to organize it in a usable format and identify use cases to improve program management.”
Through its government partnerships in Brazil, Chile, Colombia, and Peru, J-PAL has found that in addition to great leadership, the following strategies help support evidence-informed policymaking in government: making it someone’s job to use evidence, helping governments make better use of the data they already collect, investing in long-term partnerships, and taking on quick wins that can build trust and demand for evidence.

“Policy narratives are ‘the communication vehicles that are used for conveying and organizing policy information’, and the development of policy can be understood as a ‘narrative-making’ process.”
This article describes how education policymakers in Australia commonly referenced policy narratives when describing evidence use in their policy development process, and how these narratives in turn helped the research team better make sense of the different ways in which policymakers were using evidence. The article suggests that examining how evidence informs construction, testing, and communication of policy stories or narratives may be a promising and less abstract approach for understanding evidence use in policy development, given how prominently narratives and stories feature in policymaking.

“There is an increasing need to develop faster and more reliable learning processes to solve problems and, at the same time, strengthen the trust between citizen and policy institutions. Policy experiments are a great way to do this.”
In this interview, Mikko Annala, Head of Governance Innovation at Demos Helsinki, a Finnish independent think tank, shares three elements that are key to building a culture of policy experimentation: incentives that mandate and support evidence production, experimentation, and failure; committed leadership; and a focus on getting results.

And the results are in! Last month, we used this A/B testing tool from ideas42 to test two different subject lines for this monthly email to see which one got a higher open rate (the percentage of people receiving the email who actually opened it).
  • Version A: Evidence-Informed Policymaking Reading List | February 2018
  • Version B: Which program works best? How do evidence maps inform policy? Is results-based aid more effective?
The Version A and B email blasts had open rates of 25.2% and 20.1%, respectively. The good news is that both rates are higher than the industry average for nongovernmental organizations! However, although Version A looks more effective than Version B, with a p-value of 0.156, the results were not significant at the 10% level. To yield a statistically significant result (to be more confident about which subject line attracts more interest) we could try this test again with a larger sample size. That means we need to send our reading list to more people! If you know anyone who would like to receive this monthly email on evidence-informed policymaking news and research, please forward this message and encourage them to sign up here: https://results4america.org/results-for-all-reading-list/
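For the curious, this comparison is a standard two-proportion z-test, which you can reproduce in a few lines of Python. Since we don’t publish our list size, the per-group counts below are illustrative assumptions only, chosen so that the open rates and p-value roughly match the figures reported above.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for a difference between two proportions (e.g., open rates)."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)              # pooled rate under the null
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided p-value
    return p_a, p_b, z, p_value

# Illustrative counts only: roughly 270 recipients per arm reproduces
# the 25.2% vs. 20.1% open rates and p = 0.156 reported above.
p_a, p_b, z, p = two_proportion_z_test(opens_a=68, n_a=270, opens_b=54, n_b=269)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

With the same five-point gap but larger groups, the standard error shrinks and the test clears conventional significance thresholds, which is exactly why a bigger list helps.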


Thanks for reading!

Evidence-Informed Policymaking Reading List | February 2018 — February 7, 2018


What to Read this Month

Enhancing Evidence-Informed Decision Making: Strategies for Engagement Between Public Health Faculty and Policymakers in Kenya | Nasreen Jessani et al, Evidence & Policy
“It would behove policymakers in the Kenyan government to lead the change with respect to outreach and inclusion of academic researchers in policy deliberation.”
This article explores the interactions between academic knowledge brokers and health policymakers, and concludes that a combination of personal relationships and institutional partnerships are needed to facilitate engagement.

Is Results-Based Aid More Effective than Conventional Aid? Evidence from the Health Sector in El Salvador | Pedro Bernal et al, Inter-American Development Bank
“Using a difference-in-difference approach and national health systems data we find that preventive health services increased by 19.8% in conventional aid municipalities and by 42% in RBA [results-based aid] municipalities compared to national funds, suggesting that the results-based conditionality roughly doubled aid effectiveness.”
This study is one of the first to measure the effects of results-based financing on the quality of public services provided by government municipalities.
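For readers new to the method, a difference-in-differences estimate compares the change over time in the treated group with the change over the same period in a comparison group, netting out both the baseline gap between groups and any trend common to both. Here is a toy calculation in Python; the numbers are invented for illustration and are not from the study.

```python
# Toy figures for illustration only; these are not from the IDB study.
treated_pre, treated_post = 40.0, 55.0  # preventive services per 1,000, RBA municipalities
control_pre, control_post = 42.0, 48.0  # same outcome in comparison municipalities

# Differencing within each group removes fixed baseline differences;
# subtracting the comparison group's change removes the shared time trend.
did = (treated_post - treated_pre) - (control_post - control_pre)
print(f"Difference-in-differences estimate: {did:+.1f} per 1,000")  # prints +9.0
```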

Policy Relevant Evidence Maps: A Method to Inform Decision Making in the Public Sector (Webinar) | Carin van Zyl & Laurenz Langer, GESI
“DPME’s policy-relevant evidence maps follow a rigorous and transparent research process.”
This webinar outlines the steps used in South Africa’s Department of Planning, Monitoring, and Evaluation (DPME) to construct an evidence map to inform decision making in the area of human settlements, and how the process can be adapted elsewhere.

The What Works Network Five Years On | UK Government
“We have hugely talented public sector leaders, but we can still do more to make the best evidence available to them, and to ensure that the time and money invested in our public services are used to the best possible effect.”
In the last five years, the ten What Works Centres in the UK have produced or commissioned 288 evidence reviews used to improve public services; other activities highlighted in this report include publicizing evidence gaps, creating evidence comparison toolkits, and conducting evidence use audits of government departments.

Increasing the Use of Data and Evidence in Real-World Policy: Stories from J-PAL’s Government Partnership Initiative | Samantha Carter & Claire Walsh, J-PAL
“When governments decide to use data and evidence to improve policy, the results can be powerful.”
During its first two years, J-PAL’s Government Partnership Initiative (GPI) has supported 28 partnerships in 15 countries, helping to scale up effective programs and improve systems for data and evidence use.

Unsure which version of a program works best? Check out this A/B testing tool from ideas42
“A/B testing works by using an experimental procedure that provides different versions of parts of a program – such as a letter, a web site, or step in the process – to people at random. Statistical analysis can confirm which version is working better and by how much.”
Businesses constantly experiment with A/B testing to refine products and services and how they are marketed, but increasingly, governments are using the strategy to identify which of two service options citizens prefer, and how to communicate messages and create behavior change. This tool can help you prepare two versions to test, decide what to measure, randomly assign the versions between two groups, and analyze the results to determine which version works best.
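The random-assignment step at the heart of any A/B test is simple enough to sketch. Below is a minimal Python illustration of our own (not the ideas42 tool itself, and the recipient addresses are made up).

```python
import random

def assign_versions(recipients, seed=2018):
    """Shuffle a recipient list and split it into two equal-sized groups, A and B."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"A": shuffled[:half], "B": shuffled[half:]}

groups = assign_versions(["ada@example.org", "bea@example.org",
                          "cam@example.org", "dev@example.org"])
print(groups["A"])  # this group receives version A
print(groups["B"])  # this group receives version B
```

Once the outcomes are in, a significance test such as the two-proportion z-test sketched under the March 2018 list above can tell you whether the difference between groups is larger than chance alone would produce.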

Evidence-Informed Policymaking Reading List | January 2018 — January 8, 2018


What to Read this Month


“Delegating brokerage to specially designated individuals makes mobilization of knowledge into action highly contingent on their individual preferences, connections and skills.” 
Knowledge brokering is a multidimensional process that should be considered a core function of an organization and carried out by multiprofessional teams, including academics, policymakers, users, managers, and practitioners, rather than by expert individuals or intermediary organizations.


“Better policy requires being both honest about our goals and clear-eyed about the evidence.”
To be evidence-based, health policies must 1) be well-specified, 2) be clear about which goals they are meant to achieve, and 3) demonstrate empirical evidence of the magnitude of their effects in furthering those goals.


“Overall, leaders use data or analysis more to conduct retrospective assessments of past performance than inform future policy and programs.”
In a 2017 survey of 3,500 public officials and development practitioners from 126 low- and middle-income countries, respondents tended to favor data that 1) reflected the local context, 2) drew upon data or analysis produced by the government, and 3) went beyond diagnosing problems to provide concrete policy recommendations.


“It is difficult to justify the resources, risks, and opportunity costs of PFS initiatives when an intervention has no evidence base or when existing evidence raises red flags about the impact of a program.”
The article stresses that Pay-for-Success – in which a government repays independent investors if the initiative they financed achieves pre-determined, socially desirable outcomes – will only be effective if used to finance interventions that meet 7 criteria, including a strong evidence base, cost savings for the public sector, clearly defined metrics, and a reasonable time frame.


“The evaluation findings reinforce the wider understanding that the demand for evidence varies substantially depending on individual policymaker attitudes, perceptions about the usefulness of evaluation evidence and credibility of the evaluator, awareness of evaluation benefits, technical skill in evaluation methods and the nature of the political system.”
Lessons learned from the Demand-Driven Evaluations for Decisions (3DE) pilot program in Uganda and Zambia note the program’s limited contribution to evidence-based policymaking capacity and behavior in both countries, and highlight the importance of strengthening capacity in evidence-based decision making within government and of considering the wider political economy in program design.


“Of course, every civil servant need not be a data scientist – but they should appreciate its potential to improve the lives of citizens, help services function more efficiently, and cut costs.”
A helpful compilation of examples and resources to learn more about data literacy, artificial intelligence, design thinking, and behavioral insights – and use them in government.


What We’re Working On


Results for All continues to work with government policymakers and partners to explore the role that a global evidence network could play in helping champion evidence use across public policies and sectors. We have completed our first phase of research and interviewed nearly 50 stakeholders to date – ranging from government policymakers, to NGOs leading evaluation activities, to existing networks where there may be opportunities to collaborate – and will summarize and share what we have learned in the coming months.