Evidence-Informed Policymaking Reading List | December 2017

What to Read this Month

“Estonia is the most advanced country with regard to seamless data exchange. Its State Information Agency has mapped all data owned by the national government and provides a standardized technical environment, called the X-Road platform, for secure information sharing with all users in the public and private sectors.”
Offering public services via digital platforms can help governments increase productivity and decrease spending; doing so requires a government-wide digital strategy, an IT platform shared across government departments, and rules governing the use of data accessible through the platform.

“While most purveyors are working to ensure their EBPs are effective and replicable, most are not working to expand their reach.”
This research focuses on purveyors – organizations that take on the job of spreading evidence-based programs – and identifies a lack of resources, expertise, and incentives as key barriers to wider spread. Successful expansion is often due to external forces, such as foundation and government investments, public systems change, and field building, that help create demand for services.

“Despite several decades of work on evidence informed policy, the goals to improve evidence uptake and promote greater use of evidence within policy making are still elusive.”  
This paper identifies organizations, systems and infrastructure, access to and availability of evidence, and the interactions between researchers and policymakers as key determinants of evidence use. It recommends strengthening networks and relationships to more optimally inform health policy. 

“Policymakers showed great sensitivity to the approach of individuals who present research results. National or regional presenters were generally preferred over international presenters – several interviewees pointed to the importance of peer learning and the influence of regional ‘champions’ for certain issues.”
An INASP investigation found that policymakers in eastern and southern Africa prefer when HIV prevention evidence is presented in clear and brief memos or PowerPoint presentations, provided alongside a series of face-to-face interactions throughout the research process.

“So many important decisions and policies around the world are based on instinct, inertia, ideology or ignorance (four Is) rather than data or rigorous evidence.”
The second part of a three-part series describing the strong partnerships J-PAL has built with governments and policymakers in India to advance the use of evidence in policymaking.

“The Lab’s mission is to embed the scientific method into the heart of city operations to provide decision-makers with high-quality evidence that they can use to achieve better outcomes for D.C. residents.”
A quick read on how policy labs embedded within government, like the Lab in Washington D.C., can conduct low-cost interventions that help empower policymakers to use data and evidence to improve outcomes for residents.

“We call this ‘zero-credit’ politics: a policy problem can persist because politicians are unable to claim credit from working to solve it.”
The public trusts doctors more than it does politicians, who as a result don’t ask too many questions and focus instead on preserving their reputations. Partisan competition in the U.S. is another barrier to evidence-based medicine.
                                                                                                                        
“The obsession with the phrase: ‘bridging research and policy’ can be misleading and distracting. There are other relationships that should be strengthened if policy is ever going to be better informed.”
Lessons from the 2017 Latin American Evidence Week include: an overemphasis on new innovations can disincentivize the evaluation and tweaking of older programs that may in fact work better; policymakers need research on why interventions work and how to improve them more than they need large-scale impact evaluations of what works; a broader definition of evidence that includes citizen input can empower vulnerable populations; and formal mechanisms and expectations for dialogue are key – between sectors, between evidence users and producers, and between program designers and implementers.

“But Oxfam’s experience shows that it’s wrong to think that emotion and evidence are opposing choices.”
Oxfam researchers are using facts that stir up emotions, combining personal stories with policy recommendations, and experimenting with other approaches that use evidence to influence attitudes and policies.

What We’re Working On

Results for All is currently working with government policymakers and partners to explore the role that a global evidence network or platform could play in helping them address the challenges they face in advancing evidence use across policies and sectors. Reply to this email if you would like to talk to us about your ideas or how to get involved.

Welcome to the first edition of our reading list on evidence-informed policymaking!

On the first Monday of every month, we’ll be posting a small selection from our favorite readings, each with a quote and short summary. We’ve noticed a growing body of work on the use of evidence in policy and practice, and hope you will find this compilation a useful way to follow the research, discussions, and insights.
Happy reading!

What to Read this Month

“One of the key lessons is that we need to better understand politics and incentive structures in government organisations when promoting evidence uptake.”
A recent DFID program distinguishes three types of evidence use by policymakers: Transparent Use, Embedded Use, and Instrumental Use. See the table for definitions and examples.

“When we asked survey respondents what barriers they faced in relying on evidence to make decisions, they answered that the most serious barriers were a lack of training in both data analysis and in how to evaluate and apply research findings.”
Civil servants in Pakistan and India also reported that they did not have enough time or incentives to consult evidence before making decisions.

“As well as the time, money and conservation opportunities wasted on ineffective projects, we must consider the possibility that conservation as a whole will be seen as unjustifiable if money is regularly spent poorly because of a lack of evidence use.”
Factors contributing to evidence complacency include a lack of time or training needed to consult scientific evidence, a feeling that relying on evidence reduces professional autonomy to make decisions, and a view that people are more accessible sources of information compared to scientific resources.

“At the same time, because the issues of adolescents are multiple – including education, economic, health and security, to name a few – a cross-sector approach is needed in conducting research and gathering data.”
India is home to 253 million adolescents, or 21% of the global adolescent population. Addressing the needs of such a large population requires substantial data and evidence, and a sustained commitment to both, across a variety of social sectors.

“Policy making is not a series of decision nodes into which evidence, however robust, can be ‘fed,’ but the messy unfolding of collective action, achieved mostly through dialogue, argument, influence, conflict and retrospectively made sense of through the telling of stories…”
A good reminder of the complexity, messiness and unpredictability that characterizes the policymaking process.

“The data that a policy index reveals are always in the past, but the impact of a policy index is in the conversations that it informs and the issues that it helps to advance in the future.”
The National Arts Index (NAI) developed by the nonprofit organization Americans for the Arts was the first of its kind and an inspiration for other index projects in the arts internationally. Lessons learned highlight the need for a targeted communications strategy to build awareness about the value of a policy index and the large amounts of time and resources required to build and maintain one.

Looking for even more? See Results for America’s 2017 #WhatWorks Reading List, featuring numerous articles on what US cities, states, and federal agencies are doing to use more evidence to get better results.

What We’re Working On

Results for All is currently working with government policymakers and partners to explore the role that a global evidence network or platform could play in helping them address the challenges they face in advancing evidence use across policies and sectors. Reply to this email if you would like to talk to us about your ideas or how to get involved.

For the latest updates, you can follow us on Twitter here. And don’t miss Results for America’s recently released 2017 Federal Invest in What Works Index, which highlights the extent to which 8 US federal departments and agencies have built the infrastructure necessary to use evidence when making budget, policy, and management decisions.

We’ve noticed a growing body of work on the use of evidence in policy and practice. To help you follow the research, discussions, and insights, we’ll send you a small selection of our favorite readings on the first Monday of every month. The reading list will also include our own new blog posts, and highlight what we’re working on. In addition to following us on Twitter, signing up for the reading list is the best way to follow what we’re doing.

Sign up for our Monthly Reading List today here!

How do governments use evidence for policy? 100+ mechanisms and a short survey

Results for All is currently assessing whether a global evidence network that facilitates collaboration and an exchange of experiences between policymakers could help to advance and institutionalize the use of evidence in government. We invite you to participate by taking our short survey, here.

The survey will take less than 10 minutes, and will close on October 31.

If you are not a government policymaker, you can still click on the link above and provide input in the space provided. Additionally, we encourage you to contact us at any time to learn more about our work.

We appreciate your support in forwarding the survey to other government policymakers who can help us to assess the demand for a global evidence network.

The Global Landscape Review is here!

Results for All’s just-released “100+ Government Mechanisms to Advance the Use of Data and Evidence in Policymaking: A Landscape Review” and case studies on Ghana, Kenya and Canada can be downloaded here.

By Abeba Taddese, Executive Director, Results for All

For the last 18 months, Results for America’s global Results for All initiative has been engaged in a landscape review to understand the different approaches governments are taking to create formal strategies and mechanisms – specifically, policies, programs, platforms, systems and operational practices – to advance and institutionalize the use of data and evidence in decision making.

We’ve had a fulfilling year of learning from government leaders, experts and citizens around the world, and we are eager to share some of our insights here:

  • The last 5 to 7 years have been a busy time for governments. Outside of the long-established evaluation systems in countries like Mexico, Colombia and Chile, we observe that many of the formal structures governments are putting in place to support evidence-informed policymaking (EIP) are quite recent. Separately, we note a growing body of literature on evidence-informed policymaking, notably exploring constraints or barriers to EIP, and factors that enable data- and evidence-driven decision making.                                                                                                                           
  • While institutional strategies and mechanisms are necessary and often a precondition for routine and consistent use of data and evidence in policy and programs, they aren’t enough on their own. There is widespread agreement among policymakers and evidence producers alike that policymaking is complex, multi-dimensional, and influenced by many factors. It is far from a linear “evidence in, policy out” process. Contextual factors ranging from leadership, commitment and allocation of resources to political climate, values and belief systems are critical influences in any policy process.
  • Governments are taking different, context-specific approaches to creating formal strategies and mechanisms. And they are sharing information about their processes and learning from each other. The study tours to Mexico, Colombia and the United States that helped to inform South Africa’s monitoring and evaluation system, the data-driven community safety approach in Saskatchewan, Canada (Hub) adapted from Scotland’s Violence Reduction Unit model, and the collaboration between the Office of the Prime Minister in Uganda and Malaysia’s Performance Management and Delivery Unit (PEMANDU) are a few examples that stand out.                                                                                                                                        
  • Ultimately, EIP isn’t about a specific approach or type of evidence, but rather about finding context-appropriate ways to make better use of data and evidence in real-life policy and program decisions. This last point is worth underscoring, in the spirit of ensuring that we don’t end up with a jargon-laden theoretical field that distracts the EIP community – whether government actors, nongovernmental organization (NGO) partners or the philanthropic community – from the end goal of achieving better outcomes for populations.
  • There appears to be an emphasis in government on creating structures and systems to improve access to data and evidence, while NGOs are playing a more central role in facilitating partnerships between policymakers and evidence producers as well as building the individual capacity of policymakers. Our review is not exhaustive or definitive, so we can’t say for certain why this might be the case. But we surmise that there may be a “build it and they will come” approach to the use of data and evidence in government policymaking, and that governments may prioritize spending finite resources on tangible infrastructure. For NGOs, partnership building and training activities often offer less bureaucratic and politicized entry points for supporting government efforts to advance EIP.

The landscape review is accompanied by resource tables and a series of case studies on evidence-informed policymaking training in Ghana, demographic dividend policies in Kenya, and a community safety strategy in Canada. Our goal for this body of work, which identifies more than 100 strategies and mechanisms for advancing the use of data and evidence in government policy and practice, is to promote the sharing of experiences and lessons learned among leaders in government, NGOs and other partners.

We’ll be building on this work in the months ahead, and close with a few questions we hope to explore further:

  • How effective are government strategies and mechanisms in promoting the use of data and evidence? Are there approaches that are more effective than others in improving the use of evidence, and that ultimately have the greatest impact in achieving development objectives?                                                                        
  • How can governments be best supported in their efforts to institutionalize the use of data and evidence? Could structured joint learning and networking approaches help to accelerate the adoption of strategies and mechanisms for advancing the use of data and evidence?

We are grateful to the experts interviewed for this review, who contributed their time and input (you can find many of them listed in Appendix 2 of the report), and to the William and Flora Hewlett Foundation for generously supporting this work.

We encourage you to continue visiting the Evidence in Action blog for updates. If you have questions or would like more information, please contact me at Abeba@Results4All.org. And please share your feedback with us by tweeting at @resultsforall with the hashtag #GlobalLandscapeReview.

Thanks for reading!

 

Observations from the Outgoing Executive Director

By Karen Anderson

In January 2015, we launched Results for All with a deep curiosity about how governments around the world are using data and evidence to drive outcomes. What policies, programs and practices are being used, how are they being instituted, and who are the champions for evidence use? Is evidence-informed policymaking driven by political appointees in the executive branch or by the civil service, or is the push coming from the legislative branch?

Based on our experience in the United States, through our work with Results for America, we knew that the answers would be mixed. In our case, much of the innovative work to promote the use of data and evidence is happening at the local level, with mayors and county executives understanding the need to produce more for their constituents with fewer resources. At the federal level, the G.W. Bush and Obama presidential administrations both had deep commitments to evidence-informed policymaking, instituting programs and practices that laid the groundwork for more rigorous data collection, program evaluation and outcomes-focused budgeting.

We began exploring the global evidence landscape by organizing Evidence Works 2016: A Global Forum for Government, an event we co-hosted with Nesta’s Alliance for Useful Evidence. Bringing together 140 policymakers from 40 countries – across Australia, Africa, Asia, Latin America, North America and Europe – we learned about the significant work underway in a variety of contexts, from challenges and solutions to lessons learned and best practices.

Outgoing Results for All Executive Director Karen Anderson (center) talks to two participants at the 2016 Evidence Works Forum for Government in London.

The learnings from Evidence Works 2016 served as a foundation for additional outreach and research we conducted for the landscape review of government mechanisms to advance the use of data and evidence in policymaking. This review, which will be released later this month, is the culmination of 18 months of conversations, interviews and country visits to learn more about ways in which governments around the world are institutionalizing the use of data and evidence in decision-making. Coupled with an extensive literature review, we’re confident that we’ve captured a range of examples that showcase what governments are doing to promote evidence-informed policymaking. Our hope is that this will be a useful resource that can be improved with additional knowledge and input over time.

As we finalize the landscape review, what have we learned? The short answer is that we’ve learned more than we thought possible. But here are some of my personal observations:

  • The evidence movement is relatively young and truly global. In the last five to seven years, policymakers at all levels of government and in all parts of the world have been implementing policies, platforms and practices to incorporate data and evidence into decision making. The diversity of examples will be surprising to many readers.

  • There is no single or best type of evidence. Governments are different and need a diversity of approaches for tackling their challenges. From data analytics to behavioral insights to impact evaluation, there is a broad evidence spectrum and a need for tools and resources to promote uptake across that spectrum.

  • There is a general disconnect between evidence producers and evidence users that needs to be addressed. A number of organizations and academic institutions are working to address problems of knowledge translation, and to sensitize researchers to the need for timely, relevant evidence that meets the demands of decision-makers. At the same time, governments are building skills and capacity to use outside sources of evidence that they deem credible and trustworthy. While progress is being made to close the gap, more work needs to be done, and this is a barrier to evidence use that exists in the north and south, and at all levels of government.

  • Evidence-informed policymaking can only occur if there is a sustained demand for evidence. Producing evidence in a timely and accessible manner is a first step, but without demand from policymakers for evidence, there is little chance that it will be used. In some cases, internal champions can start a movement and even build networks of support within government for evidence-informed policymaking. In other cases, outside organizations have led the charge, with direct advocacy campaigns and through building public support for evidence. This is a key area where governments can continue to learn from each other about what works and share experiences that can help propel the evidence movement forward.

  • Having the right mechanisms in place to promote evidence-informed policymaking is critical. The landscape review focuses primarily on the infrastructure, policies and practices that strengthen government’s ability to use data and evidence. We highlight the four key conditions that enable the use of data and evidence at a government or institution level: (1) commitment, (2) allocation of resources, (3) incentives, and (4) a culture that supports learning and improving. In addition to technical support, information sharing and networking can help build and strengthen capacity and know-how; we shouldn’t underestimate the power and value of peer-to-peer learning in driving the evidence agenda forward.

It has been an immense pleasure to help launch Results for All and to explore the global evidence landscape. I’ll look forward to the reactions to the landscape review and to keeping in touch with Results for All during its next phase of work.

I’m delighted to announce that Abeba Taddese, who currently serves as the Program Director for Results for All, will take over as Executive Director on July 1. I have accepted a position with the University of Chicago’s Becker Friedman Institute for Economic Research, once again focusing on evidence production — helping University of Chicago economists produce accessible and relevant research that can inform the public debate.

Thank you for welcoming us into the global evidence community and I hope that our paths cross again soon. You can continue to reach Abeba at Abeba@Results4All.org, and you should look for communications around the landscape review in the near future.

 

Improving the use of knowledge in policy – an opportunity for leading public agencies

By Vanesa Weyrauch 
Co-founder of Politics & Ideas

Are you a leader in a government agency eager to improve the use of knowledge in policy?

INASP and Politics & Ideas invite government agencies[1] to participate in an opportunity to improve the use of knowledge in policy through the application of a new diagnostic tool. This tool can help agencies clearly understand the current state of knowledge production and use to inform policy, identify windows of opportunity for change, prioritize areas for improvement and co-design feasible change plans.

This call is looking for committed change makers who are eager to initiate a process of change in their organizations or to support a process that has already started. Applications are welcome from:

  1. Individual government agencies
  2. Government agencies jointly with a local policy research institution (think tank, university centre, policy research institute, etc.)

The selected government agencies will receive:

  • A comprehensive and systematic diagnosis of their production and use of knowledge to inform policy
  • A document with prioritized areas for change
  • A tailored change plan, with concrete activities and methods to address the prioritized areas of change

How to apply

Please read the full terms of reference before applying for this call.

The deadline for applications is 4pm (GMT+1), 9 July 2017.

For any questions, please contact crichards@inasp.info


[1] Applications are invited from organizations based in low and middle income countries in Africa, Asia, Central and Eastern Europe and Latin America (as classified by the World Bank).

This post originally appeared on Politics & Ideas on June 12, 2017 and can be found at: http://www.politicsandideas.org/?p=3737

VakaYiko learning exchange inspires exhibition of evidence products in the Parliament of Ghana

Author, Kirchuffs Atengble, Programme Coordinator (VakaYiko), Ghana Information Network for Knowledge Sharing (GINKS)

“I encourage new entrants to prepare and present statements on any issue of interest. Apply the Question Time well. Your brilliant visibility will affect your re-election. I will meet with leadership on this and seek support of the leading Think Tanks in Ghana to help you deliver. A comprehensive mentoring process is vital for improved performance”. – Rt. Hon. Prof. Mike Ocquaye (Speaker, Parliament of Ghana)

The desire to develop adequate capacity for the uptake of research and other evidence has driven the Parliament of Ghana to enter into partnerships to that end. The above statement by the Speaker of the seventh Parliament of the Republic of Ghana, made at his inauguration, reaffirms the legislature’s need for evidence in its deliberative function as a major public policymaking institution in Ghana.

This blog post traces the inspiration for innovations within the information support system of Parliament (comprising the research, library, ICT, Hansard and Committees departments) and makes the case for collaboration among institutional support partners, including the Westminster Foundation for Democracy (WFD) and the VakaYiko consortium.


A brief background

A review of the information support system of the Parliament of Ghana by the VakaYiko consortium, under the leadership of INASP (a UK-based charity), found that there was little coordination around the request for and supply of evidence within the legislature.


What Works Media Project Shows the Power of Film to Tell Evidence Stories

Last month at Bloomberg Philanthropies’ What Works Cities Summit in New York City, Results for America’s What Works Media Project premiered its first film, showcasing the power of storytelling to highlight how data and evidence can improve policymaking.

The 7-minute documentary focuses on Seattle’s efforts to address homelessness, building on its work with What Works Cities to build technical capacity, enhance open data systems, and improve the City’s performance management system.

The full documentary can be found here. The next films in the series will focus on efforts to use data and evidence to improve outcomes in areas such as workforce development and early childhood education.

An In-Depth Look at Open Data in Quito

Q&A with Carolina Pozo, Director and Co-Founder of the WONDER Social Innovation Lab and former Director and Co-Founder of the LINQ Public Innovation LAB

We asked Carolina Pozo, former Director and Co-Founder of the LINQ Public Innovation LAB in Quito, Ecuador, about her work to promote open data systems in Ecuador.

1.  Please tell us about your work to promote open data platforms.

I launched the first open data platform in Ecuador, for the City of Quito, in 2014. In pursuing an open government model that promotes transparency, collaboration and citizen participation, having an open data platform is a key step. The technology was implemented in less than three months; however, any open data platform has to be accompanied by a communication strategy. Building internal and external buy-in for a new and largely unknown initiative required a great deal of lobbying among city government officials, and a significant investment in communicating the value and use of open data to citizens and other external stakeholders. Our approach started with 350 data sets and maps, which included statistics and demographic data, open budget, open contracting and real-time data. Open data is not only about transparency; it should provide data that is useful to citizens on a daily basis, such as traffic, air pollution and the availability of parking spots. It should also have an API so that programmers can access key data in real time and create mobile applications that improve public services.
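
To make the API point concrete, here is a minimal sketch of how a developer might pull records from an open data portal of this kind. The portal URL, dataset name, response shape and field names below are hypothetical placeholders rather than Quito’s actual platform; the sketch only assumes a JSON-over-HTTP endpoint of the sort many open data portals provide.

    # Minimal sketch: querying a city open data API (hypothetical endpoint and fields).
    import requests

    BASE_URL = "https://opendata.example-city.gob.ec/api"  # placeholder portal URL

    def fetch_air_quality(station_id: str, limit: int = 24) -> list:
        """Fetch the most recent air-quality readings for one monitoring station."""
        response = requests.get(
            f"{BASE_URL}/datasets/air-quality/records",
            params={"station": station_id, "limit": limit},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()["records"]  # assumed response shape

    if __name__ == "__main__":
        for reading in fetch_air_quality("centro-historico"):
            print(reading["timestamp"], reading["pm25"])  # assumed field names

A real-time feed like this is what allows third-party developers to build the kinds of citizen-facing applications described above.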

2. What specific challenges were you trying to address?

Public service delivery in the city government lacked baseline analysis and impact measurement, so most public projects were inefficient and a waste of public resources. To improve the way we address public issues, innovation is necessary. At the lab we used an open innovation process, which involves working with external stakeholders to co-create solutions to public problems. This five-step, iterative process is based on an experimental approach in which the city government collaborated with citizens and organizations to generate high-impact solutions. An important aspect is the use of data to establish a baseline and measure the impact of these solutions.

[Graphic: Impact Cycle]

Collective intelligence through collaboration and citizen participation provides more and better insights. These external stakeholders were mapped and engaged based on their expertise and the type of involvement they can have with the government, depending on the issue we want to address. We call this the innovation ecosystem; it consists of individuals and organizations in different public and private fields, locally and internationally.

[Graphic: Link Ecosystem]

 


Africa Evidence Network Launches New Survey

By Ruth Stewart, Chairperson, Africa Evidence Network

The Africa Evidence Network has launched a new survey, which can be found at https://goo.gl/forms/jDaQyLzqdxwQY7Qj2.

We want to know where existing capacity for evidence maps, systematic reviews, and other forms of syntheses lies across Africa. This survey takes no more than 10 minutes. The deadline is Tuesday, 28th February.

If you have not conducted this kind of research before but would like to, or if you are more interested in how systematic reviews, evidence maps and syntheses might be useful as part of research or decision-making frameworks, please complete the survey. After the initial questions, you can skip to the final section and tell us more in the comments box.

The Africa Evidence Network (AEN) is a community of people who work in Africa and have an interest in evidence, its production and use in decision-making. The Network is supported by the Africa Centre for Evidence within the University of Johannesburg and includes researchers, practitioners and policy-makers from universities, civil society and government. www.africaevidencenetwork.org