Evidence-Informed Policymaking Reading List | June 2019 — June 3, 2019

What to Read this Month

Reliable, Accurate Information Vital to Policymaking in Africa | Njiraini Muchira, The East African
“The policies are often not implementation ready – they are big in citing evidence to justify why action should be undertaken but weak in using evidence to determine what cost-effective interventions to implement.”
The Executive Director of the African Institute for Development Policy, Eliya Zulu, makes the case for governments to invest in research that informs how to effectively implement policies, and to create training and incentives for government personnel to use evidence in decision making.

Data Roadmaps for Sustainable Development | Global Partnership for Sustainable Development Data
“Interoperability of data and systems continues to be a major barrier for the effective collection, distribution, and use of data, and a barrier for countries trying to set up more modern data ecosystems.”
The seven countries engaged in the data roadmap process have several challenges in common: filling data gaps, sharing data across government agencies and the private sector, increasing political commitment to invest in data, and developing data skills among government officials.

Scaling Social Policy: Five Lessons from Brazil | Megan Dent, Apolitical
“Because Bolsa Família already works with low-income families, the Ministry had instant access to a list of vulnerable households with children under three.”
New antipoverty and social protection programs need a lot of detailed data to identify the poor households that will be eligible for program services; in Brazil, a new early childhood development program was able to utilize already-existing data from a national cash transfer program to target beneficiaries and scale up across the country.

Public Servants and Political Bias: Evidence from the UK Civil Service and the World Bank | Stefan Dercon, VoxDev
“An experiment shows that public servants make errors when interpreting data, incorrectly concluding that it aligns with their ideological preferences.”
The article presents further evidence that even skilled professionals are subject to confirmation bias; peer review and quality assurance units, red teaming, and other strategies could help staff interpret results correctly and be more open to findings that challenge their preconceptions.

And if you haven’t read it already, check out:
The Politics of Evidence: From Evidence-Based Policy to the Good Governance of Evidence | Justin Parkhurst, London School of Economics
“A heavy focus on individuals as the driving force to improve the use of evidence in policymaking raises two particular issues. The first has to do with the roles of researchers, who are under increasing pressure to ensure that the research evidence they produce is ‘used’ or ‘taken up’. This risks encouraging researchers to have political influence, a role that they are neither trained to do nor one that many feel they have the mandate to take on. The second problem is that such efforts can have a limited duration of impact, given that both researchers and decision makers will naturally change over time or move on from existing positions.”
Parkhurst advances a holistic ‘good governance of evidence’ approach, defined as the use of rigorous, systematic and technically valid pieces of evidence within decision making processes that are representative of, and accountable to, populations served.


What to Watch

Ghana’s Infrastructure: The Mystery of Misspending | International Growth Centre

“But just as the problem was hidden within the data, so was the solution”
Still one of our favorite examples of how administrative data can be used to diagnose and overcome challenges to effective implementation of government policies. Well worth a watch!


Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.


Evidence-Informed Policymaking Reading List | May 2019 — May 6, 2019

Contribute to a New Blog Series!

How Do We Make Research More Useful? Results for All is partnering with the Global Development Network on a new blog series that aims to capture the point of view and voice of evidence users. See here for more guidance. We welcome all contributions.

What to Read this Month
“In order to facilitate swift access to information, the foreign researcher spends money into the research environment, thus creating a peculiar market for knowledge production.”
The authors question the role of money in the production of knowledge in post-conflict contexts (e.g. payments made to interviewees and local researchers), arguing that it can restrict independent local research and exacerbate power asymmetries, and merits further scrutiny.
“Given the evolutionary roots of human cognition, we have not been surprised to find that people all over the world are prone to the same decision-making biases.”
The author explains that because people share similar human traits and biases, behavioral insights can contribute to tackling complex development challenges in a variety of cultures and economies. She emphasizes the importance of understanding context to tap into the full potential of behavioral insights and describes recent work in low- and middle-income countries.
“Foundations and evaluators will better serve the social sector by moving toward a relationship in which evaluators serve as conduits of knowledge that gather and aggregate insights across diverse contexts and organizations.”
Like with governments, philanthropic foundations often struggle to use the findings of evaluations to inform decisions and make programmatic changes. Most evaluations focus on a single foundation or program rather than sharing knowledge across institutions in the same field. This article elaborates on these challenges and proposes solutions alongside examples.
“Colleagues and other federal or state government agencies were cited as the most important sources of research information while internal agency staff were the most frequently consulted source of policy information.”
A survey of 2,000 public servants in Australia provides more evidence that academic research is not a top source of information in government decision making.
“This episode illustrates how hard it is to base policy on solid evidence when political events are moving quickly and biased, politically motivated actors are grasping at whatever straws they can find.”
In this sobering opinion piece, the author uses the example of recent increases in the local minimum wage to highlight how political decisions must at times be taken before conclusive evidence is available.

What We’re Working On

How can you strategically communicate using evidence to influence policy or program changes, mobilize supporters to take action, or seek input from citizens? We recently facilitated a workshop on communicating evidence for 3ie members using our Agency Roadmap for Building Strategic, Evidence-Based Communications Plans. The roadmap outlines key questions to help agencies identify their communications goals and target audiences, create compelling messages and engage allies, select best-fit tools and channels, and measure the results of communications activities.

What challenges do you face in communicating evidence to key audiences within or outside your agency, and what tools and resources would help you do so? Do you have any feedback on our Roadmap? We hope to hear from you!
Evidence-Informed Policymaking Reading List | April 2019 — April 2, 2019

Invitation to Contribute to a New Blog Series

How Do We Make Research More Useful? Results for All is partnering with the Global Development Network on a new blog series that aims to capture the point of view and voice of evidence users. See here for more guidance. We welcome all contributions.

What to Read this Month

“Informing public policy will require both better research as well as better public sector incentives.”

An interesting study that highlights the important role organizational incentives play in building a culture of evidence use in government. The authors found that when public officials are given authority over decision making, they invest in more accurate beliefs about the constituents they serve, and when they work in an organization that rewards information gathering, they do more of it.

“Guidance based on best practice and success stories in particular, often reflect unequal access to policymakers, institutional support, and credibility attached to certain personal characteristics.”
The authors explain that the “how to” advice that is commonly offered to academics who seek to influence policy – ensure your research is relevant and of high quality; understand the policymaking process; build relationships with policymakers, etc. – does not address the different contexts and cultures in which policies are made. They argue that this advice helps so long as researchers also have an understanding of their wider role in society and the complexities of the policymaking process.

“We need to reframe how we think about risk in a world of abundant data.”
The author agrees that while robust regulations preventing misuse of data are critical, we also need regulations to ensure that when data can be used for the public good, it always is, and cites several compelling examples to help make this case.

In this interactive policy game, players must work out how to manage spiraling demand on health services in a fictional country.
A quick and fun game in which you, as a policymaker in the Ministry of Health, must choose between options like digitizing services, launching a communications campaign, or adopting an evidence-based policy from another country. Try it out!

 “The automatic production and refinement of data analyses allow for faster, smarter decision making – and better predictions of, and responsiveness to, events.”
In this short opinion piece, the authors explain why learning about data and programming is important for policymakers, and then offer a short, hands-on course where you can learn and try basic coding.

“Perversely then, evidence-based policy is actually preventing us from innovating and collecting any new evidence or insights about what might work.”
Making a parallel to the faulty sub-prime loans that triggered the 2008 financial crisis, the author uses what he sees as faulty and inconclusive evidence behind a U.S. prison visitation program to tell a cautionary tale about evidence-based decisions that discourage further experimentation.

What We’re Working On

In case you missed it, take a look at our recent blog post to read key insights we gleaned from our last year of work, and see what we’re up to next.

Evidence-Informed Policymaking Reading List | March 2019 — March 4, 2019

What to Read this Month

Over the last year, our work at Results for All has focused on exploring how to facilitate opportunities for government policymakers to share knowledge, experiences, and lessons learned in accelerating the use of evidence to inform policy. Here we distill our observations into 7 key insights and principles for peer-to-peer learning for government. Do you agree? What’s missing? Please reply to this email or comment on the blog to share your thoughts!

“The findings suggest that, while national trends may be useful for regional and global policy advocacy, they can also be misleading.”
While Kenya has made remarkable progress in reducing child mortality, none of its 47 counties achieved their MDG goals for child mortality. The article highlights research findings that show how national aggregate levels of child mortality in Kenya mask county-level progress. The authors note the importance of setting county-specific targets and collecting data at the subnational level to better achieve and monitor progress in achieving development goals.

“Technology and digital advancements provide new sources of data that are invaluable for sustainable development, but we can only take full advantage of these opportunities if core data systems are working well.”
Too many poor people are invisible in the data and numbers that inform government decisions; investing in new data sources and collection systems is essential for policymakers to allocate resources to the people who need them.

“Based on my experiences [with this] there are seven types of policy makers, and knowing your counterpart’s type might be helpful in figuring out how to pitch your discussion.”
The author suggests that by understanding the type of policymaker they are engaging with, researchers can better tailor their approach to meet policymakers where they are, and offers 7 aspects of policymakers to consider in discussions.

“Studies to date suggest that encouraging evidence-based policymaking approaches that move beyond merely valuing evidence to actually investing in tools and personnel to reconfigure existing routines and practices are likely to yield practices that more consistently map to the evidence and yield better outcomes.”
While government policies and designated funds that incentivize evidence-based programs are helpful, they do not by themselves cultivate the right conditions for evidence to be used systematically to inform government decisions; instead, the author explains that skills, infrastructure, relationships, and trust are essential.

The guide shows how state and local governments are creating stronger, results-focused partnerships that produce meaningful and sustainable outcomes for communities in need. To see some of these recommendations in action, watch this new video featuring the Best Starts for Kids initiative from Washington State, USA and its community-focused contracting strategies to improve equity and outcomes for children and families.

Introducing…
Evidence Champion of the Month: Mapula Tshangela
Director of Climate Change Mitigation Sector Plan Implementation
Department of Environmental Affairs of South Africa


The South African Constitution requires that the Department of Environmental Affairs (DEA) ensure the right to an environment that is not harmful to the health and well-being of South Africans, in part by mitigating climate change, promoting conservation and biodiversity, and securing ecologically sustainable development and natural resource use. To achieve these diverse and challenging objectives, DEA has built strong partnerships with research entities and strategically invested in generating science and evidence relevant to the environment sector and to both short-term and long-term DEA priorities.

Ms. Tshangela has been at the forefront of this effort, for example, working with colleagues to incorporate new indicators in annual staff performance plans that explicitly link evidence production and use with policy development. In her current role, she is exploring the evidence in climate change mitigation and related action plans by local government and the private sector, together with colleagues from government and academia. “Forming and sustaining trans-disciplinary partnerships such as these has always been key to our efforts to intentional prioritization, gathering, and use of evidence,” she says. Tshangela has a background in chemistry, and symbolically wears a lab coat to work every day to demonstrate the importance of using evidence in climate policy design and implementation.

“Policy implementation change may take a decade or more,
but we can always go back to the evidence we used systematically
and documented to learn from our past decisions.”


Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.


7 Insights for Peer Learning Approaches in Evidence-Informed Policymaking — March 1, 2019

Put yourself in the shoes of a government policymaker for just one day. Your responsibilities for the day are likely to include some management and administrative tasks. Perhaps you will make remarks at a public event or be called on to brief your Minister on short notice. Maybe you will end your day in meetings with external partners and constituents.

Although you may not make important policy-level decisions on a day-to-day basis, when confronted with a policy question, the extent to which you seek out evidence to inform the issue you are trying to address will depend on your ability to access the information you need when you need it, as well as the expectations placed on you by your agency, constituents, and fellow policymakers. Even with the right intention, you may find it a challenge to systematically use evidence in your work without clear direction or guidelines from your agency on how to find, use, and communicate evidence in policymaking. You may wonder about evidence use in other agencies and direct your staff to look for examples from other contexts that could inspire a new approach in your office, and wish you had opportunities to network and learn from your peers in government.

Results for All’s work over the last year has been focused on exactly this – inspiring and accelerating progress in evidence use by bringing policymakers together to build community and learn from each other. We have highlighted what we learned about peer learning networks through our consultative and reflective research process in earlier blogs and reports, including this mapping of over 50 networks, highlights and a summary report from our peer learning workshop, and this article on the potential of networks in Africa. Here we distill these learnings into 7 insights for peer learning approaches and collaborations that aim to advance evidence use in government.

  1. Anchor your evidence use conversation in policymaker priorities. A sweeping conversation about systems, processes, and capabilities for evidence use can lose meaning and feel vague. Instead, identify policy or thematic priorities for your network and use them as an entry point to examine institutional capacity to produce and use evidence. Do policymakers have access to timely, quality, and relevant evidence to inform their policy priority? Are there gaps in technical know-how, analytical ability, or motivation that inhibit a policymaker’s ability to routinely find, use, and report on the evidence that informs this priority? Is there a role for knowledge brokers to play in facilitating increased interaction between evidence producers – including the research community, citizens, and practitioners – and policymakers as evidence users, to better inform priorities? What types of incentives could help to facilitate the use of evidence in their specific policies? Start with the problem and help policymakers think about the systems, processes, and capabilities that can be a part of the solution – but keep the conversation rooted in addressing policymaker priorities.
  2. Understand policymaker attitudes toward evidence use. Don’t stop with a conversation about challenges or the tangible dimensions of an evidence use culture such as data systems, knowledge management platforms, or evaluations. Go below the surface to deeply understand why policymakers use evidence and if and how it aligns with the way evidence is used in their agency. Explore what it would take to shift underlying assumptions towards greater evidence use – is evidence-informed policymaking perceived as being too complicated, too time-consuming, or too costly? Do staff have the resources, time, and training to find, share, and use evidence? Finally, don’t shy away from engaging on the messy dimensions of the policy process, including politics and power. Take time to understand how these factors facilitate or impede evidence use in order to help policymakers create effective strategies for addressing their policy priorities.
  3. Pay attention to practical and experiential learning. Policymakers value opportunities to network and learn from like-minded colleagues. Specifically, we note a high level of interest in tacit “how to” problem-solving learning and exchange on the political and process-oriented dimensions of using evidence, to gain experiential insights on questions such as “how did you develop a learning agenda in your evaluation plan” or “how did you use evidence to build buy-in for your policy?” We are inspired by the Joint Learning Network for Universal Health Coverage’s approach to drawing on member experiences to co-produce practical tools and knowledge products, and the Collaborative Africa Budget Reform Initiative’s (CABRI) problem-driven approach that facilitates experiential learning in addressing budget reform challenges. Find ways to strike a balance between expert-led presentations and interactive sessions that allow policymakers to draw on each other’s experiences to address evidence use challenges. Finally, ensure there is ample time for networking and building deep connections that can outlast an event or formal platform.
  4. Don’t neglect the soft skills. Beyond a technical grasp of the evidence – to find, synthesize, and use complex information in decision making – policymakers also need the skills to effectively communicate research and policy priorities internally across government and externally to their citizens. They also need communication skills to build and sustain relationships with key partners, including the research and academic communities; to engage with stakeholders in building buy-in for a policy or soliciting feedback to improve implementation; and to think strategically about longer-term evidence needs for their policy priorities. While there are many guidelines and toolkits geared towards improving how researchers communicate with policymakers, we have found fewer resources offering practical tips and guidance specifically targeted to policymakers. Engage in conversations to identify gaps in soft skills and find ways to help policymakers improve how they talk about and act on evidence.
  5. Stay for a while. Relationships and networks play an important role in building a shared identity for jointly problem-solving and co-creating strategies to improve evidence use, and for encouraging the spread of ideas and practices. But it takes more than one engagement to build the kind of trust that leads to an open and honest exchange of experiences. Tacit “how to” knowledge exchange on the softer dimensions of evidence-informed policymaking, in particular, is predicated on trust and close interaction. This type of exchange does not lend itself well to a one-off workshop, so governments, development partners, and funders, please consider long-term approaches for deepening peer-to-peer learning and accelerating progress in evidence use.
  6. Implementation challenges keep policymakers up at night. The term policymaking can be misleading in suggesting a focus on making or creating policy, but we have heard repeatedly from policymakers that one of the biggest challenges they face is translating policy to the delivery of services for citizens. Policies that are formed at the national level without considering local-level priorities and an implementation plan are often doomed from the start. Focus on finding ways to support policymakers in taking a systematic and iterative approach to using evidence in both policy design and implementation to avoid this disconnect. A good place to start could be a diagnostic self-assessment process that allows policymakers to deeply explore and understand barriers, opportunities, and strategies for strengthening evidence use practices in policy implementation.
  7. Create a safe space for authority figures and doers. Make an effort to ensure all voices are heard – senior-level policymakers with the authority to approve follow-up activities as well as practitioners and mid-level managers and analysts who will take the work forward. In some contexts, you may need to convene senior-level policymakers and practitioners separately. If your network targets multiple levels of policymakers, pay attention to the dynamic in your meeting room. Have you created a safe and inclusive space that welcomes different viewpoints and perspectives on evidence use?

What’s next for Results for All? With a generous third round of funding from the William and Flora Hewlett Foundation we are exploring how to bring these insights and the resources and tools we are developing to strengthen evidence use in government, to existing global initiatives and platforms that are broadly committed to strengthening policy and practice-level decisions in government. Rather than forming a new network dedicated exclusively to evidence use, this approach enables us to build on existing efforts and to collaborate with partner initiatives and participating governments to improve the use of evidence in their mandates. We welcome your feedback on what we have learned so far and would be interested to know of any initiatives that have an appetite for engaging with us on evidence use. We’ll be sharing regular updates on our new work with partners in our blog and monthly Evidence-Informed Policymaking Reading List, which you can subscribe to by emailing info@results4all.org.


Evidence-Informed Policymaking Reading List | February 2019 — February 7, 2019

What to Read this Month

“Allocating even a small amount of resources and personnel to apply the lessons from data and impact evaluations in policy design and implementation, and setting up systems that facilitate this institutional learning, is a crucial part of building a culture of data-driven and evidence-informed decision-making.”
The report draws on interviews with officials from 15 partner agencies and presents key insights for organizations and governments, including: the importance of explicitly making it someone’s job to apply evidence in policy design; creating dedicated spaces where evidence use is rewarded; and investing in administrative data collection and inter-agency data sharing.

Uncovering the Practices of Evidence-Informed Policy-Making | Louise Shaxson, Public Money & Management

“Although the analysis is in its early stages, it does suggest that government departments and agencies concerned to implement a holistic approach to evidence-informed policy-making could consider basing their strategies on seven core practices.”
Drawing on examples from the U.S. (Results for America’s Invest in What Works Federal Index) and the UK’s Department for Environment, Food & Rural Affairs (Defra), the paper identifies a minimum set of practices to help governments take a holistic approach to evidence-informed decision making.

“Most human beings (except us few data evangelists) are not personally moved by a long list of data gaps. But they can be deeply moved by stories of how data help solve problems they care about.”
On the eve of a global donors meeting on financing for data for the SDGs, the author encourages participants to think like a data user and take a portfolio approach to data investments.

Looking for stories of how data help solve problems? Consider these 5 short, to-the-point examples:

“But we think that what [the four cases] all had in common was a culture that emphasised, above all, responding to local needs.”
In this blog, the author shares insights from case studies that were commissioned to understand how researchers engaged with decision makers and the impact of these efforts. The author notes that a common emphasis on addressing local needs in all the case studies contributed to successful research partnerships, and shares additional insights for achieving research impact, including the importance of taking an iterative approach, networking, and ensuring the quality of evidence.

What We’re Working On

Read how global development funders are supporting governments to use data and evidence in policy decisions in our latest report. This rapid review summarizes insights from our interviews with 23 bilateral, multilateral, and philanthropic funding organizations to understand how global development funders are investing in evidence-informed decision making in government. It discusses the constraints governments face in promoting the systematic use of evidence, what funders are doing to help address these constraints, and what else is needed to build a broad culture of evidence use in governments of the Global South. Its aim is to inform a conversation among development partners that catalyzes collective action to strengthen evidence use in government. Join the conversation by sending us an email at info@results4all.org. We welcome your ideas and comments!


Evidence-Informed Policymaking Reading List | January 2019

Top Reads from 2018
Happy New Year from Results for All! As we begin to launch our activities for this year, here are the most popular items from our 2018 reading lists:
Peer learning networks are powerful platforms for bringing countries together to exchange information, learn, and collaborate. Results for All conducted research to assess whether there is demand in government for a peer learning network that could help policymakers share experiences and amplify solutions to advance the use of evidence in policy. To inform a network strategy and avoid potential overlap with existing initiatives, we engaged in a comprehensive process of identifying, researching, and mapping active networks that have similar missions.

The subsequent report classifies 50+ networks for peer learning and evidence use in government; describes 8 modes of engaging network members; and synthesizes research and key informant interviews into 13 lessons on network organization and engagement. We then match select networks against 5 key criteria, and conclude that new efforts premised on these criteria could support evidence-informed policymaking and add value to current initiatives.
What happens when teams of government policymakers from nine countries meet in Nairobi, Kenya to discuss using evidence to improve policy implementation? The peer-learning exchange we hosted from July 23-25 featured sessions on the use of administrative data to inform policy implementation, how to incentivize evidence use and build an evaluative culture, and how to engage citizens in data collection, among others.
Curious about what happened in Nairobi, and what we learned? You can watch videos, download the summary report, and explore the insights here.
“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.
“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway from the paper is that the more community members were involved in generating and presenting evidence, the greater the likelihood that it would be used to address service delivery challenges.
“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogenous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.
“…we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work. These assumptions determine whether values become empty statements and artifacts gather dust or, ultimately, whether evidence use becomes a cultural norm.”
Policymakers often speak of creating a culture of evidence use and learning. In this blog, the Executive Director of Results for All describes different levels of culture, how they apply to evidence-informed policymaking, and six steps governments can take to shift their cultures toward evidence use.


Introducing…
Evidence Champion of the Month: Ebenezer Appah-Sampong
Deputy Executive Director for Technical Services, 
Environmental Protection Agency of Ghana

Established in 1994, the Environmental Protection Agency (EPA) is the leading public body for protecting and improving the environment in Ghana, as well as for seeking common solutions to global environmental problems. To diagnose and address environmental protection issues, the EPA needs to produce, source, analyze, and use a variety of evidence, including qualitative and quantitative research, monitoring and evaluation of programs, tacit knowledge from practitioners, and feedback from citizens. For example, evidence from drone mapping of small-scale mining sites informed the development and implementation of a digital Compliance Monitoring System to address non-compliance with mining permit conditions, a major contributor to environmental degradation.

To better understand the factors affecting the use of evidence at the EPA, Mr. Sampong and his team worked with INASP and Politics & Ideas to pilot a diagnostic assessment and develop an Evidence-Informed Policymaking Change Plan. Mr. Sampong also led a team from EPA to participate in a peer learning workshop for government policymakers hosted by Results for All in July 2018, in which he discussed how better data collection and community engagement would be key to the success of a waste separation and recycling program in Ghana’s capital, Accra.

“To be evidence champion of the month means everything to me. It motivates me to do more and also encourage my peers in other agencies to see the value in evidence for their work. Evidence is the game changer in our efforts to become a global center of excellence in environmental protection.”

Evidence-Informed Policymaking Reading List | December 2018 — December 4, 2018


What to Read this Month

“We propose that future programmes should consider multiple interventions that can be combined to meet needs as individuals, teams and organisations seek to increase their awareness, capability and implementation of evidence-informed decision-making.”

The authors, all researchers at the University of Johannesburg’s Africa Centre for Evidence, construct a model based on five dimensions: a cycle in which decision makers move from awareness of evidence-informed decision making (EIDM) to capability for EIDM and actual evidence use; the different needs and evidence-use capabilities of individuals, teams, organizations, and institutions; different outputs related to increased capacity for EIDM, including but not limited to specific policy changes; contextual factors that influence evidence use; and a variety of interventions that can create incremental changes and complement each other.
“Despite these enormous investments in M&E systems, staff in donor and government agencies report little to no utilization of M&E data for decision-making.”
The authors discuss the importance of first understanding the available “decision space” in order to better anticipate data needs and system requirements, rather than starting with technical considerations of what data must be captured and how. They define decision space as institutional policies, programmatic goals, individual operational tasks, incentives, and authority over financial and human resources, and argue that this approach can increase the use of data for decision making.
“‘Facts are one part; just as guilt does not inspire initiative, people will not act on facts alone. We are inspired to act by emotional and physical experience.'”
An interesting exploration of how policymaking can borrow approaches from art to: improve how we understand and connect to data (design an interactive, three-dimensional visualization); explore how future research will be influenced by the culture of tomorrow’s scientists (consider a futuristic museum to stimulate thinking about how research is conducted today and how it might change); and foster unconventional thinking (use a participatory arts approach to fill in a blank canvas).

Calls for Greater Role for Universities in Policy-making | Gilbert Nakweya, University World News

“Building the capacity of research in African universities would strengthen their contribution to policy-making and innovation…”
A brief article summarizing perspectives from the Sixth African Higher Education Week, held recently in Nairobi, which makes the case for investing in African universities as research partners to inform development policy.
“A million-dollar evaluation of a home visiting program for new mothers might find it didn’t improve health outcomes – not because it’s a bad program, but simply because workers didn’t make all the scheduled visits. Costly, multi-year trials can end up revealing only that you can’t help people with a policy that’s not actually implemented.”
A new round of funding from the World Bank’s Strategic Impact Evaluation Fund (SIEF) will focus on quicker and lower-cost ‘nimble evaluations’ to test the effectiveness of different ways to implement policies and programs, rather than waiting until after they are implemented (or not) to measure their results.
“My point here is simple – while we should approach interactions with policymakers and practitioners with optimism, we should not expect it to work miracles or remove barriers that are actually present.”
The author argues that improving the environmental science-to-policy interface requires not more study of potential solutions, which has yielded little change, but action to address institutional barriers to progress and to reform incentive structures.
“Evidence on its own is unlikely to foster change unless accompanied by effective campaigning, political mobilization and other forms of influencing.”
The article offers insights from Oxfam’s experience and highlights four strategies that have contributed to the effectiveness of its campaigns: 1) understanding the political system, including what needs to change, who has the power to achieve change, and how to achieve it; 2) getting the timing, framing, and presentation of evidence right to maximize influence on target audiences; 3) drawing on a range of strategies to influence policy from both the inside and the outside; and 4) reflecting and iterating to improve.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.
Evidence-Informed Policymaking Reading List | November 2018 — November 6, 2018


What to Read this Month
“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogenous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.
“Geospatial impact evaluations (GIEs) use precisely georeferenced intervention data and outcome data to establish a counterfactual retroactively, eliminating the need to randomly assign individuals, firms, or communities into treatment and control groups at the outset of a program.”
A short article describing how USAID used geospatial data to understand the impact of a $900 million investment in rural infrastructure in the West Bank/Gaza. The evaluation team found strong evidence that local economic output, as measured by remotely sensed nighttime light intensity, increased as a result of the rural infrastructure program. The authors highlight GIEs as a promising approach for rigorously evaluating programs in fragile states where the collection of baseline and endline data is challenging and costly.
“Nevertheless, the process has taken a relatively long time and has required a very substantial body of evidence generated from interventions reliably funded by donors motivated for change, the implementation of multiple knowledge transfers strategies, the efforts of collective and individual political entrepreneurs, effective and patient advocacy coalitions, aided by a major window of opportunity that they seized and used to good effect.”
The authors – a researcher and policymaker team – conducted a reflective analysis of a major change in health financing policy in Burkina Faso between 2008 and 2018, which they supported in their respective roles. They share practical lessons for strengthening evidence-informed decision making, including the importance of: persistent and consistent production of rigorous and useful knowledge; fostering early interaction and engagement between the research and decision making communities; understanding the political and socio-economic context in which decisions are made; and seizing windows of opportunity for change.
 
“Without these forms of documentation, population statistics – which inform a range of policy decisions – are incomplete at best, and wrong at worst. Many low-income countries base their poverty estimates on data that is more than a decade old.”
The authors make a compelling case for policymakers to invest in data that help inform critical decisions like where to build schools or direct medical resources, and to make that data publicly accessible so that communities and social entrepreneurs can help identify solutions that work. “Policymaking without high-quality public data is governing by guesswork,” they write.

“In the current context of low priority, and weak institutional support and technical capacity to enable evidence use in decision making and debate in African parliaments, the network’s activities respond to some of the key barriers hindering parliamentarians from using evidence in their work.”
The authors describe their research on the contribution of the Network of African Parliamentary Committees on Health (NEAPACOH) to the evidence ecosystem in African parliaments. Annual network meetings serve as a platform for sharing evidence, building demand and capacity for evidence use, and strengthening partnerships between MPs and researchers, while also creating a sense of accountability and competition around the commitments made each year. A further key insight from their research is the importance of a mechanism for relaying commitments made at the annual regional workshops back to national parliaments, to improve follow-through on realizing them.

 

What We’re Working On

 

How are public sector officials incentivized to use evidence routinely in their work, whether to inform major policies and decisions, design or alter programs, or guide implementation? Our new series highlights strategies that government agencies around the world have used to create incentives for using evidence in decision making. Take a look at our first five case studies from Mexico, Sierra Leone, and South Africa!

 

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

 

 

Evidence-Informed Policymaking Reading List | October 2018


What to Read this Month

“To put it bluntly, decades of stunning progress in the fight against poverty and disease may be on the verge of stalling. This is because the poorest parts of the world are growing faster than everywhere else; more babies are being born in the places where it’s hardest to lead a healthy and productive life. If current trends continue, the number of poor people in the world will stop falling — and could even start to rise.”
A great report on the world’s progress and likelihood of achieving the Sustainable Development Goals, with easy-to-digest graphs on key indicators, and stories behind the data about family planning, HIV, education, and agriculture. Take a moment to test your own knowledge with the interactive, six-question “data check” quiz on global poverty and population trends.

“UIS estimates that solid data on learning — gauging whether policies and programs are working, or reforms are needed — could improve education spending efficiency by 5 percent, generating $30 million/year in savings in the average country, paying for the assessments hundreds of times over.”
The author stresses that three years after the SDGs were adopted, there are still 100 countries with no data on student learning, and that two international literacy and math tests offer good standards for measuring progress toward global education goals.

“The average cost of an ambulance ride is $500, and in 2017 the County was able to avoid approximately 1,300 unnecessary rides — a health care system cost-savings of approximately $260,000.”
This case study documents how health and human services officials teamed up with fire and rescue leaders to address the rising volume of 911 emergency medical services calls from a small number of frequent callers. By sharing data across agencies, partnering with local hospitals, and providing home visits to some high-frequency callers, the County has seen a more than 50 percent reduction in 911 calls from the residents engaged in the initiative, saving public resources while proactively providing residents with the services they need. To learn more about how US local governments are using data and evidence to improve government performance and outcomes for residents, see additional case studies here.
 
“Only by finding out what doesn’t work — and being transparent about it — can we identify where money can be saved and re-invested in effective interventions.”
The author shares five lessons from his work as policy advisor for the UK What Works Network: 1) RCTs are not the only way of assessing impact; 2) it can be socially acceptable to experiment on children (see the Education Endowment Foundation’s work); 3) it is equally important to learn from what does not work; 4) evidence use doesn’t happen on its own; and 5) short-term effects do not necessarily translate into long-term outcomes.

What to Watch

“The way an issue passes from a vague idea into a piece of legislation or concrete sort of proposal shapes the kind of research you do…You need to consider where you are in this policy funnel, and that should shape the research.” (See 15:00 to 17:30)
Duncan Green, Oxfam Strategic Advisor, discusses how to understand the power and change processes within the system you are trying to influence, how to think about framing and timing of the issue, and how to combine research with media campaigns and lobbying to have policy influence.

A short online course for public servants to help you understand how to craft a narrative about your policy idea; write a compelling policy article or blog; or build a great presentation about your policy or idea.

What We’re Working On

We recently returned from Pretoria, South Africa, where we attended the Evidence 2018 conference hosted by the Africa Evidence Network (AEN). You can read about the event on the AEN blog and see pictures here. During the conference, we led two sessions on the potential of networks and peer learning opportunities to support government policymakers in advancing the use of evidence in decision making. You can read some of our thinking on the subject in a recent Results for All blog post, here.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.