Evidence-Informed Policymaking Reading List | April 2019 — April 2, 2019


Invitation to Contribute to a New Blog Series

How Do We Make Research More Useful? Results for All is partnering with the Global Development Network on a new blog series that aims to capture the point of view and voice of evidence users. See here for more guidance. We welcome all contributions.

What to Read this Month

“Informing public policy will require both better research as well as better public sector incentives.”

An interesting study that highlights the important role organizational incentives play in building a culture of evidence use in government. The authors found that when public officials are given authority over decision making, they invest in more accurate beliefs about the constituents they serve, and when they work in an organization that rewards information gathering, they do more of it.

“Guidance based on best practice and success stories in particular, often reflect unequal access to policymakers, institutional support, and credibility attached to certain personal characteristics.”
The authors explain that the “how to” advice that is commonly offered to academics who seek to influence policy – ensure your research is relevant and of high quality; understand the policymaking process; build relationships with policymakers, etc. – does not address the different contexts and cultures in which policies are made. They argue that this advice helps so long as researchers also have an understanding of their wider role in society and the complexities of the policymaking process.

“We need to reframe how we think about risk in a world of abundant data.”
The author argues that while robust regulations preventing misuse of data are critical, we also need regulations to ensure that when data can be used for the public good, it always is, and cites several compelling examples to make this case.

In this interactive policy game, players must work out how to manage spiraling demand on health services in a fictional country.
A quick and fun game in which you, as a policymaker in the Ministry of Health, must choose between options like digitizing services, launching a communications campaign, or adopting an evidence-based policy from another country. Try it out!

“The automatic production and refinement of data analyses allow for faster, smarter decision making – and better predictions of, and responsiveness to, events.”
In this short opinion piece, the authors explain why learning about data and programming is important for policymakers, and then offer a short, hands-on course where you can learn and try basic coding.

“Perversely then, evidence-based policy is actually preventing us from innovating and collecting any new evidence or insights about what might work.”
Making a parallel to the faulty sub-prime loans that triggered the 2008 financial crisis, the author uses what he sees as faulty and inconclusive evidence behind a U.S. prison visitation program to tell a cautionary tale about evidence-based decisions that discourage further experimentation.

What We’re Working On

In case you missed it, take a look at our recent blog post to read key insights we gleaned from our last year of work, and see what we’re up to next.

Evidence-Informed Policymaking Reading List | March 2019 — March 4, 2019


What to Read this Month

Over the last year, our work at Results for All has focused on exploring how to facilitate opportunities for government policymakers to share knowledge, experiences, and lessons learned in accelerating the use of evidence to inform policy. Here we distill our observations into 7 key insights and principles for peer-to-peer learning for government. Do you agree? What’s missing? Please reply to this email or comment on the blog to share your thoughts!

“The findings suggest that, while national trends may be useful for regional and global policy advocacy, they can also be misleading.”
While Kenya has made remarkable progress in reducing child mortality, none of its 47 counties achieved their MDG goals for child mortality. The article highlights research findings that show how national aggregate levels of child mortality in Kenya mask county-level progress. The authors note the importance of setting county-specific targets and collecting data at the subnational level to better achieve and monitor progress in achieving development goals.

“Technology and digital advancements provide new sources of data that are invaluable for sustainable development, but we can only take full advantage of these opportunities if core data systems are working well.”
Too many poor people are invisible in the data and numbers that inform government decisions; investing in new data sources and collection systems is essential for policymakers to allocate resources to the people who need them.

“Based on my experiences [with this] there are seven types of policy makers, and knowing your counterpart’s type might be helpful in figuring out how to pitch your discussion.”
The author suggests that by understanding the type of policymaker they are engaging with, researchers can better tailor their approach to meet policymakers where they are, and describes seven types of policymakers to consider in discussions.

“Studies to date suggest that encouraging evidence-based policymaking approaches that move beyond merely valuing evidence to actually investing in tools and personnel to reconfigure existing routines and practices are likely to yield practices that more consistently map to the evidence and yield better outcomes.”
While government policies and designated funds that incentivize evidence-based programs are helpful, they do not by themselves cultivate the right conditions for evidence to be used systematically to inform government decisions; instead, the author explains that skills, infrastructure, relationships, and trust are essential.

The guide shows how state and local governments are creating stronger, results-focused partnerships that produce meaningful and sustainable outcomes for communities in need. To see some of these recommendations in action, watch this new video featuring the Best Starts for Kids initiative from Washington State, USA and its community-focused contracting strategies to improve equity and outcomes for children and families.

Introducing…
Evidence Champion of the Month: Mapula Tshangela
Director of Climate Change Mitigation Sector Plan Implementation
Department of Environmental Affairs of South Africa

 


The South African Constitution requires that the Department of Environmental Affairs (DEA) ensure the right to an environment that is not harmful to the health and well-being of South Africans, in part by mitigating climate change, promoting conservation and biodiversity, and securing ecologically sustainable development and natural resource use. To achieve these diverse and challenging objectives, DEA has built strong partnerships with research entities and strategically invested in generating science and evidence relevant to the environment sector and to both short-term and long-term DEA priorities.

Ms. Tshangela has been at the forefront of this effort, for example, working with colleagues to incorporate new indicators in annual staff performance plans that explicitly link evidence production and use with policy development. In her current role, she is exploring the evidence in climate change mitigation and related action plans by local government and the private sector, together with colleagues from government and academia. “Forming and sustaining trans-disciplinary partnerships such as these has always been key to our efforts to intentional prioritization, gathering, and use of evidence,” she says. Tshangela has a background in chemistry, and symbolically wears a lab coat to work every day to demonstrate the importance of using evidence in climate policy design and implementation.

“Policy implementation change may take a decade or more, but we can always go back to the evidence we used systematically and documented to learn from our past decisions.”


Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

 

Evidence-Informed Policymaking Reading List | February 2019 — February 7, 2019


What to Read this Month

“Allocating even a small amount of resources and personnel to apply the lessons from data and impact evaluations in policy design and implementation, and setting up systems that facilitate this institutional learning, is a crucial part of building a culture of data-driven and evidence-informed decision-making.”
The report draws on interviews with officials from 15 partner agencies and presents key insights for organizations and governments, including: the importance of explicitly making it someone’s job to apply evidence in policy design; creating dedicated spaces where evidence use is rewarded; and investing in administrative data collection and inter-agency data sharing.

Uncovering the Practices of Evidence-Informed Policy-Making | Louise Shaxson, Public Money & Management

“Although the analysis is in its early stages, it does suggest that government departments and agencies concerned to implement a holistic approach to evidence-informed policy-making could consider basing their strategies on seven core practices.”
Drawing on examples from the U.S. (Results for America’s Invest in What Works Federal Index) and the UK’s Department for Environment, Food & Rural Affairs (Defra), the paper identifies a minimum set of practices to help governments take a holistic approach to evidence-informed decision making.

“Most human beings (except us few data evangelists) are not personally moved by a long list of data gaps. But they can be deeply moved by stories of how data help solve problems they care about.”
On the eve of a global donors meeting on financing for data for the SDGs, the author encourages participants to think like a data user and take a portfolio approach to data investments.

Looking for stories of how data help solve problems? Consider these 5 short, to-the-point examples:

“But we think that what [the four cases] all had in common was a culture that emphasised, above all, responding to local needs.”
In this blog, the author shares insights from case studies that were commissioned to understand how researchers engaged with decision makers and the impact of these efforts. The author notes that a common emphasis on addressing local needs in all the case studies contributed to successful research partnerships, and shares additional insights for achieving research impact, including the importance of taking an iterative approach, networking, and ensuring the quality of evidence.

What We’re Working On

Read how global development funders are supporting governments to use data and evidence in policy decisions in our latest report. This rapid review summarizes insights from our interviews with 23 bilateral, multilateral, and philanthropic funding organizations to understand how global development funders are investing in evidence-informed decision making in government. It discusses the constraints governments face in promoting the systematic use of evidence, what funders are doing to help address these constraints, and what else is needed to build a broad culture of evidence use in governments of the Global South. Its aim is to inform a conversation among development partners that catalyzes collective action to strengthen evidence use in government. Join the conversation by sending us an email at info@results4all.org. We welcome your ideas and comments!

 

Evidence-Informed Policymaking Reading List | January 2019


Top Reads from 2018
Happy New Year from Results for All! As we begin to launch our activities for this year, here are the most popular items from our 2018 reading lists:
Peer learning networks are powerful platforms for bringing countries together to exchange information, learn, and collaborate. Results for All conducted research to assess whether there is demand in government for a peer learning network that could help policymakers share experiences and amplify solutions to advance the use of evidence in policy. To inform a network strategy and avoid potential overlap with existing initiatives, we engaged in a comprehensive process of identifying, researching, and mapping active networks that have similar missions.

The subsequent report classifies 50+ networks for peer learning and evidence use in government; describes 8 modes of engaging network members; and synthesizes research and key informant interviews into 13 lessons on network organization and engagement. We then match select networks against 5 key criteria, and conclude that new efforts premised on these criteria could support evidence-informed policymaking and add value to current initiatives.

What happens when teams of government policymakers from nine countries meet in Nairobi, Kenya to discuss using evidence to improve policy implementation? The peer-learning exchange we hosted from July 23-25 featured sessions on the use of administrative data to inform policy implementation, how to incentivize evidence use and build an evaluative culture, and how to engage citizens in data collection, among others.

Curious about what happened in Nairobi, and what we learned? You can watch videos, download the summary report, and explore the insights here.

“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.

“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway from the paper is that the more community members were involved in generating and presenting evidence, the greater the likelihood that it would be used to address service delivery challenges.

“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogenous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.

“…we will need to scratch beneath the surface to investigate the underlying assumptions that influence whether individuals and organizations actually use evidence in their work. These assumptions determine whether values become empty statements and artifacts gather dust or, ultimately, whether evidence use becomes a cultural norm.”
Policymakers often speak of creating a culture of evidence use and learning. In this blog, the Executive Director of Results for All describes different levels of culture, how they apply to evidence-informed policymaking, and six steps governments can take to shift their cultures toward evidence use.


Introducing…
Evidence Champion of the Month: Ebenezer Appah-Sampong
Deputy Executive Director for Technical Services, 
Environmental Protection Agency of Ghana

Established in 1994, the Environmental Protection Agency (EPA) is the leading public body for protecting and improving the environment in Ghana as well as seeking common solutions to global environmental problems. To diagnose and address environmental protection issues, the EPA needs to produce, source, analyze, and use a variety of evidence, including qualitative and quantitative research, monitoring and evaluation of programs, tacit knowledge from practitioners, and feedback from citizens. For example, evidence from drone mapping of small-scale mining sites was used in the development and implementation of a digital Compliance Monitoring System to address non-compliance with mining permit conditions, which was a major contributor to environmental degradation.

To better understand the factors affecting the use of evidence at the EPA, Mr. Sampong and his team worked with INASP and Politics & Ideas to pilot a diagnostic assessment and develop an Evidence-Informed Policymaking Change Plan. Mr. Sampong also led a team from EPA to participate in a peer learning workshop for government policymakers hosted by Results for All in July 2018, in which he discussed how better data collection and community engagement would be key to the success of a waste separation and recycling program in Ghana’s capital, Accra.

“To be evidence champion of the month means everything to me. It motivates me to do more and also encourage my peers in other agencies to see the value in evidence for their work. Evidence is the game changer in our efforts to become a global center of excellence in environmental protection.”

Evidence-Informed Policymaking Reading List | December 2018 — December 4, 2018


What to Read this Month

“We propose that future programmes should consider multiple interventions that can be combined to meet needs as individuals, teams and organisations seek to increase their awareness, capability and implementation of evidence-informed decision-making.”

The authors, all researchers at the University of Johannesburg’s Africa Centre for Evidence, construct a model based on five dimensions: a cycle in which decision makers move from awareness of evidence-informed decision making (EIDM) to capability for EIDM and actual evidence use; the different needs and evidence use capabilities of individuals, teams, organizations, and institutions; different outputs related to increased capacity for EIDM, including but not limited to specific policy changes; contextual factors that influence evidence use; and a variety of interventions that can create incremental changes and complement each other.

“Despite these enormous investments in M&E systems, staff in donor and government agencies report little to no utilization of M&E data for decision-making.”
The authors discuss the importance of first understanding the available “decision space” in order to better anticipate data needs and system requirements, rather than starting with technical considerations of what data must be captured and how. They define decision space as institutional policies, programmatic goals, individual operational tasks, incentives, and authority over financial and human resources, and argue that this approach can increase the use of data for decision making.

“‘Facts are one part; just as guilt does not inspire initiative, people will not act on facts alone. We are inspired to act by emotional and physical experience.’”
An interesting exploration of how policymaking can borrow approaches from art to: improve how we understand and connect to data (design an interactive three dimensional visualization); explore how future research will be influenced by the culture of tomorrow’s scientists (consider a futuristic museum to stimulate thinking about the way research is currently conducted and how it might change in the future); and foster unconventional thinking (use a participatory arts approach to fill in a blank canvas).

Calls for Greater Role for Universities in Policy-making | Gilbert Nakweya, University World News

“Building the capacity of research in African universities would strengthen their contribution to policy-making and innovation…”
A brief article summarizing perspectives from the Sixth African Higher Education Week, held recently in Nairobi, that makes the case for investing in African universities as research partners to inform development policy.

“A million-dollar evaluation of a home visiting program for new mothers might find it didn’t improve health outcomes – not because it’s a bad program, but simply because workers didn’t make all the scheduled visits. Costly, multi-year trials can end up revealing only that you can’t help people with a policy that’s not actually implemented.”
A new round of funding from the World Bank’s Strategic Impact Evaluation Fund (SIEF) will focus on quicker and lower-cost ‘nimble evaluations’ to test the effectiveness of different ways to implement policies and programs, rather than waiting until after they are implemented (or not) to measure their results.

“My point here is simple – while we should approach interactions with policymakers and practitioners with optimism, we should not expect it to work miracles or remove barriers that are actually present.”
The author argues that what is needed is not more study about solutions for improving the environmental science-to-policy interface, which don’t seem to be evolving, but action to address institutional barriers to progress and reform incentive structures.

“Evidence on its own is unlikely to foster change unless accompanied by effective campaigning, political mobilization and other forms of influencing.”
The article offers insights from Oxfam’s experience and highlights four strategies that have contributed to the effectiveness of its campaigns: 1) developing an understanding of a political system to understand what needs to change, who has the power to achieve change, and how to achieve change; 2) getting timing right, framing, and presenting evidence to maximize influence on target audiences; 3) drawing on a range of strategies to influence policy both on the inside and outside; and 4) reflection and iterative improvements.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | November 2018 — November 6, 2018


What to Read this Month
“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogenous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.
“Geospatial impact evaluations (GIEs) use precisely georeferenced intervention data and outcome data to establish a counterfactual retroactively, eliminating the need to randomly assign individuals, firms, or communities into treatment and control groups at the outset of a program.”
A short article describing how USAID used geospatial data to understand the impact of a $900 million investment in rural infrastructure in the West Bank/Gaza. The evaluation team found strong evidence that local economic output, as measured by remotely sensed nighttime light intensity, increased as a result of the rural infrastructure program. The authors highlight GIEs as a promising approach for rigorously evaluating programs in fragile states where the collection of baseline and endline data is challenging and costly.
“Nevertheless, the process has taken a relatively long time and has required a very substantial body of evidence generated from interventions reliably funded by donors motivated for change, the implementation of multiple knowledge transfers strategies, the efforts of collective and individual political entrepreneurs, effective and patient advocacy coalitions, aided by a major window of opportunity that they seized and used to good effect.”
The authors – a researcher and policymaker team – conducted a reflective analysis of a major change in health financing policy in Burkina Faso between 2008 and 2018, which they supported in their respective roles. They share practical lessons for strengthening evidence-informed decision making, including the importance of: persistent and consistent production of rigorous and useful knowledge; fostering early interaction and engagement between the research and decision making communities; understanding the political and socio-economic context in which decisions are made; and seizing windows of opportunity for change.
 
“Without these forms of documentation, population statistics – which inform a range of policy decisions – are incomplete at best, and wrong at worst. Many low-income countries base their poverty estimates on data that is more than a decade old.”
The authors make a compelling case for policymakers to invest in data that help inform critical decisions like where to build schools or direct medical resources, and to make that data publicly accessible so that communities and social entrepreneurs can help identify solutions that work. “Policymaking without high-quality public data is governing by guesswork,” they write.

“In the current context of low priority, and weak institutional support and technical capacity to enable evidence use in decision making and debate in African parliaments, the network’s activities respond to some of the key barriers hindering parliamentarians from using evidence in their work.”
The authors describe their research to understand the contribution of the Network of African Parliamentary Committees on Health (NEAPACOH) to the evidence ecosystem in African parliaments. Annual network meetings serve as a platform for sharing evidence, building demand and capacity for increased use, and strengthening partnerships between MPs and researchers, as well as creating a sense of accountability and competition for following through on commitments made each year. An additional key insight from their research is the importance of creating a mechanism for sharing the commitments made at these annual regional workshops with national parliaments, to make better progress in realizing them.

 

What We’re Working On

 

How are public sector officials incentivized to use evidence routinely in their work, whether to inform major policies and decisions, design or alter programs, or guide implementation? Our new series highlights strategies that government agencies around the world have used to create incentives for using evidence in decision making. Take a look at our first five case studies from Mexico, Sierra Leone, and South Africa!

 

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

 

 

Evidence-Informed Policymaking Reading List | October 2018


What to Read this Month

“To put it bluntly, decades of stunning progress in the fight against poverty and disease may be on the verge of stalling. This is because the poorest parts of the world are growing faster than everywhere else; more babies are being born in the places where it’s hardest to lead a healthy and productive life. If current trends continue, the number of poor people in the world will stop falling — and could even start to rise.”
A great report on the world’s progress and likelihood of achieving the Sustainable Development Goals, with easy-to-digest graphs on key indicators, and stories behind the data about family planning, HIV, education, and agriculture. Take a moment to test your own knowledge with the interactive, six-question “data check” quiz on global poverty and population trends.

“UIS estimates that solid data on learning — gauging whether policies and programs are working, or reforms are needed — could improve education spending efficiency by 5 percent, generating $30 million/year in savings in the average country, paying for the assessments hundreds of times over.”
The author stresses that three years after the SDGs were adopted, there are still 100 countries with no data on student learning, and that two international literacy and math tests offer good standards for measuring progress toward global education goals.

“The average cost of an ambulance ride is $500, and in 2017 the County was able to avoid approximately 1,300 unnecessary rides — a health care system cost-savings of approximately $260,000.”
This case study documents how health and human services officials teamed up with fire and rescue leaders to address the rising volume of 911 emergency medical services calls from a small number of frequent callers. By sharing data across agencies, partnering with local hospitals, and providing home visits to some high-frequency callers, the County has seen a more than 50 percent reduction in 911 calls from the residents engaged in the initiative, saving public resources while proactively providing residents with the services they need. To learn more about how US local governments are using data and evidence to improve government performance and outcomes for residents, see additional case studies here.
 
“Only by finding out what doesn’t work — and being transparent about it — can we identify where money can be saved and re-invested in effective interventions.”
The author shares five lessons from his work as policy advisor for the UK What Works Network: 1) RCTs are not the only way of assessing impact; 2) it can be socially acceptable to experiment on children (see the Education Endowment Foundation’s work); 3) it is equally important to learn from what does not work; 4) evidence use doesn’t happen on its own; and 5) short-term effects do not necessarily translate to long-term outcomes.

What to Watch

“The way an issue passes from a vague idea into a piece of legislation or concrete sort of proposal shapes the kind of research you do…You need to consider where you are in this policy funnel, and that should shape the research.” (See 15:00 to 17:30)
Duncan Green, Oxfam Strategic Advisor, discusses how to understand the power and change processes within the system you are trying to influence, how to think about framing and timing of the issue, and how to combine research with media campaigns and lobbying to have policy influence.

A short online course for public servants to help you understand how to craft a narrative about your policy idea; write a compelling policy article or blog; or build a great presentation about your policy or idea.

What We’re Working On

We recently returned from Pretoria, South Africa, where we attended the Evidence 2018 conference hosted by the Africa Evidence Network (AEN). You can read about the event on the AEN blog and see pictures here. During the conference, we led two sessions on the potential of networks and peer learning opportunities to support government policymakers in advancing the use of evidence in decision making. You can read some of our thinking on the subject in a recent Results for All blog post, here.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | September 2018 — September 10, 2018


Our New Summary Report is Here!


What happens when teams of government policymakers from nine countries meet in Nairobi, Kenya to discuss using evidence to improve policy implementation?

The peer-learning exchange we hosted from July 23-25 featured sessions on using administrative data to inform policy implementation, incentivizing evidence use and building an evaluative culture, and engaging citizens in data collection, among other topics.

Curious about what happened in Nairobi, and what we learned? You can watch videos, download the summary report, and explore the insights here.


What to Read this Month

“When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles due to the culture and leadership of the institution.”

This two-part blog series follows the partnership between the Millennium Challenge Corporation (MCC) and the Government of El Salvador. The first installment describes the evidence workshop for policymakers held in July 2016 to discuss and promote the use of findings from evaluations of previous MCC-funded programs in the country. The second installment highlights interviews with the organizers and participants, considering the impact on Salvadoran policymakers, what has changed in the last two years, and the remaining challenges to evidence-informed policymaking in the country. The blogs are also cross-posted on MCC’s website.
“So the dashboard contains data about every aspect of education other than whether it is accomplishing its purpose: to teach children the skills and competencies that prepare them to be successful adults.”
Using a metaphor about a 2009 plane crash, Pritchett examines educational performance in Indonesia and India and argues that dashboards with too much information – particularly if focused on inputs without clear connections to priority outcomes – can be worse than having no dashboard at all.

“It is common knowledge within academic circles, regularly re-echoed at EIPM forums, that research produced by such institutions mostly in the form of theses and dissertations do not in any way inform policy.”
The author shares his perspective on the state of evidence-informed policymaking in Ghana, including challenges and promising opportunities to improve access to policy-relevant research and data, and the role of knowledge brokers.

“Societal inequality is exacerbated when one in five children worldwide do not complete upper primary school; understanding their motivations to drop out is crucial.”

The article describes findings from research on the impact of an Indian government initiative to build latrines in schools. Latrine construction improved educational outcomes such as enrolment, dropout rates, and the number of students who sat for and passed exams, for both boys and girls. The research also shows that school sanitation reduces gender disparities only when sex-specific latrines are built for older girls; unisex latrines are mostly sufficient at younger ages.

“In short, I feel that economists need to be cautious and modest when it comes to giving policy advice, let alone getting actively involved in ‘policy design.'”
The author notes that giving advice requires more than evidence, as public policy is a reflection of values and objectives. Further, the advice one gives depends on who one advises, with economists representing one of many potential views. Additionally, giving advice requires a familiarity with the implementation of policy, which most economists do not have, and should be regarded as a political act more than a scientific one, requiring collaboration and partnership with a broad range of stakeholders.

“Information from the government, distributed by the UN in the midst of the crisis, on the other hand, was ‘completely off’, Bengtsson said.”
After negotiating legal agreements with mobile phone operators to access location data, a Swedish NGO is able to determine where people go during a crisis, which can help governments and aid agencies prepare and respond.

What to Watch

Could using video help you tell your stories, convey emotion, reach a wider audience, and help turn your research into policy impact? This webinar discusses when to create videos, where to use them to reach the target audience, whose story to tell, and how to distribute them, and includes a case study video production and dissemination plan.

What We’re Working On

Just 13 days until we leave for Pretoria, South Africa, where we’ll attend the AEN Evidence conference and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government. If you’ll be in Pretoria between September 24 and 28, let us know! We’d love to see you there.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | August 2018 — August 7, 2018

Evidence-Informed Policymaking Reading List | August 2018


Just Concluded: Peer Learning Workshop on Evidence Use

Last month we hosted a workshop for teams of government policymakers from nine countries, providing a peer learning forum to share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies. You can learn about the workshop here and read our initial insights and reflections in our latest blog post.



What to Read this Month

“Administrative data can do much more than help deliver services or provide inputs for monitoring. We can use administrative data also for learning and research in humanitarian emergencies if agencies make available their data for analysis as part of an ethical, secure and deliberate strategy.”
A look at the strengths, weaknesses, opportunities, and threats of using administrative data in the humanitarian field. The main strength of using administrative data is that it is available immediately at no cost and can be used for research and learning. A common challenge, on the other hand, is that it can be difficult to harmonize administrative data from different sources.

“Even a cursory look at the literature shows that evidence-informed policy making is about more than merely the availability of knowledge items. You have to go beyond uploading documents on a server, to also build and use high-trust relationships.”
In addition to sharing relevant, useful and reliable knowledge via its online platform, the new South African SDG Hub aims to strengthen partnerships between policy actors and researchers and support capacity building to improve the use of evidence for SDG-relevant policymaking.

“After more than a year of executing 10-week projects, they’re starting to identify city trends, and getting results: After analyzing data on the relationship between education and income, for example, they increased federal financial aid sign-ups in the city by 10 percent.”
Made up of city residents and government workers, the Urban Data Pioneers volunteer group helps collect and analyze data on the city’s population, housing, education, and infrastructure, to advance the Tulsa mayor’s campaign promise to use data to focus on results.

“Many RFPs include some type of evidence to explain the scope of the problem the solicitation seeks to address within the country context (e.g., statistics showing low school attendance, disease prevalence rates). That makes for helpful background reading, but it doesn’t get at the crux of the matter: whether the intervention USAID is soliciting will plausibly achieve the intended outcomes. Far fewer RFPs offered evidence for this.”
A thoughtful article on how USAID – and other major development donors – could restructure bidding and procurement practices to 1) incorporate evidence into its Requests for Proposals and suggested program interventions, and 2) prioritize awarding contracts to implementing partners that use evidence to inform their proposals and program designs, and demonstrate a commitment to building the body of evidence on what works and why in global development.

Our colleagues at Results for America recently published an index showcasing how state governments are using data and evidence in budget, policy, and management decisions to achieve better outcomes for their residents. Criteria include Data Policies / Agreements, Data Use, Evaluation Resources, Cost-Benefit Analysis, Use of Evidence in Grant Programs, and Contracting for Outcomes.


What We’re Working On

We’re still processing the lessons from the peer learning workshop we hosted last month on using evidence for policy implementation, and will share more insights, reflections, and takeaways in a final report in the coming weeks. We’re also preparing a satellite session for the AEN Evidence conference in Pretoria next month, where we’ll discuss what we learned from the workshop and our other activities over the past year, and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | June 2018 — June 1, 2018

Evidence-Informed Policymaking Reading List | June 2018

What to Read this Month


“In contrast to theories of change that posit that more rigorous evidence will have a greater influence on officials, we have found the opposite to be true.”
Excerpts from a paper reflecting on ten years of trying to improve public health in Guatemala. A key takeaway is that the more community members were involved in generating and presenting evidence, the more likely it was to be used to address service delivery challenges.

“Our argument is that to improve policy execution we must go one step further and consider how policies can be more effectively designed by connecting actors vertically and horizontally in a process of collaboration and joint deliberation.”
This article describes how collaboration in policy design can facilitate adaptive implementation of policy solutions. For example, collaborating with frontline staff can help government leaders better understand challenges at the service delivery level and propose context-specific solutions, while collaborating with citizens can generate constructive feedback that stimulates learning and incremental adjustments to a policy solution. The article also discusses how open and democratic political contexts enable collaborative policymaking.

“Neither country policy makers nor the global development community are best served by a global flood of health estimates derived from complex models as investments in country data collection, analytical capacity, and use are lagging.”
A brief commentary illustrating how the development community’s emphasis on global health predictions and estimates can contribute to a false level of certainty about health status and trends, and importantly, detract from needed investments in improving data collection and analytical capacity in countries.


“Impact evaluations are an important tool for learning about effective solutions to social problems, but they are a good investment only in the right circumstances. In the meantime, organizations must build an internal culture in which the right data are regularly collected, analyzed, and applied to manage implementation and improve programs.”
A comprehensive summary of the authors’ recent book, in which they explain why impact evaluations are not always useful, describe ten scenarios in which alternatives to impact evaluations make sense, and provide four guiding principles for collecting data that will be used to inform decisions and improve programs.


“Within a year of the research getting published, and publicised by media and labour unions, the Maharashtra government raised the minimum wage to INR 12. This benefited 6 million labourers in the state, many, many times more than what we could have hoped to achieve if we had not adopted a research-based approach to the problem.”
Reflecting on his experiences as a medical physician and researcher, Dr. Bang discusses the importance of research for development, and the difference between research on the people and research for and with the people.


“Even the simplest intervention is context dependent in countless, subtle ways – it’s impossible to say with certainty how it will fare in another place. However, as presented here, there’s a four-step framework that can disqualify many mistakes before they happen, and improve the odds of replications you pursue.”
Useful insights, examples, and a framework to determine when an intervention can be replicated in a new context: assess the evidence, understand what conditions made the intervention work in the original context, ensure that the new context has those same essential conditions, and adapt the intervention as necessary.


African Research Organizations: Submit your Proposals by June 15

The Hewlett Foundation call for proposals aims to advance government use of evidence by supporting East and West African policy research organizations. Organizations can apply alone or lead a project with one or two partner organizations. See more here.

What We’re Working On


We’re busy organizing “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers,” which we are hosting in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight from July 23-25 in Nairobi, Kenya.


The workshop will provide a forum for government teams from nine countries to learn about and share experiences, challenges, good practices, and key lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. We selected 10 great teams from over 55 applications. A big thank you to everyone who applied and helped to spread the word! More info coming soon.