Evidence-Informed Policymaking Reading List | November 2018 — November 6, 2018


What to Read this Month
“If policymakers or those who support policy processes are interested in supporting evidence use and innovation during policymaking, they would be wise to consider amplifying the heterogeneity in a network by adding new actors from diverse backgrounds and finding ways for them to interact with other network actors.”
The authors explore the association between evidence use and policy network structure in Burkina Faso and suggest that heterogeneous networks comprising different actors and organizations are likely to increase the use of evidence and policy innovation.
“Geospatial impact evaluations (GIEs) use precisely georeferenced intervention data and outcome data to establish a counterfactual retroactively, eliminating the need to randomly assign individuals, firms, or communities into treatment and control groups at the outset of a program.”
A short article describing how USAID used geospatial data to understand the impact of a $900 million investment in rural infrastructure in the West Bank/Gaza. The evaluation team found strong evidence that local economic output, as measured by remotely sensed nighttime light intensity, increased as a result of the rural infrastructure program. The authors highlight GIEs as a promising approach for rigorously evaluating programs in fragile states where the collection of baseline and endline data is challenging and costly.
“Nevertheless, the process has taken a relatively long time and has required a very substantial body of evidence generated from interventions reliably funded by donors motivated for change, the implementation of multiple knowledge transfers strategies, the efforts of collective and individual political entrepreneurs, effective and patient advocacy coalitions, aided by a major window of opportunity that they seized and used to good effect.”
The authors – a researcher and policymaker team – conducted a reflective analysis of a major change in health financing policy in Burkina Faso between 2008 and 2018, which they supported in their respective roles. They share practical lessons for strengthening evidence-informed decision making, including the importance of: persistent and consistent production of rigorous and useful knowledge; fostering early interaction and engagement between the research and decision making communities; understanding the political and socio-economic context in which decisions are made; and seizing windows of opportunity for change.
 
“Without these forms of documentation, population statistics – which inform a range of policy decisions – are incomplete at best, and wrong at worst. Many low-income countries base their poverty estimates on data that is more than a decade old.”
The authors make a compelling case for policymakers to invest in data that help inform critical decisions like where to build schools or direct medical resources, and to make that data publicly accessible so that communities and social entrepreneurs can help identify solutions that work. “Policymaking without high-quality public data is governing by guesswork,” they write.

“In the current context of low priority, and weak institutional support and technical capacity to enable evidence use in decision making and debate in African parliaments, the network’s activities respond to some of the key barriers hindering parliamentarians from using evidence in their work.”
The authors describe their research to understand the contribution of the Network of African Parliamentary Committees on Health (NEAPACOH) to the evidence ecosystem in African parliaments. Annual network meetings serve as a platform for sharing evidence and building demand and capacity for increased use, strengthening partnerships between MPs and researchers, and creating a sense of accountability and competition for following through on commitments made each year. The authors also highlight the importance of creating a mechanism for sharing the commitments made at these annual regional workshops with national parliaments, to make better progress in realizing them.

 

What We’re Working On

 

How are public sector officials incentivized to use evidence routinely in their work, whether to inform major policies and decisions, design or alter programs, or guide implementation? Our new series highlights strategies that government agencies around the world have used to create incentives for using evidence in decision making. Take a look at our first five case studies from Mexico, Sierra Leone, and South Africa!

 

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.

Evidence-Informed Policymaking Reading List | October 2018


What to Read this Month

“To put it bluntly, decades of stunning progress in the fight against poverty and disease may be on the verge of stalling. This is because the poorest parts of the world are growing faster than everywhere else; more babies are being born in the places where it’s hardest to lead a healthy and productive life. If current trends continue, the number of poor people in the world will stop falling — and could even start to rise.”
A great report on the world’s progress and likelihood of achieving the Sustainable Development Goals, with easy-to-digest graphs on key indicators, and stories behind the data about family planning, HIV, education, and agriculture. Take a moment to test your own knowledge with the interactive, six-question “data check” quiz on global poverty and population trends.

“UIS estimates that solid data on learning — gauging whether policies and programs are working, or reforms are needed — could improve education spending efficiency by 5 percent, generating $30 million/year in savings in the average country, paying for the assessments hundreds of times over.”
The author stresses that three years after the SDGs were adopted, there are still 100 countries with no data on student learning, and that two international literacy and math tests offer good standards for measuring progress toward global education goals.

“The average cost of an ambulance ride is $500, and in 2017 the County was able to avoid approximately 1,300 unnecessary rides — a health care system cost-savings of approximately $260,000.”
This case study documents how health and human services officials teamed up with fire and rescue leaders to address the rising volume of 911 emergency medical services calls from a small number of frequent callers. By sharing data across agencies, partnering with local hospitals, and providing home visits to some high-frequency callers, the County has seen a more than 50 percent reduction in 911 calls from the residents engaged in the initiative, saving public resources while proactively providing residents with the services they need. To learn more about how US local governments are using data and evidence to improve government performance and outcomes for residents, see additional case studies here.
 
“Only by finding out what doesn’t work — and being transparent about it — can we identify where money can be saved and re-invested in effective interventions.”
The author shares five lessons from his work as policy advisor for the UK What Works Network: 1) RCTs are not the only way of assessing impact; 2) it can be socially acceptable to experiment on children (see the Education Endowment Foundation’s work); 3) it is equally important to learn from what does not work; 4) evidence use doesn’t happen on its own; and 5) short-term effects do not necessarily translate to long-term outcomes.

What to Watch

“The way an issue passes from a vague idea into a piece of legislation or concrete sort of proposal shapes the kind of research you do…You need to consider where you are in this policy funnel, and that should shape the research.” (See 15:00 to 17:30)
Duncan Green, Oxfam Strategic Advisor, discusses how to understand the power and change processes within the system you are trying to influence, how to think about framing and timing of the issue, and how to combine research with media campaigns and lobbying to have policy influence.

A short online course for public servants to help you understand how to craft a narrative about your policy idea; write a compelling policy article or blog; or build a great presentation about your policy or idea.

What We’re Working On

We recently returned from Pretoria, South Africa, where we attended the Evidence 2018 conference hosted by the Africa Evidence Network (AEN). You can read about the event on the AEN blog and see pictures here. During the conference, we led two sessions on the potential of networks and peer learning opportunities to support government policymakers in advancing the use of evidence in decision making. You can read some of our thinking on the subject in a recent Results for All blog post, here.

Do you have comments, questions, or ideas for us?
We always love hearing from you! Contact us at info@results4all.org anytime.
Advancing evidence-informed policymaking in Africa: The role of peer learning networks — October 4, 2018



The concept of peer learning has its roots in the classroom and is best described as a reciprocal, two-way sharing of knowledge, ideas, and experiences. Outside the classroom, peer learning-focused exchanges can take place through formal or informal networks of groups or individuals who have a shared history or purpose. Peer learning networks are not a new phenomenon, but recent years have seen a rise in learning-focused networks that facilitate the sharing of knowledge, tools, resources, and ideas among government policymakers working to advance national development priorities in Africa. This new crop of peer learning-focused networks is symbolic of a wider shift from expert-driven learning approaches to country- and problem-driven learning agendas. The members of these networks are typically drawn together to advance a common objective or goal, for example, promoting open government (Open Government Partnership), improving public financial management (Collaborative Africa Budget Reform Initiative), advancing universal health coverage (Joint Learning Network for Universal Health Coverage), strengthening evaluation systems (Twende Mbele), or increasing knowledge sharing in evidence-informed policymaking (Africa Evidence Network). The broad appeal of these peer networks is the learning that takes place among equals – there is an appreciation and expectation that everyone has something to share and learn – in a space that strengthens social trust and promotes tacit knowledge exchange and practical learn-by-doing approaches.

Peer learning networks and evidence-informed policymaking

In evidence-informed policymaking, decision-makers use the best available evidence to inform government policy and programmes. Evidence can be generated by research such as evaluations and rigorous studies; it can also include contextual evidence drawn from an analysis of surveys and administrative data, or experiential evidence based on feedback received from citizens. Although the way in which the key elements of the policy process are often described – typically some version of agenda setting, policy formulation, implementation, and monitoring and evaluation – suggests a rational and linear process, in reality policymaking unfolds as a complex and messy process involving many different actors. Importantly, beyond evidence, policymaking is influenced by the political, social, and economic context in which decisions are made, such as the degree of openness in government, the pattern of election cycles, the level of citizen participation, and the freedom of journalists.

Networks that support practical learning and the sharing of experiences are particularly suited to the uncertain, complex, and messy dimensions of the policy process, where there is no predefined one-size-fits-all solution to addressing a policy challenge. By facilitating a sharing of lessons learned, ideas and accomplishments in a space that builds trust and a deep sense of community, networks targeting decision-makers and the policy process have the potential to: 1) foster an openness to new strategies and approaches for advancing evidence use, put forth by trusted government peers who are regarded as equals; 2) deepen ownership of and commitment to evidence practices in respective government offices, engendered by belonging to a community of supportive peers who are grappling with similar challenges in integrating evidence into policy; and 3) spread, accelerate and normalise good practices for evidence use in government among members and their institutions.

Peer learning networks appear to take two complementary approaches to strengthening evidence use in government: building champions, and supporting systems change at the organisational and institutional levels. Both are needed to advance the use of evidence in policy.

Building champions

The practical learn-by-doing and problem-based approaches of a peer learning network can help to build policymaker knowledge, skill, confidence and motivation. Policymakers who are confident in their ability to find, appraise and use evidence, and who understand the complexities of the policymaking process, are more likely to champion and use evidence in decision-making. Through a network’s ability to function as a platform for sharing ideas, policymakers can be exposed to new ways of thinking that encourage a shift in government culture towards greater evidence use. Network members can become advocates for evidence-informed policymaking, persuading and inspiring others to become better and more systematic at using evidence to inform the decisions that affect the lives of their citizens. For instance, with its new Africa Evidence Leadership Award, the Africa Evidence Network is doing just this: spotlighting the work of champions who are committed to strengthening the use of evidence in policy, to raise awareness about evidence-informed policymaking across the continent. Peer pressure can also serve as a positive motivational force for policymakers to become better at finding and using evidence in policy and to bring new ideas and approaches to their work.

Supporting systems change

The knowledge gained from interactions with peers in a network can inspire policymakers to introduce new government systems and platforms to support evidence use. For example, through participation in the Evidence-Informed Policy Network (EVIPNet), the Ministry of Health in Malawi launched a Knowledge Translation Platform to improve the quality and accessibility of health research and strengthen partnerships between policymakers and the research community. In Sierra Leone, the Africa Cabinet Government Network (ACGN) supported the development of a new Cabinet Manual that requires ministries across the national government to provide evidence to support policy proposals. Policymaker participation in a network can also spur policy reform such as the revision of Ghana’s national health insurance policy to align Primary Health Care and Universal Health Care, spearheaded by practitioners in Ghana who participate in the Joint Learning Network for Universal Health Coverage.

The power of peer learning networks in Africa

While peer learning networks alone cannot address the many constraints policymakers face in generating, sharing, and using evidence, they deserve a featured place in the toolbox of promising approaches for accelerating the spread of evidence practices in governments across Africa. The evidence champions that a network cultivates can play a powerful role in demonstrating and promoting awareness of the value of using evidence, and in advocating for improved evidence use in policy. But as others have noted, it is hard for these champions to translate knowledge into any type of action without organisational systems and institutional leadership and guidelines to incentivise and govern the use of evidence in policy. In the nascent field of evidence-informed policymaking there is clear demand and room for peer learning networks to support both champion building and systems change in Africa. The enthusiastic and engaged participation of evidence champions from Ghana, Kenya, Malawi, Nigeria, Rwanda, South Africa, and Uganda in a recent workshop in Nairobi, Kenya to explore evidence use in policy implementation only serves to confirm this demand and validate the potential of peer learning networks for advancing evidence-informed policymaking in Africa.

Abeba Taddese is the Executive Director, Results for All. Contact: abeba@results4all.org
This blog was originally published in AFIDEP’s September 2018 issue of African Development Perspectives and on the AFIDEP blog here.

Evidence-Informed Policymaking Reading List | September 2018 — September 10, 2018


Our New Summary Report is Here!

What happens when teams of government policymakers from nine countries meet in Nairobi, Kenya to discuss using evidence to improve policy implementation?

The peer-learning exchange we hosted from July 23-25 featured sessions on the use of administrative data to inform policy implementation, how to incentivize evidence use and build an evaluative culture, and how to engage citizens in data collection, among others.

Curious about what happened in Nairobi, and what we learned? You can watch videos, download the summary report, and explore the insights here.

What to Read this Month

“When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles due to the culture and leadership of the institution.”

This two-part blog series follows the partnership between MCC and the Government of El Salvador. The first installment describes the evidence workshop for policymakers held in July 2016 to discuss and promote the use of findings from evaluations of previous MCC-funded programs in the country. The second installment highlights interviews conducted with the organizers and participants, considering the impact on Salvadoran policymakers, what has changed in the last two years, and remaining challenges to evidence-informed policymaking in the country. The blogs are also cross-posted on MCC’s website.
“So the dashboard contains data about every aspect of education other than whether it is accomplishing its purpose: to teach children the skills and competencies that prepare them to be successful adults.”
Using a metaphor about a 2009 plane crash, Pritchett examines educational performance in Indonesia and India and argues that dashboards with too much information – particularly if focused on inputs without clear connections to priority outcomes – can be worse than having no dashboard at all.

“It is common knowledge within academic circles, regularly re-echoed at EIPM forums, that research produced by such institutions, mostly in the form of theses and dissertations, does not in any way inform policy.”
The author shares his perspective on the state of evidence-informed policymaking in Ghana, including challenges and promising opportunities to improve access to policy-relevant research and data, and the role of knowledge brokers.

“Societal inequality is exacerbated when one in five children worldwide do not complete upper primary school; understanding their motivations to drop out is crucial.”

The article describes the findings of research conducted to understand the impact of an Indian government initiative to build latrines in schools. Latrine construction positively impacted educational outcomes like enrolment, dropout rates, and the number of students who sat for and passed exams, for both boys and girls. The author’s research also shows that school sanitation reduces gender disparities only when sex-specific latrines are built for older girls, while unisex latrines are mostly sufficient at younger ages.

“In short, I feel that economists need to be cautious and modest when it comes to giving policy advice, let alone getting actively involved in ‘policy design.'”
The author notes that giving advice requires more than evidence, as public policy is a reflection of values and objectives. Further, the advice one gives depends on who one advises, with economists representing one of many potential views. Additionally, giving advice requires a familiarity with the implementation of policy, which most economists do not have, and should be regarded as a political act more than a scientific one, requiring collaboration and partnership with a broad range of stakeholders.

“Information from the government, distributed by the UN in the midst of the crisis, on the other hand, was ‘completely off’, Bengtsson said.”
After negotiating legal agreements with mobile phone operators to access location data, a Swedish NGO is able to determine where people go during a crisis, which can help governments and aid agencies prepare and respond.

What to Watch

Could using video help you tell your stories, convey emotion, reach a wider audience, and help turn your research into policy impact? This webinar discusses when to create videos, where to use them to reach the target audience, whose story to tell, and how to distribute them, and includes a case study video production and dissemination plan.

What We’re Working On

Just 13 days until we leave for Pretoria, South Africa, where we’ll attend the AEN Evidence conference and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government. If you’ll be in Pretoria between September 24 and 28, let us know! We’d love to see you there.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Setting the Agenda for EIPM in Ghana: Common Trends and Recommendations — September 4, 2018


Kirchuffs Atengble is the founder and Executive Director of PACKS Africa, a think tank operating from Accra, Ghana to improve the uptake of evidence in policymaking processes across the continent of Africa through information systems research and knowledge management.



For the past six years or so, I have been involved in various ways with the organisation of programmes and events in the evidence-informed policymaking (EIPM) space in Ghana (read more about EIPM here). This July, for instance, I had the opportunity to serve as rapporteur, for the third time, for the biennial international conference organized by the Consortium of Academic and Research Libraries in Ghana (CARLIGH).

One thing I’d like to highlight is the notable momentum around efforts to advance the use of research and evidence in policy in Ghana, which is also reflected across the rest of the Africa region. This is an opportune time to explore challenges and opportunities for strengthening evidence use in policy across the continent. In this blog, I explore three core issues relevant to the EIPM agenda in Ghana.

A common reference point for available evidence

A common issue is the need for clear entry points to access evidence. Libraries in Ghana have been working over the years to promote access to content from their institutions, including academic and research articles and papers. It is common knowledge within academic circles, regularly re-echoed at EIPM forums, that research produced by such institutions, mostly in the form of theses and dissertations, does not in any way inform policy. The Association of African Universities (AAU) has been working to improve the policy relevance of these documents, but support has been minimal.

With respect to government data, the Ghana Open Data Initiative (GODI), launched in 2012, seeks to make data from public agencies widely available. But as it stands, the data available on the platform are limited and out of date. A new initiative (the e-Transform project) is now being implemented in partnership with Mobile Web Ghana to revive GODI. A fully functional GODI would be a great resource for stakeholders within the sector.

The initiatives mentioned above are a few of the ongoing projects designed to promote access to evidence for decision making. But how are policymakers to find and navigate these different platforms? I suggest a common platform to consolidate information generated on a range of topics from different sources, and serve as a one-stop-shop for policymakers. Such a platform would serve as a portal, combining different access points (not replicating data collection or producing new studies), and structure access according to thematic issues for easy retrieval by policymakers, citizens, academics, development partners, and other interested stakeholders. An example of such a portal is the AGORA portal of the Research4life programme.

Cultivating knowledge brokers

Academics and researchers have a tendency to lament that their studies are not used to inform policy. Often this is because the research is not accessible to policymakers, both physically and intellectually. Policymakers have very limited time available to digest voluminous publications and need information that is presented in a user-friendly and easy-to-understand format.

Different initiatives are helping to develop the capacity of knowledge producers, enabling them to repackage resources for easy assimilation by the targeted beneficiaries. Others are focused on developing the skillsets of policymakers and their support staff to enable them to undertake such activities from the policymaking perspective.

These are all laudable initiatives. But it is important to also consider developing the skillsets of existing intermediaries, such as library professionals, communication specialists, and knowledge/evidence aggregators, who can become knowledge brokers for the sector.

Achieving efficiency and effectiveness in today’s ever-evolving knowledge economy will require specialisation. Knowledge producers should be allowed to concentrate on the core of their work, and policymakers likewise; the brokering work should be left to the professionals. Here, I point to ideas like the rapid synthesis services offered by the knowledge translation and systematic review group at Makerere University. Sarah Quarmby shares her experience from the Wales Centre for Public Policy.

Business process re-engineering for public agencies and cultural change

Having described the different initiatives underway to strengthen evidence use in Ghana and offered suggestions for improving evidence practices in government, I would now like to draw attention to the institutional processes of policymaking in the country. The prevailing processes make it difficult for policymakers to engage routinely with evidence. For example, cited instances suggest that the core units involved in EIPM within Ministries in Ghana, such as the Research, Statistics, and Information Management (RSIM) and Policy Planning, Monitoring and Evaluation (PPME) directorates, collaborate very little, if at all.

Further, some institutions are staffed by relatives and close associates of influential people in society, who may be more interested in status than in the opportunity to improve policies or programs. This can create a problem for the entire institution, particularly if decisions are based on instinct and the use of evidence is not prioritized, increasing the risk that policies and programs are developed on the wrong assumptions.

Wouldn’t you want to see business processes restructured to enable the consultation of appropriate and adequate evidence from available sources during policymaking – both internally, in Ministries, Departments and Agencies, and externally, among sector players? It’s difficult to give a perfect example of what this could look like, but the growing collaboration between the Parliament of Uganda’s Department of Research Services (DRS) and the Ugandan National Academy of Sciences (UNAS) is a promising case to highlight.

Conclusion

To end on an encouraging note, it is worth mentioning that the Ghana Health Service, together with other sector players, has developed the very first Research Agenda for the health sector. This is the kind of pace-setting initiative that I would like to see more of to improve the uptake of evidence in policy processes in Ghana.


About the author: In addition to leading PACKS Africa, Mr. Atengble was a partner of the VakaYiko consortium in Ghana, which received funding from the UK’s Department for International Development (DfID) to improve research uptake across three other countries – South Africa, Uganda and Zimbabwe.

Supporting Evidence Use by Policymakers in El Salvador: Following Up on MCC Activities 2 Years Later | Part 2 of 2 — August 27, 2018



The June 2017 M&E training. Source: FOMILENIO II

This is the second installment of a two-part blog series by Results for All describing the partnership between MCC and the Government of El Salvador. The first installment described the evidence workshop for policymakers held in July 2016 to discuss and promote the use of findings from evaluations of previous MCC-funded programs in the country, as well as the M&E trainings for public officials held the following year. This second installment highlights interviews conducted with the organizers, trainers, and participants. It considers the impact of the workshop and trainings on Salvadoran policymakers, what has changed in the last two years, and remaining challenges to evidence-informed policymaking in the country. Note that the majority of interviews consisted of written questions and responses. Quotes used here have been translated from Spanish.

“The purpose of the Evidence Workshop was to extract the evidence produced from the MCC evaluations, as well as the findings of the empirical investigations and evaluations of other donors, and to put the results into use in the processes and decisions important for El Salvador, including in policy, strategy development, or project design in the sectors of education, infrastructure, investment climate, and possibly in the security sector.” – FOMILENIO II

What Contributed to Successful Implementation?

The evidence workshop took place within two important contexts. First, MCC committed to implementing evidence workshops in partner countries with compact or threshold programs, and the El Salvador workshop acted as a pilot for this effort. Second, a 2015 reform of the executive branch in El Salvador strengthened the role of the Presidency’s Technical and Planning Secretariat, SETEPLAN, in directing the Five-Year Development Plan, organizing the National Planning System, and operating the Monitoring and Evaluation (M&E) Subsystem. To fulfill the latter mandate, SETEPLAN identified a need to strengthen the M&E skills of key personnel in government, civil society institutions, and academia. The goals of MCC and SETEPLAN thus aligned well, and their strong pre-existing relationship facilitated the planning of the 2016 evidence workshop and 2017 M&E trainings.

Workshop organizers at the MCC El Salvador Country Team, FOMILENIO II, and SETEPLAN further stressed that implementation was facilitated by “MCC’s commitment to promote rigorous evaluations and put lessons learned into practice” as well as the fact that high-level authorities from both MCC and the Government of El Salvador supported the workshop early on and attended in person.

How Did Participants Apply Learnings to Their Work?

One government stakeholder described how, as a result of the capacity building and knowledge acquired, the Government of El Salvador identified and established M&E indicators for national planning instruments, such as the Five-Year Development Plan. The government also intends to use these lessons to help achieve the Sustainable Development Goals and address multidimensional poverty. Another key government stakeholder stated, “I believe that there have been important advances in the outcome indicators” used to measure government work.

Several participants in the 2016 evidence workshop came from academia and highlighted in interviews how they have used the impact evaluations presented at the workshop in the university courses they teach, enriching their classes “with concrete examples.” One participant from academia mentioned applying learning from the 2017 M&E training when conducting a project evaluation for a nongovernmental entity, while another wrote that unemployment had kept her from applying the learning from the events.

Evaluations of the 2017 M&E trainings found that participants especially appreciated the modules on Theory of Change for policies and programs and on when to use different types of evaluations (needs assessments, impact evaluations, process evaluations, and cost-effectiveness analysis) and what questions each can help answer.

When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles due to the culture and leadership of the institution. One wrote that “It has been necessary to develop a culture around the evidence. The resistance in this sense is quite strong. The systems for capturing information have their limitations, which are very difficult to overcome in times when resources are scarce. In this process it has been important to count on the support of the authorities.” Another put it this way: “Considering that I have been at the front of M&E units, I can tell you that knowledge is easier to apply from an institution that gives importance to evaluation. Many times the obstacles come from the bosses, the superiors who are not interested in the issues, much less the real results and in part this is because they do not know about the subject.” One respondent noted that while the strong technical team within SETEPLAN “is receptive to MCC’s findings, the current political and fiscal climate has hindered full adoption and implementation of these lessons learned.”

What Are the Main Challenges to Using Evidence for Policymaking in El Salvador?

Regarding challenges to evidence use in general, participants spoke overwhelmingly about the need to create an evaluation and learning culture within their institutions. “It is necessary to create the culture of M&E in our professionals […] It is necessary to stop seeing evaluation as a way to measure and punish the employee,” wrote one participant. Similarly, another interviewee explained how “evaluation can be seen as an audit and not as a learning and improvement process.” Another stated that “The information systems that are available do not respond to current needs and the evolution of the programs. Nor is there a culture of evaluation and many of the processes are susceptible to human errors due to lack of systematization.” The frequent turnover of governments and personnel complicates efforts to create an evidence culture, and necessitates “continuously sensitizing decision-makers and managers of public policies, programs and projects on the importance of the generation and use of evidence,” according to another stakeholder from government.

In addition to the lack of a culture of using evidence and evaluations, access to relevant evaluations or updated statistics was a commonly cited challenge, especially for participants from outside government. Other challenges cited include:

  • “lack of knowledge about how the results of an evaluation can be concretized into actions for continuous improvement of the work of government”
  • “availability of quality and timely information in the short term for public decision-making”
  • “to consolidate an effective M&E System that transcends periods of government” and
  • how to ensure that “the information generated by the M&E System is actually used for decision making.”

There may also be an assumption that evaluations are always costly and time-consuming. One interviewee explained that policymakers often work with consultants who overcharge for evaluations of sub-par quality. Lastly, one interviewee noted that given El Salvador’s current financial conditions, the government may know which interventions have the highest impact yet lack the funds to implement them.

What is the Attitude Toward Evidence Use in El Salvador, and What Has Changed in the Last Two Years?

On the whole, workshop organizers and participants stressed that the mood has shifted in El Salvador, and that while room for improvement remains, significant advances have been made in the last two years to institutionalize evidence use, and to view evidence and evaluations as positive and essential tools for government. One participant described a shift in measuring impact, from purely qualitative to quantitative evaluations using the counterfactual. Another wrote that “there are mechanisms already institutionalized to publicize the results that are available. A lot of attention and resources have also been given to evaluation to be able to continue improving interventions.” Several others highlighted greater interest in or commitment by public institutions to measure results and open themselves to citizen participation. The fact that the M&E trainings had participation from top-level ministerial staff all the way down to program staff is a clear sign of the government’s commitment to use evidence in a more systematic way, declared one stakeholder.

Several participants spoke of remaining challenges, pointing mainly to limited resources for conducting evaluations and to the political environment. As one stakeholder wrote, “The technical offices within the government are receptive toward the use of evidence-based decision making, but the electoral environment and political leadership of both sides of the aisle favors decision making using a different calculus.” Another put it more frankly: “the bosses do not like to know that things go wrong and that they should be adjusted.” However, as another interviewee observed, the fear of admitting that things have gone wrong is not unique to El Salvador.

Onwards and Upwards: Catalyzing Continued Culture Change for Evidence Use

Overall, the organizers and participants of the 2016 evidence workshop and 2017 M&E trainings spoke positively of the events and the relationship between MCC and the Government of El Salvador that made them possible. Most participants noted that they were able to apply their learning in some capacity in their work, though how they did so differed widely, from better identifying indicators for national planning to using concrete examples of evaluations in university teaching. Organizers and participants had similar responses when asked about challenges to evidence-informed policymaking in El Salvador: nearly every interviewee emphasized, at some point, the importance of developing institutional cultures of evidence use and learning. That includes sensitizing government authorities, who, participants explained, can present obstacles when they lack familiarity with evidence methodologies or topical issues, when they see evaluation primarily as an audit rather than a learning tool, when they are reluctant to see negative results, or when they devalue evidence relative to political or electoral priorities.

At Results for All, we have studied many other training programs and initiatives that aim to build policymaker knowledge, skill, and motivation to use evidence in government. The challenges cited by stakeholders interviewed for this blog resonate with our research – namely that institutional cultures, political leadership, and staff turnover can impede the effectiveness of isolated training programs or events that focus on individual government personnel. Instead, we find that creating cultures of evidence use, and institutionalizing evidence-informed policymaking in government, requires the right incentives.

In the United States, our colleagues at Results for America help incentivize political leaders to become evidence champions by highlighting their work in the media (policymakers love press) and recruiting them to join peer learning and advocacy networks. At Results for All, we recently hosted a peer learning workshop for teams of government policymakers from nine countries, focused on using evidence to improve policy implementation. We saw how the opportunity to share and learn from their peers motivated teams of government policymakers from around the world to apply for the workshop. We found that policymakers really want to learn about and create guidelines, policies, and frameworks that can incentivize, systematize, and govern evidence use in their institutions. Lastly, we witnessed strong demand for a sustained network of evidence champions, one that could provide a powerful global platform to advocate for and incentivize cultures of evidence-informed policymaking and learning in government.

Over the next few months, we will publish a series of briefs and case studies on mechanisms to incentivize evidence use in government. We are also shaping a strategy for a global peer learning network on evidence-informed policymaking. The network has the potential to unite leaders from governments like El Salvador to highlight, disperse, and deepen existing evidence practices, pair governments grappling with similar challenges, and jointly develop guidelines and tools to govern evidence use and incentivize continued progress.


Ari Gandolfo is Projects and Partnerships Manager at Results for All, a global initiative dedicated to helping policymakers use data and evidence to improve the lives of citizens. We are committed to highlighting and accelerating the spread of good practices for evidence use in government, developing tools and resources to support policymakers, and building innovative partnerships and networks to recognize and promote peer learning among evidence champions in government.

Do you have comments, questions, or ideas for us? Do you have experience with incentives in the public sector that could help to strengthen and make evidence use in policymaking routine? You can reach us at info@results4all.org.


Supporting Evidence Use by Policymakers in El Salvador: Following Up on MCC Activities 2 Years Later | Part 1 of 2 — August 20, 2018


The 2016 evidence workshop. Source: FOMILENIO II

Thanks to a commitment to monitor and evaluate the impact of its investments, and a focus on using data and evidence to get results, the Millennium Challenge Corporation (MCC) generates a lot of evidence on “what works” to improve economic growth and development in its partner countries in Africa, Asia, Eastern Europe, and Latin America. However, do partner country governments use the evidence generated by MCC-funded programs to inform their policies and planning? Or are those valuable lessons on “what works” forgotten as MCC and its partners move on to new priorities? In July 2016, MCC held a workshop in El Salvador intended to make better use of the evidence generated by its own investments in the country, and followed up with a Monitoring and Evaluation training for policymakers one year later. How did participants apply the evidence and learning to their work, and what helped or hindered them from doing so? What has changed in the two years since the initial workshop, and what challenges remain for using evidence in government policy?

At Results for All, we are committed to shining a spotlight on good practice initiatives that promote the systematic use of evidence in government decision-making. We have studied other evidence use training programs, such as this one in Ghana, and interrogated to what extent individual policymakers are able to apply what they learn to their work, noting the challenge of creating lasting impact with these programs. We wanted to help MCC tell the story of their work in El Salvador, see what lessons from this experience resonate with our previous research, and identify what insights could be shared to advance evidence-informed policymaking in other contexts. In this two-part blog series, we describe the programs held in El Salvador in 2016 and 2017, follow up with Salvadoran policymakers and partners to consider the impact of the activities, and discuss implications for furthering evidence-informed policymaking in El Salvador and other countries.

Development Assistance Grounded in Partnership: The MCC Model in El Salvador

The Millennium Challenge Corporation (MCC) is a foreign aid agency of the U.S. government, created by Congress in 2004, which supports economic growth, poverty reduction, and institutional strengthening in developing countries with good governance. By focusing selectively on countries with good policies, and by emphasizing country ownership, evidence-based programs, and rigorous monitoring and evaluation, the MCC model helps ensure that U.S. dollars are well spent to achieve cost-effective results.

How it works: MCC provides time-limited grants to competitively selected partner countries. To be eligible for the five-year “compacts,” countries must pass the MCC Scorecard, a set of independent (non-MCC) indicators related to good governance, economic freedom, and investing in citizens. Countries close to passing the eligibility criteria can receive small “threshold” grants to support policy and institutional reform, to help improve their policy performance in key areas and work towards future collaboration with MCC. Once selected, partner country governments work with MCC to design programs that align with national development priorities, and refine plans through consultations with civil society and the private sector. Countries must then establish a Millennium Challenge Account (MCA), a local accountable government entity responsible for implementing the grant funds, which are also subject to rigorous and independent monitoring and evaluation to assess the impacts of the MCC-funded programs.

MCC in El Salvador: MCC has awarded two five-year compacts to El Salvador. The first, from 2007 to 2012, invested $461 million in education, public services, agricultural production, rural business development, and transportation infrastructure. The second compact, active from 2015 to 2020, will invest up to $277 million (with an additional $88 million contributed by the Government of El Salvador) in regulatory reforms, education, and logistical infrastructure to increase El Salvador’s productivity and competitiveness in international markets, in order to promote economic growth and private investment in the country.

The compacts are implemented by Fondo del Milenio, commonly referred to as FOMILENIO, the Millennium Challenge Account (MCA) or entity responsible for the MCC investments in El Salvador. For the second compact, FOMILENIO II is led by the Presidency’s Technical and Planning Secretariat (SETEPLAN, for its Spanish acronym) and includes representatives from federal ministries, civil society and academia, the private sector, subnational governments, and MCC.

A Bigger Bang for the Buck: A Workshop to Make Better Use of Evidence Already Generated

Early in the second compact, MCC and FOMILENIO II hosted a workshop (Spanish) entitled “Closing the Gap: Strengthening the Ties between Evaluation and Policy,” which aimed to promote the use of evidence generated through MCC investments in the design and implementation of new government programs and policies. In essence, the workshop sought to take stock of the knowledge acquired through previous MCC-funded work in El Salvador, and help Salvadoran policymakers identify where it could be applied to make further progress on government priorities.

Over 180 policymakers, practitioners, and researchers attended the event, held in San Salvador on July 28, 2016. Plenary sessions discussed MCC and FOMILENIO I achievements and the importance of using impact evaluations for policy, planning, and budgeting purposes. Representatives from MCC, the World Bank, Inter-American Development Bank, Abdul Latif Jameel Poverty Action Lab (J-PAL), USAID, Mathematica Policy Research, and other partners then presented the results of the evaluations of MCC-funded projects to improve education and El Salvador’s investment climate. Participants discussed the findings and ways to incorporate that knowledge into new programs and policies.

“As part of the workshop, participants committed to use the lessons learned to improve education, gender, and legal and regulatory policy to make the business climate more competitive and help ensure that better educated students can find higher paying jobs in El Salvador.” (MCC’s Statistically Speaking Newsletter, January 2017)

The evidence workshop provides a good example of collaboration between MCC and an in-country Millennium Challenge Account and offers a relatively simple strategy to promote evidence-informed policymaking – and a continued return on MCC investments – by jointly brainstorming how to make better use of evidence already generated by MCC-funded programs.


Findings from MCC’s First Compact with El Salvador: Supporting Technical Education

At the July evidence workshop, MCC and FOMILENIO partners presented final evaluation findings from concluded MCC-funded programs. For example, Mathematica Policy Research presented findings from the evaluation of the Formal Technical Education Sub-activity, which included a scholarship program meant to increase graduation rates from post-secondary technical-vocational schools. The results? Boys who received scholarships to the technical-vocational schools were more likely to graduate than boys who did not. Their graduation rates, 80% versus 63%, differed by a statistically significant 17 percentage points (a roughly 27% relative increase), suggesting that scholarships are an effective means of increasing boys’ graduation rates. However, the difference for girls, 77% versus 76%, was not significant. Why the variation? One hypothesis is that scholarships serve as a stronger motivator for boys because they reduce boys’ incentives to emigrate or take low-skilled work to provide for their families, which is common in El Salvador.
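The arithmetic behind these comparisons can be sketched with a standard two-proportion z-test. The graduation rates below come from the evaluation findings described above; the sample sizes are hypothetical (the actual cohort sizes are not reported here) and are used only to illustrate how such a comparison is typically computed, not to reproduce Mathematica's analysis.

```python
# Sketch of a two-proportion comparison like the one reported for the
# scholarship program. Rates are from the blog post; sample sizes (n = 400
# per group) are hypothetical, for illustration only.
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Return (difference in percentage points, relative change in %, z-statistic)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error of the difference
    z = (p1 - p2) / se
    return (p1 - p2) * 100, (p1 - p2) / p2 * 100, z

# Boys: scholarship recipients (80%) vs. non-recipients (63%)
pp, rel, z = two_proportion_z(0.80, 400, 0.63, 400)
print(f"Boys:  {pp:.0f} pp difference, {rel:.0f}% relative increase, z = {z:.1f}")

# Girls: 77% vs. 76% -- a gap far too small to reach significance
pp, rel, z = two_proportion_z(0.77, 400, 0.76, 400)
print(f"Girls: {pp:.0f} pp difference, {rel:.0f}% relative increase, z = {z:.1f}")
```

With these assumed sample sizes, the boys' 17-point gap yields a z-statistic well above the 1.96 threshold for significance at the 5% level, while the girls' 1-point gap does not, mirroring the pattern the evaluation reports.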

Importantly, the evaluation also found that graduation from a technical-vocational school did not guarantee higher income, sometimes because the skills taught by the program did not match those needed by local employers. Early collaboration with the private sector in the design of technical curriculum is therefore essential.


Enabling Evidence Use: Tiered M&E Training for Public Officials

After the initial evidence workshop, the Government of El Salvador requested a certificate course for relevant ministries and implementing entities, to improve the M&E capacity in government. In response, MCC and FOMILENIO II worked with J-PAL to deliver three sets of trainings (Spanish) from June to August 2017. The trainings aimed to show how impact evaluation findings and other evidence can be used to improve public policies and results for citizens, identify challenges to evidence-informed policymaking and strategies to overcome them, and enhance relevant skills in monitoring and evaluation. The first targeted high-level public officials, principally ministers and directors, while the second brought in mid-level public officials responsible for project design and implementation and included more technical information. The third training was designed for M&E specialists and technical staff, with more emphasis on statistics and econometrics and other skills needed to conduct, coordinate, or supervise the evaluation of public policies and projects.

In addition to enhancing the skills of participants, the training was meant to create a cadre of professionals that could form a potential Monitoring and Evaluation Committee supported by the Presidency’s Technical and Planning Secretariat, a plan that could help to institutionalize the use of evidence in policy and planning decisions.

Beyond Sharing Evidence and Building Skills: Assessing Changes Two Years Later

This blog introduced the partnership between MCC and the Government of El Salvador, and described the evidence workshop held in July 2016 and follow-on trainings held one year later. How did policymakers in El Salvador apply what they learned in the workshop and training to their work in government? What helped or hindered them from doing so? How have their attitudes toward evidence and their motivation to use it changed in the last two years, and why? To answer these questions, the second blog in this series features interviews with policymakers who organized and participated in the workshop and trainings, and discusses what remaining challenges impede policymakers from using evidence in El Salvador, and how MCC and other donors can help.


Ari Gandolfo is Projects and Partnerships Manager at Results for All, a global initiative dedicated to helping policymakers use data and evidence to improve the lives of citizens. We are committed to highlighting and accelerating the spread of good practices for evidence use in government, developing tools and resources to support policymakers, and building innovative partnerships and networks to recognize and promote peer learning among evidence champions in government. You can find other training programs and evidence-informed policymaking initiatives we have studied here. In particular, you may want to look at this case study on evidence use training for civil servants in Ghana, or the many other programs we summarize in this table that aim to build policymaker knowledge, skill, and motivation to use evidence in government.


Evidence-Informed Policymaking Reading List | August 2018 — August 7, 2018



Just Concluded: Peer Learning Workshop on Evidence Use

Last month we hosted a workshop for teams of government policymakers from nine countries, providing a peer learning forum to share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies. You can learn about the workshop here and read our initial insights and reflections in our latest blog post.



What to Read this Month

“Administrative data can do much more than help deliver services or provide inputs for monitoring. We can use administrative data also for learning and research in humanitarian emergencies if agencies make available their data for analysis as part of an ethical, secure and deliberate strategy.”
A look at the strengths, weaknesses, opportunities, and threats of using administrative data in the humanitarian field. The main strength of using administrative data is that it is available immediately at no cost and can be used for research and learning. A common challenge, on the other hand, is that it can be difficult to harmonize administrative data from different sources.

“Even a cursory look at the literature shows that evidence-informed policy making is about more than merely the availability of knowledge items. You have to go beyond uploading documents on a server, to also build and use high-trust relationships.”
In addition to sharing relevant, useful and reliable knowledge via its online platform, the new South African SDG Hub aims to strengthen partnerships between policy actors and researchers and support capacity building to improve the use of evidence for SDG-relevant policymaking.

“After more than a year of executing 10-week projects, they’re starting to identify city trends, and getting results: After analyzing data on the relationship between education and income, for example, they increased federal financial aid sign-ups in the city by 10 percent.”
Made up of city residents and government workers, the Urban Data Pioneers volunteer group helps collect and analyze data on the city’s population, housing, education, and infrastructure, to advance the Tulsa mayor’s campaign promise to use data to focus on results.

“Many RFPs include some type of evidence to explain the scope of the problem the solicitation seeks to address within the country context (e.g., statistics showing low school attendance, disease prevalence rates). That makes for helpful background reading, but it doesn’t get at the crux of the matter: whether the intervention USAID is soliciting will plausibly achieve the intended outcomes. Far fewer RFPs offered evidence for this.”
A thoughtful article on how USAID – and other major development donors – could restructure bidding and procurement practices to 1) incorporate evidence into its Requests for Proposals and suggested program interventions, and 2) prioritize awarding contracts to implementing partners that use evidence to inform their proposals and program designs, and demonstrate a commitment to building the body of evidence on what works and why in global development.

Our colleagues at Results for America recently published an index showcasing how state governments are using data and evidence in budget, policy, and management decisions to achieve better outcomes for their residents. Criteria include Data Policies / Agreements, Data Use, Evaluation Resources, Cost-Benefit Analysis, Use of Evidence in Grant Programs, and Contracting for Outcomes.


What We’re Working On

We’re still processing our learnings from the peer learning workshop we hosted last month on using evidence for policy implementation, and will share more insights, reflections, and takeaways in a final report in the weeks to come. We’re also preparing a satellite session at the AEN Evidence conference in Pretoria next month, where we’ll discuss our learnings from the workshop and other activities from the past year, and present a strategy for a peer learning network to support policymakers in advancing the use of evidence in government.

Do you have comments, questions, or ideas for us? We always love hearing from you! Contact us at info@results4all.org anytime.

Where do policymakers go when they need a safe space to engage with each other on evidence use in policy implementation? — August 6, 2018


They come together in a workshop where they can share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies.


On July 23-25, Results for All and our partners at AFIDEP and IDinsight hosted “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” in Nairobi, Kenya, providing such a space for ten teams of evidence champions and policymakers from nine countries: Chile, Ghana, Kenya, Malawi, Mexico, Nigeria, Rwanda, South Africa, and Uganda. Each government team is tasked with implementing a specific social policy, such as increasing the quality of public education, meeting family planning targets, or supporting the most vulnerable households, and came to the workshop seeking to inform that work. To learn more, take a look at the workshop agenda, and read about the participating teams and the policies they are working on in this set of short policy briefs we wrote together. You can also see photos from the workshop and several video interviews with participants on our Twitter.


Why a focus on policy implementation?

Over the last few months, we’ve been engaged in a series of consultations to assess the demand for a global evidence network and understand how funders are supporting evidence use to inform government priorities. A consistent theme in our conversations has been the lack of attention given to policy implementation or translation of policy to action. We also heard from many participants that the specific focus on evidence use in policy implementation is what drew them to apply for and participate in the workshop.

Policy implementation challenges can occur due to a myriad of factors, including unclear policy goals and outcomes; an absence of political support or financial resources; missing or weak evidence on the effectiveness of an intervention; inadequate skills or motivation among public officials tasked with frontline service delivery; and incorrect assumptions about human behavior and local needs. Addressing these implementation challenges requires a variety of evidence: evidence on how to mobilize political and financial support for the policy; evidence on whether the policy has worked elsewhere and under what conditions; evidence on how to enable and incentivize frontline agents to best implement and track the policy; and evidence from local stakeholders to best tailor the policy to their context and needs.

When implementation, along with monitoring and evaluation activities, is not linked to policy design but instead treated as a distinct downstream activity, the incentive to produce evidence in an ongoing, iterative process to inform policy is weak. This puts evidence-informed policymaking at risk: policymakers can only realize its benefits when implementation succeeds. We therefore think it is critical for governments to take a systematic and structured approach to using evidence to bridge the gap between policy design and implementation, in order to achieve better results for the people they represent and serve.

Initial insights and reflections

We’re still processing our learnings from the workshop and will share a lot more in the weeks to come, but here are some initial takeaways.

1: Common evidence use challenges persist across diverse contexts

We were not surprised to learn of the many common challenges that workshop participants face in using evidence to inform policy implementation, regardless of the specific policy, sector, or country context. These include the lack of a learning and results-oriented evaluation culture; the difficulties associated with integrating and using data across the multitude of agencies working to address complex social problems; the challenge of turning raw data into useable information; the absence of structured partnerships with the research community and media; and a lack of tools and understanding on not only how to engage with citizens, but importantly on how to use the inputs that they provide to improve policy implementation.

2: A safe space for sharing challenges, experiences, and accomplishments is attractive to policymakers, even those with more advanced evidence use

In our consultations we also heard strong interest in peer learning and networking between governments, and we believe this workshop attracted so many applications because peer learning was a central theme. We wanted to test whether policymakers from a diverse set of countries, most of whom had never met before, could connect over common missions and challenges, openly share their successes and failures, and provide real value to each other’s work. The verdict thus far is yes: participants have told us over and over what a great opportunity it was to connect with others in government, and to now have a network of international peers with whom they can discuss ideas and share resources. We confess we were skeptical that representatives from some countries, especially Mexico, Chile, and South Africa with their very advanced evaluation systems, would benefit from being in the room as much as others, but participants reiterated that the workshop gave them a lot to think about and apply to their work. Some noted that this was a great opportunity to showcase their country’s learning and growth around evidence use, and that they were interested in more forums that provided this platform.

3: Participants want more practical, hands-on tools with immediate applications to their work

Workshop participants were especially keen to use tools, like checklists, behavioral insights tools, and design thinking, that helped them reflect on their own experiences, map out connections, and chart a way forward. This interest signals the potential for future network activities focused on jointly developing model policies, frameworks, or guidelines for evidence production and use, which participants could then adapt to their own contexts.


4: Connecting government and civil society is a valuable function for a network or community of practice

We designed this workshop as a forum primarily for government policymakers to learn from each other. However, one of the most popular sessions was a ‘marketplace of citizen engagement solutions,’ where we invited eight non-governmental organizations from Nairobi to set up booths and showcase their work to participants. The organizations – Africa’s Voices Foundation, Code for Africa, Local Development Research Institute, Map Kibera Trust, Muungano wa Wanavijiji, Open Institute, Twaweza East Africa, and Well Told Story – are each using technology and innovative approaches to collect and analyze citizen perspectives, feedback, and ideas in order to identify social problems, point to improvements in public programs, and spark behavior change and collective action. Participants told us they relished the opportunity to speak with these organizations and learn of new and innovative tools for collaborating with communities to collect data and source solutions, and we also heard from presenters that they appreciated the opportunity to interact with such highly engaged policymakers. Overall, this marketplace taught us that in addition to satisfying demand for peer learning among governments, a network or community of practice focused on evidence use can also be a powerful bridge between government and civil society.


5: A thematic or sector focus could deepen the conversation on institutionalizing evidence use

We did not expect to solve huge, complex social problems in a 2.5-day conference. Rather, we were interested in exploring whether we could have a meaningful conversation about strengthening institutional practices and processes for evidence use in policy implementation, across the different policies and contexts represented in the workshop, and whether this approach would be valuable to policymakers. There was general agreement that a conversation about evidence practices and processes is critical to strengthening evidence use in policy implementation and that there are many lessons to learn from different policy areas, but workshop participants also indicated that working groups with a thematic or sector focus could help provide deeper, richer insights and value to support their work. For some, this means a focus on themes like data collection, evaluation capacity, and statistical systems, while others felt that sectoral working groups could be helpful in addressing contextual factors that are specific to sectors. As a follow-up to the workshop we will be conducting a series of short surveys to engage further on the issues or themes that would be most helpful for policymakers to engage in through a network, and we will continue to shape a network strategy in collaboration with partners.


“Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” was an opportunity for a global community of committed evidence champions to share real-world experiences in implementation, discuss common challenges, and collectively shine a spotlight on good practices for using evidence to improve the implementation of policies and generate results for their populations. We heard from participants and partners that the workshop was a resounding success, giving all of us new ideas and questions to take back to our work, connections and partnerships to continue to grow, and the inspiration to continue advancing the use of evidence to get results in government. We know that here at Results for All, the workshop gave us a lot to think about, and we’ll be sharing more insights, reflections, and takeaways in our forthcoming report later this month.


 

Evidence-Informed Policymaking Reading List | July 2018 — July 3, 2018


Our Network Mapping Report is Here!
 
Peer learning networks are powerful platforms for bringing countries together to exchange information, learn, and collaborate. Results for All conducted research to assess whether there is demand in government for a peer learning network that could help policymakers share experiences and amplify solutions to advance the use of evidence in policy, to achieve better outcomes for citizens. To inform a network strategy and avoid potential overlap with existing initiatives, we engaged in a comprehensive process of identifying, researching, and mapping active networks with similar missions. The resulting report identifies and classifies 50+ networks for peer learning and evidence use in government; describes 8 modes of engaging network members; and synthesizes research and key informant interviews into 13 lessons on network organization, engagement, and measurement. It then matches select networks against 5 criteria and concludes that a new network premised on these criteria could support evidence-informed policymaking and add value to current initiatives.

We hope this report will be useful for a variety of actors seeking to support evidence-informed policymaking, and identify opportunities to enhance collaboration and fill gaps in this important field.

Results for All Network Mapping Report


What to Read this Month

“An accurate, concise and unbiased synthesis of the available evidence is arguably one of the most valuable contributions a research community can offer decision-makers.”
The article identifies four principles that can make it easier for evidence producers and users to commission, appraise, share, and use evidence in policy. These four principles – inclusive, rigorous, transparent, and accessible – should apply to every evidence synthesis, which, if done well, becomes a global public good.

“The innovative contribution of the current study is the development and validation of a set of measurable indicators specifically devoted to assess and support EIPM in the field of public health, intended to be jointly used by governmental policy-makers and researchers, but also by other stakeholders involved in various stages of the policy-making cycle.”
The article describes a Delphi study that led to the development of indicators that can be used to assess the extent to which policies are informed by evidence. The indicators cover issues related to the skill and experience of staff working on policies, documentation of evidence in policy documents, communication and participation with key stakeholders, and procedures for monitoring and evaluating evidence use in policy. The indicators could also help to encourage establishment of routine processes for advancing evidence use in policy.

“There’s a limited evidence base about knowledge brokers, but preliminary findings suggest that they do have the potential to improve the uptake of evidence.”
Insights from the Wales Centre for Public Policy on the role it plays as a knowledge broker in evidence-informed policymaking – helping to build an understanding of evidence needs and questions, improve access to evidence, promote interaction between evidence users and producers, and strengthen capacity to engage with research. The Centre is refining its theory of change to better understand approaches that work, and to take a more systematic approach to facilitating evidence use in policymaking.

“Just as a journalist is trained to tell a compelling story so that an audience’s attention is captured and held so that facts of a story can be relayed to a reader or viewer, so too do scientists or policy experts need to capture attention and communicate both the importance and complexity of issues to their audiences.”
A useful article describing how storytelling influences the policy process and offering key steps to help policy actors build a better narrative.

“Guerrero, a Harvard-educated surgeon-turned-epidemiologist, understood violence as an epidemic transmitted from person to person. As with any epidemic, he tried to map the outbreak and understand its transmission. Data came first.”
A great story about data-driven policing and violence prevention in Colombian cities.

What We’re Working On

Later this month, we’re convening teams of government policymakers from nine countries to share experiences, challenges, and lessons on how to use different types of evidence to overcome roadblocks, create political buy-in, engage with stakeholders, and mobilize financial resources to support and improve policy implementation. “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” will take place from July 23-25 in Nairobi, Kenya, in partnership with the African Institute for Development Policy (AFIDEP) and IDinsight. We’ll share profiles of the participating teams and the policies they are working on in the coming weeks on Twitter (@resultsforall), so be sure to follow us there!