The concept of peer learning has its roots in the classroom and can be best described as a reciprocal, two-way sharing of knowledge, ideas, and experiences. Outside of a classroom context, peer learning-focused exchanges can take place through formal or informal networks of groups or individuals who have a shared history or purpose. Peer learning networks are not a new phenomenon, but recent years have seen a rise in learning-focused networks that facilitate the sharing of knowledge, tools, resources, and ideas among government policymakers working to advance national development priorities in Africa. This new crop of peer learning-focused networks is symbolic of a wider shift from expert-driven learning approaches to country- and problem-driven learning agendas. The members of these networks are typically drawn together to advance a common objective or goal, for example, promoting open government (Open Government Partnership), improving public financial management (Collaborative Africa Budget Reform Initiative), advancing universal health coverage (Joint Learning Network for Universal Health Coverage), strengthening evaluation systems (Twende Mbele), or increasing knowledge sharing in evidence-informed policymaking (Africa Evidence Network). The broad appeal of these peer networks is the learning that takes place among equals – there is an appreciation and expectation that everyone has something to share and learn – in a space that strengthens social trust and promotes tacit knowledge exchange and practical learn-by-doing approaches.
Peer learning networks and evidence-informed policymaking
In evidence-informed policymaking, decision-makers use the best available evidence to inform government policy and programmes. Evidence can be generated by research such as evaluations and rigorous studies; it can also include contextual evidence drawn from an analysis of surveys and administrative data, or experiential evidence based on feedback received from citizens. The way in which the key elements of the policy process are often described – typically some version of agenda setting, policy formulation, implementation, and monitoring and evaluation – suggests a rational and linear process. In reality, however, policymaking unfolds as a complex and messy process involving many different actors. Importantly, beyond evidence, policymaking is influenced by the political, social, and economic context in which decisions are made, such as the openness of government, the pattern of election cycles, the level of citizen participation, and the freedom of journalists.
Networks that support practical learning and the sharing of experiences are particularly suited to the uncertain, complex, and messy dimensions of the policy process, where there is no predefined one-size-fits-all solution to addressing a policy challenge. By facilitating a sharing of lessons learned, ideas and accomplishments in a space that builds trust and a deep sense of community, networks targeting decision-makers and the policy process have the potential to: 1) foster an openness to new strategies and approaches for advancing evidence use, put forth by trusted government peers who are regarded as equals; 2) deepen ownership of and commitment to evidence practices in respective government offices, engendered by belonging to a community of supportive peers who are grappling with similar challenges in integrating evidence into policy; and 3) spread, accelerate and normalise good practices for evidence use in government among members and their institutions.
Peer learning networks appear to take two complementary approaches to strengthening evidence use in government: building champions, and supporting systems change at the organisational and institutional levels. Both are needed to advance the use of evidence in policy.
Building champions
The practical learn-by-doing and problem-based approaches of a peer learning network can help to build policymaker knowledge, skill, confidence and motivation. Policymakers who are confident in their ability to find, appraise and use evidence, and who understand the complexities of the policymaking process, are more likely to champion and use evidence in decision-making. Through a network’s ability to function as a platform for sharing ideas, policymakers can be exposed to new ways of thinking that encourage a shift in government culture towards greater evidence use. Network members can become advocates for evidence-informed policymaking, persuading and inspiring others to become better and more systematic at using evidence to inform the decisions that affect the lives of their citizens. For instance, with its new Africa Evidence Leadership Award, the Africa Evidence Network is doing just this: spotlighting the work of champions who are committed to strengthening the use of evidence in policy, to raise awareness about evidence-informed policymaking across the continent. Peer pressure can also serve as a positive motivational force for policymakers to become better at finding and using evidence in policy and to bring new ideas and approaches to their work.
Supporting systems change
The knowledge gained from interactions with peers in a network can inspire policymakers to introduce new government systems and platforms to support evidence use. For example, through participation in the Evidence-Informed Policy Network (EVIPNet), the Ministry of Health in Malawi launched a Knowledge Translation Platform to improve the quality and accessibility of health research and strengthen partnerships between policymakers and the research community. In Sierra Leone, the Africa Cabinet Government Network (ACGN) supported the development of a new Cabinet Manual that requires ministries across the national government to provide evidence to support policy proposals. Policymaker participation in a network can also spur policy reform such as the revision of Ghana’s national health insurance policy to align Primary Health Care and Universal Health Care, spearheaded by practitioners in Ghana who participate in the Joint Learning Network for Universal Health Coverage.
The power of peer learning networks in Africa
While peer learning networks alone cannot address the many constraints policymakers face in generating, sharing, and using evidence, they deserve a featured place in the toolbox of promising approaches for accelerating the spread of evidence practices in governments across Africa. The evidence champions that a network cultivates can play a powerful role in demonstrating and promoting awareness about the value of using evidence, and in advocating for improved evidence use in policy. But as others have noted, it is hard for these champions to translate knowledge into any type of action without organisational systems and institutional leadership and guidelines to incentivise and govern the use of evidence in policy. In the nascent field of evidence-informed policymaking there is clear demand and room for peer learning networks to support both champion building and systems change in Africa. The enthusiastic and engaged participation of evidence champions from Ghana, Kenya, Malawi, Nigeria, Rwanda, South Africa, and Uganda in a recent workshop in Nairobi, Kenya, to explore evidence use in policy implementation only serves to confirm this demand and validate the potential of peer learning networks for advancing evidence-informed policymaking in Africa.
Abeba Taddese is the Executive Director, Results for All. Contact: email@example.com
This blog was originally published in AFIDEP’s September 2018 issue of African Development Perspectives and on the AFIDEP blog.
Kirchuffs Atengble is the founder and Executive Director of PACKS Africa, a think tank operating from Accra, Ghana to improve the uptake of evidence in policymaking processes across the continent of Africa through information systems research and knowledge management.
For the past six years or so, I have been involved in various ways with the organisation of programmes and events in the evidence-informed policymaking (EIPM) space in Ghana. This July, for instance, I had the opportunity to serve as rapporteur, for the third time, for the biennial international conference organised by the Consortium of Academic and Research Libraries in Ghana (CARLIGH).
One thing I’d like to highlight is the notable momentum around efforts to advance the use of research and evidence in policy in Ghana, which is also reflected across the rest of the Africa region. This is an opportune time to explore challenges and opportunities for strengthening evidence use in policy across the continent. In this blog, I explore three core issues relevant to the EIPM agenda in Ghana.
A common reference point for available evidence
A common issue is the need for clear entry points to access evidence. Libraries in Ghana have been working over the years to promote access to content from their institutions, including academic and research articles and papers. It is common knowledge within academic circles, regularly echoed at EIPM forums, that the research produced by such institutions, mostly in the form of theses and dissertations, does not in any way inform policy. The Association of African Universities (AAU) has been working to improve the policy relevance of these documents, but support for this work has been minimal.
With respect to government data, the Ghana Open Data Initiative (GODI), launched in 2012, seeks to make data from public agencies widely available. But as it stands, the data available on the platform is limited and out of date. A new initiative (the e-Transform project) is now being implemented in partnership with Mobile Web Ghana to revive GODI. A fully functional GODI would be a great resource for stakeholders within the sector.
The initiatives mentioned above are a few of the ongoing projects designed to promote access to evidence for decision making. But how are policymakers to find and navigate these different platforms? I suggest a common platform to consolidate information generated on a range of topics from different sources, and serve as a one-stop-shop for policymakers. Such a platform would serve as a portal, combining different access points (not replicating data collection or producing new studies), and structure access according to thematic issues for easy retrieval by policymakers, citizens, academics, development partners, and other interested stakeholders. An example of such a portal is the AGORA portal of the Research4life programme.
Cultivating knowledge brokers
Academics and researchers have a tendency to lament that their studies are not used to inform policy. Often this is because the research is not accessible to policymakers, both physically and intellectually. Policymakers have very limited time available to digest voluminous publications and need information that is presented in a user-friendly and easy-to-understand format.
Different initiatives are helping to develop the capacity of knowledge producers, enabling them to repackage resources for easy assimilation by the targeted beneficiaries. Others are focused on developing the skillsets of policymakers and their support staff to enable them to undertake such activities from the policymaking perspective.
These are all laudable initiatives. But it is important to also consider developing the skillsets of existing intermediaries, such as library professionals, communication specialists, and knowledge/evidence aggregators, who can become knowledge brokers for the sector.
To achieve efficiency and effectiveness within today’s ever-evolving knowledge economy will require specialisation. Knowledge producers should be allowed to concentrate on the core of their work, and policymakers likewise. The brokering work should be left to the professionals. Here, I point to ideas like the rapid synthesis services offered by the knowledge translation and systematic review group at Makerere University. Sarah Quarmby, for instance, shares a similar experience from the Wales Centre for Public Policy.
Business process re-engineering for public agencies and cultural change
Having described the different initiatives underway to strengthen evidence use in Ghana and offered suggestions for improving evidence practices in government, I would now like to draw attention to the institutional processes of policymaking in the country. The prevailing processes make it difficult for policymakers to engage routinely with evidence. For example, instances have been cited suggesting that the core units involved in EIPM within Ministries in Ghana, such as the Research, Statistics, and Information Management (RSIM) and Policy Planning, Monitoring and Evaluation (PPME) directorates, have very little (if any) collaboration.
Further, some institutions are staffed by relatives and close associates of influential people in society, who may be more interested in status than in the opportunity to improve policies or programs. This can create a problem for the entire institution, particularly if decisions are based on instinct and the use of evidence is not prioritized. It increases the risk that policies and programs will be developed on the wrong assumptions.
Wouldn’t you want to see business processes restructured to enable the consultation of appropriate and adequate evidence from available sources during policymaking – both internally within Ministries, Departments and Agencies, and externally among sector players? It’s difficult to give a perfect example of what this could look like, but the growing collaboration between the Parliament of Uganda’s Department of Research Services (DRS) and the Ugandan National Academy of Sciences (UNAS) is a promising case to highlight.
To end on an encouraging note, it is worth mentioning that the Ghana Health Service, together with other sector players, has developed the very first Research Agenda for the health sector. This is the kind of pace-setting initiative that I would like to see more of to improve the uptake of evidence in policy processes in Ghana.
About the author: Mr. Atengble was a partner of the VakaYiko consortium in Ghana, which received funding from the UK’s Department for International Development (DFID) to improve research uptake across three other countries – South Africa, Uganda and Zimbabwe.
The June 2017 M&E training. Source: FOMILENIO II
This is the second installment of a two-part blog series by Results for All describing the partnership between MCC and the Government of El Salvador. The first installment described the evidence workshop for policymakers held in July 2016 to discuss and promote the use of findings from evaluations of previous MCC-funded programs in the country, as well as the M&E trainings for public officials held the following year. This second installment highlights interviews conducted with the organizers, trainers, and participants. It considers the impact of the workshop and trainings on Salvadoran policymakers, what has changed in the last two years, and remaining challenges to evidence-informed policymaking in the country. Note that the majority of interviews consisted of written questions and responses. Quotes used here have been translated from Spanish.
“The purpose of the Evidence Workshop was to extract the evidence produced from the MCC evaluations, as well as the findings of the empirical investigations and evaluations of other donors, and to put the results into use in the processes and decisions important for El Salvador, including in policy, strategy development, or project design in the sectors of education, infrastructure, investment climate, and possibly in the security sector.” – FOMILENIO II
What Contributed to Successful Implementation?
The evidence workshop took place within two important contexts. First, MCC committed to implementing evidence workshops in partner countries with compact or threshold programs, and the El Salvador workshop acted as a pilot for this effort. Second, a 2015 reform of the executive branch in El Salvador strengthened the role of the Presidency’s Technical and Planning Secretariat, SETEPLAN, in directing the Five-Year Development Plan, organizing the National Planning System, and operating the Monitoring and Evaluation (M&E) Subsystem. To fulfill the latter mandate, SETEPLAN identified a need to strengthen the M&E skills of key personnel in government, civil society institutions, and academia. The goals of MCC and SETEPLAN thus aligned well, and their strong pre-existing relationship facilitated the planning of the 2016 evidence workshop and 2017 M&E trainings.
Workshop organizers at the MCC El Salvador Country Team, FOMILENIO II, and SETEPLAN further stressed that implementation was facilitated by “MCC’s commitment to promote rigorous evaluations and put lessons learned into practice” as well as the fact that high-level authorities from both MCC and the Government of El Salvador supported the workshop early on and attended in person.
How Did Participants Apply Learnings to Their Work?
One government stakeholder described how the Government of El Salvador identified and established M&E indicators for national planning instruments, such as the Five-Year Development Plan, as a result of the capacity building and knowledge acquired. The government also intends to use these lessons learned to achieve the Sustainable Development Goals and address multidimensional poverty. Another key stakeholder from government stated that “I believe that there have been important advances in the outcome indicators” used to measure government work.
Several of the participants who attended the 2016 evidence workshop came from academia, and highlighted in interviews how they have used the impact evaluations presented at the workshop in the university courses they teach, enriching the classes “with concrete examples.” Another participant from academia mentioned using learning from the 2017 M&E training when conducting a project evaluation for a nongovernmental entity. Another wrote that unemployment hindered her from applying the learning from the events.
Results of the evaluations of the 2017 M&E trainings found that participants especially appreciated modules on Theory of Change for policies and programs, when to use different types of evaluations (needs assessments, impact evaluations, process evaluations, and cost-effectiveness analysis), and what questions each can help answer.
When asked what helped or hindered them from applying the learning to their work, most participants with experience in government described obstacles due to the culture and leadership of the institution. One wrote that “It has been necessary to develop a culture around the evidence. The resistance in this sense is quite strong. The systems for capturing information have their limitations, which are very difficult to overcome in times when resources are scarce. In this process it has been important to count on the support of the authorities.” Another put it this way: “Considering that I have been at the front of M&E units, I can tell you that knowledge is easier to apply from an institution that gives importance to evaluation. Many times the obstacles come from the bosses, the superiors who are not interested in the issues, much less the real results and in part this is because they do not know about the subject.” One respondent noted that while the strong technical team within SETEPLAN “is receptive to MCC’s findings, the current political and fiscal climate has hindered full adoption and implementation of these lessons learned.”
What Are the Main Challenges to Using Evidence for Policymaking in El Salvador?
Regarding challenges to evidence use in general, participants spoke overwhelmingly about the need to create an evaluation and learning culture within their institutions. “It is necessary to create the culture of M&E in our professionals […] It is necessary to stop seeing evaluation as a way to measure and punish the employee,” wrote one participant. Similarly, another interviewee explained how “evaluation can be seen as an audit and not as a learning and improvement process.” Another stated that “The information systems that are available do not respond to current needs and the evolution of the programs. Nor is there a culture of evaluation and many of the processes are susceptible to human errors due to lack of systematization.” The frequent turnover of governments and personnel complicates efforts to create an evidence culture, and necessitates “continuously sensitizing decision-makers and managers of public policies, programs and projects on the importance of the generation and use of evidence,” according to another stakeholder from government.
In addition to the lack of a culture of using evidence and evaluations, access to relevant evaluations or updated statistics was a commonly cited challenge, especially for participants from outside government. Participants pointed to other challenges as well.
There may also be an assumption that evaluations are always costly and take a long time. One interviewee explained that policymakers often work with consultants who over-charge for evaluations of sub-par quality. Lastly, one interviewee noted that with El Salvador’s current financial conditions, the government may have knowledge of the highest-impact interventions, yet lack the funds to implement them.
What is the Attitude Toward Evidence Use in El Salvador, and What Has Changed in the Last Two Years?
On the whole, workshop organizers and participants stressed that the mood has shifted in El Salvador, and that while room for improvement remains, significant advances have been made in the last two years to institutionalize evidence use, and to view evidence and evaluations as positive and essential tools for government. One participant described a shift in measuring impact, from purely qualitative to quantitative evaluations using the counterfactual. Another wrote that “there are mechanisms already institutionalized to publicize the results that are available. A lot of attention and resources have also been given to evaluation to be able to continue improving interventions.” Several others highlighted greater interest in or commitment by public institutions to measure results and open themselves to citizen participation. The fact that the M&E trainings had participation from top-level ministerial staff all the way down to program staff is a clear sign of the government’s commitment to use evidence in a more systematic way, declared one stakeholder.
Several participants spoke of remaining challenges, pointing mainly to limited resources for conducting evaluations, and the political environment. As one stakeholder wrote, “The technical offices within the government are receptive toward the use of evidence-based decision making, but the electoral environment and political leadership of both sides of the aisle favors decision making using a different calculus.” Another put it more frankly: “the bosses do not like to know that things go wrong and that they should be adjusted.” However, as a different interviewee observed, the fear of admitting that things have gone wrong is not unique to El Salvador.
Onwards and Upwards: Catalyzing Continued Culture Change for Evidence Use
Overall, the organizers and participants of the 2016 evidence workshop and 2017 M&E trainings spoke positively of the events and the relationship between MCC and the Government of El Salvador that made them possible. Most participants noted that they were able to apply their learning in some capacity in their work, though how they are doing so differed widely, from better identifying indicators for national planning, to using concrete examples of evaluations in university teaching. Organizers and participants had similar responses when asked about challenges to evidence-informed policymaking in El Salvador: nearly every interviewee emphasized, at some point, the importance of developing institutional cultures of evidence use and learning. That includes sensitizing government authorities, who, participants explained, can present obstacles when they lack familiarity with evidence methodologies or topical issues, when they see evaluation as primarily an audit and not a learning tool, when they are reluctant to see negative results, or when they devalue evidence compared to political or electoral priorities.
At Results for All, we have studied many other training programs and initiatives that aim to build policymaker knowledge, skill, and motivation to use evidence in government. The challenges cited by stakeholders interviewed for this blog resonate with our research – namely that institutional cultures, political leadership, and staff turnover can impede the effectiveness of isolated training programs or events that focus on individual government personnel. Instead, we find that creating cultures of evidence use, and institutionalizing evidence-informed policymaking in government, requires the right incentives.
In the United States, our colleagues at Results for America help incentivize political leaders to become evidence champions by highlighting their work in the media (policymakers love press) and recruiting them to join peer learning and advocacy networks. At Results for All, we recently hosted a peer learning workshop for teams of government policymakers from nine countries, focused on using evidence to improve policy implementation. We saw how the opportunity to share and learn from their peers motivated teams of government policymakers from around the world to apply for the workshop. We found that policymakers really want to learn about and create guidelines, policies, and frameworks that can incentivize, systematize, and govern evidence use in their institutions. Lastly, we witnessed strong demand for a sustained network of evidence champions, one that could provide a powerful global platform to advocate for and incentivize cultures of evidence-informed policymaking and learning in government.
Over the next few months, we will publish a series of briefs and case studies on mechanisms to incentivize evidence use in government. We are also shaping a strategy for a global peer learning network on evidence-informed policymaking. The network has the potential to unite leaders from governments like El Salvador to highlight, disperse, and deepen existing evidence practices, pair governments grappling with similar challenges, and jointly develop guidelines and tools to govern evidence use and incentivize continued progress.
Ari Gandolfo is Projects and Partnerships Manager at Results for All, a global initiative dedicated to helping policymakers use data and evidence to improve the lives of citizens. We are committed to highlighting and accelerating the spread of good practices for evidence use in government, developing tools and resources to support policymakers, and building innovative partnerships and networks to recognize and promote peer learning among evidence champions in government.
Do you have comments, questions, or ideas for us? Do you have experience with incentives in the public sector that could help to strengthen and make evidence use in policymaking routine? You can reach us at firstname.lastname@example.org.
The 2016 evidence workshop. Source: FOMILENIO II
Thanks to a commitment to monitor and evaluate the impact of its investments, and a focus on using data and evidence to get results, the Millennium Challenge Corporation (MCC) generates a lot of evidence on “what works” to improve economic growth and development in its partner countries in Africa, Asia, Eastern Europe, and Latin America. However, do partner country governments use the evidence generated by MCC-funded programs to inform their policies and planning? Or are those valuable lessons on “what works” forgotten as MCC and its partners move on to new priorities? In July 2016, MCC held a workshop in El Salvador intended to make better use of the evidence generated by its own investments in the country, and followed up with a Monitoring and Evaluation training for policymakers one year later. How did participants apply the evidence and learning to their work, and what helped or hindered them from doing so? What has changed in the two years since the initial workshop, and what challenges remain for using evidence in government policy?
At Results for All, we are committed to shining a spotlight on good practice initiatives that promote the systematic use of evidence in government decision-making. We have studied other evidence use training programs, such as this one in Ghana, and interrogated to what extent individual policymakers are able to apply what they learn to their work, noting the challenge of creating lasting impact with these programs. We wanted to help MCC tell the story of their work in El Salvador, see what lessons from this experience resonate with our previous research, and identify what insights could be shared to advance evidence-informed policymaking in other contexts. In this two-part blog series, we describe the programs held in El Salvador in 2016 and 2017, follow up with Salvadoran policymakers and partners to consider the impact of the activities, and discuss implications for furthering evidence-informed policymaking in El Salvador and other countries.
Development Assistance Grounded in Partnership: The MCC Model in El Salvador
The Millennium Challenge Corporation (MCC) is a foreign aid agency of the U.S. government, created by Congress in 2004, which supports economic growth, poverty reduction, and institutional strengthening in developing countries with good governance. By focusing on select countries with good policies, and emphasizing country ownership, evidence-based programs, and rigorous monitoring and evaluation, the MCC model helps to ensure that U.S. dollars are well spent to get cost-effective results.
How it works: MCC provides time-limited grants to competitively selected partner countries. To be eligible for the five-year “compacts,” countries must pass the MCC Scorecard, a set of independent (non-MCC) indicators related to good governance, economic freedom, and investing in citizens. Countries close to passing the eligibility criteria can receive small “threshold” grants to support policy and institutional reform, to help improve their policy performance in key areas and work towards future collaboration with MCC. Once selected, partner country governments work with MCC to design programs that align with national development priorities, and refine plans through consultations with civil society and the private sector. Countries must then establish a Millennium Challenge Account (MCA), a local accountable government entity responsible for implementing the grant funds; the funded programs are subject to rigorous and independent monitoring and evaluation to assess their impacts.
MCC in El Salvador: MCC has awarded two five-year compacts to El Salvador. The first, from 2007 to 2012, invested $461 million in education, public services, agricultural production, rural business development, and transportation infrastructure. The second compact, active from 2015 to 2020, will invest up to $277 million (with an additional $88 million contributed by the Government of El Salvador) in regulatory reforms, education, and logistical infrastructure to increase El Salvador’s productivity and competitiveness in international markets, in order to promote economic growth and private investment in the country.
The compacts are implemented by Fondo del Milenio, commonly referred to as FOMILENIO, the Millennium Challenge Account (MCA) or entity responsible for the MCC investments in El Salvador. For the second compact, FOMILENIO II is led by the Presidency’s Technical and Planning Secretariat (SETEPLAN, for its Spanish acronym) and includes representatives from federal ministries, civil society and academia, the private sector, subnational governments, and MCC.
A Bigger Bang for the Buck: A Workshop to Make Better Use of Evidence Already Generated
Early in the second compact, MCC and FOMILENIO II hosted a workshop (Spanish) entitled “Closing the Gap: Strengthening the Ties between Evaluation and Policy,” which aimed to promote the use of evidence generated through MCC investments in the design and implementation of new government programs and policies. In essence, the workshop sought to take stock of the knowledge acquired through previous MCC-funded work in El Salvador, and help Salvadoran policymakers identify where it could be applied to make further progress on government priorities.
Over 180 policymakers, practitioners, and researchers attended the event, held in San Salvador on July 28, 2016. Plenary sessions discussed MCC and FOMILENIO I achievements and the importance of using impact evaluations for policy, planning, and budgeting purposes. Representatives from MCC, the World Bank, Inter-American Development Bank, Abdul Latif Jameel Poverty Action Lab (J-PAL), USAID, Mathematica Policy Research, and other partners then presented the results of the evaluations of MCC-funded projects to improve education and El Salvador’s investment climate. Participants discussed the findings and ways to incorporate that knowledge into new programs and policies.
“As part of the workshop, participants committed to use the lessons learned to improve education, gender, and legal and regulatory policy to make the business climate more competitive and help ensure that better educated students can find higher paying jobs in El Salvador.” (MCC’s Statistically Speaking Newsletter, January 2017)
The evidence workshop provides a good example of collaboration between MCC and an in-country Millennium Challenge Account and offers a relatively simple strategy to promote evidence-informed policymaking – and a continued return on MCC investments – by jointly brainstorming how to make better use of evidence already generated by MCC-funded programs.
Findings from MCC’s First Compact with El Salvador: Supporting Technical Education
At the July evidence workshop, MCC and FOMILENIO partners presented final evaluation findings from already concluded MCC-funded programs. For example, Mathematica Policy Research presented findings from the evaluation of the Formal Technical Education Sub-activity. The sub-activity included a scholarship program meant to increase graduation rates from post-secondary technical-vocational schools. The results? Boys who were given scholarships to the technical-vocational schools were more likely to graduate than boys who did not receive scholarships. Their graduation rates, 80% versus 63%, differed by a statistically significant 17 percentage points (a roughly 27% relative increase), suggesting that scholarships are an effective means to increase boys’ graduation rates. However, the difference for girls, 77% versus 76%, was not significant. Why the variation? We can hypothesize that scholarships may serve as a stronger motivator for boys than girls because they reduce boys’ incentives to emigrate or find low-skilled work to provide for their families, which is common in El Salvador.
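To make the distinction between the absolute and relative difference in graduation rates explicit, here is a quick sketch of the arithmetic behind the evaluation figures above (the 80% and 63% rates are from the evaluation; the function names are illustrative):

```python
# Graduation rates from the Formal Technical Education evaluation (boys)
scholarship_rate = 0.80      # boys who received scholarships
no_scholarship_rate = 0.63   # boys who did not

def percentage_point_diff(treated, control):
    """Absolute difference, expressed in percentage points."""
    return (treated - control) * 100

def relative_increase(treated, control):
    """Relative increase over the comparison group, in percent."""
    return (treated - control) / control * 100

abs_diff = percentage_point_diff(scholarship_rate, no_scholarship_rate)
rel_diff = relative_increase(scholarship_rate, no_scholarship_rate)

print(f"Absolute difference: {abs_diff:.0f} percentage points")  # 17 pp
print(f"Relative increase: {rel_diff:.0f}%")                     # ~27%
```

Note that the statistical significance of the difference would be assessed separately (e.g., with a two-proportion test on the underlying sample sizes, which the evaluation report would specify); this snippet only reconciles the two ways of stating the gap.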
Importantly, the evaluation also found that graduation from a technical-vocational school did not guarantee higher income, sometimes because the skills taught by the program did not match those needed by local employers. Early collaboration with the private sector in the design of technical curriculum is therefore essential.
Enabling Evidence Use: Tiered M&E Training for Public Officials
After the initial evidence workshop, the Government of El Salvador requested a certificate course for relevant ministries and implementing entities, to improve the M&E capacity in government. In response, MCC and FOMILENIO II worked with J-PAL to deliver three sets of trainings (Spanish) from June to August 2017. The trainings aimed to show how impact evaluation findings and other evidence can be used to improve public policies and results for citizens, identify challenges to evidence-informed policymaking and strategies to overcome them, and enhance relevant skills in monitoring and evaluation. The first targeted high-level public officials, principally ministers and directors, while the second brought in mid-level public officials responsible for project design and implementation and included more technical information. The third training was designed for M&E specialists and technical staff, with more emphasis on statistics and econometrics and other skills needed to conduct, coordinate, or supervise the evaluation of public policies and projects.
In addition to enhancing the skills of participants, the training was meant to create a cadre of professionals that could form a potential Monitoring and Evaluation Committee supported by the Presidency’s Technical and Planning Secretariat, a plan that could help to institutionalize the use of evidence in policy and planning decisions.
Beyond Sharing Evidence and Building Skills: Assessing Changes Two Years Later
This blog introduced the partnership between MCC and the Government of El Salvador, and described the evidence workshop held in July 2016 and follow-on trainings held one year later. How did policymakers in El Salvador apply what they learned in the workshop and training to their work in government? What helped or hindered them from doing so? How have their attitudes toward evidence and their motivation to use it changed in the last two years, and why? To answer these questions, the second blog in this series features interviews with policymakers who organized and participated in the workshop and trainings, and discusses what remaining challenges impede policymakers from using evidence in El Salvador, and how MCC and other donors can help.
Ari Gandolfo is Projects and Partnerships Manager at Results for All, a global initiative dedicated to helping policymakers use data and evidence to improve the lives of citizens. We are committed to highlighting and accelerating the spread of good practices for evidence use in government, developing tools and resources to support policymakers, and building innovative partnerships and networks to recognize and promote peer learning among evidence champions in government. You can find other training programs and evidence-informed policymaking initiatives we have studied here. In particular, you may want to look at this case study on evidence use training for civil servants in Ghana, or the many other programs we summarize in this table that aim to build policymaker knowledge, skill, and motivation to use evidence in government.
Just Concluded: Peer Learning Workshop on Evidence Use
Last month we hosted a workshop for teams of government policymakers from nine countries, providing a peer learning forum to share experiences, lessons, and strategies for using evidence more effectively in the implementation of social policies. You can learn about the workshop here and read our initial insights and reflections in our latest blog post.
On July 23-25, Results for All and our partners at AFIDEP and IDinsight hosted “Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” in Nairobi, Kenya, providing such a space for ten teams of evidence champions and policymakers from nine countries: Chile, Ghana, Kenya, Malawi, Mexico, Nigeria, Rwanda, South Africa, and Uganda. Each government team is tasked with implementing a specific social policy, such as increasing the quality of public education, meeting family planning targets, or supporting the most vulnerable households, and came to the workshop seeking to inform that work. To learn more, take a look at the workshop agenda, and read about the participating teams and the policies they are working on in this set of short policy briefs we wrote together. You can also see photos from the workshop and several video interviews with participants on our Twitter.
Over the last few months, we’ve been engaged in a series of consultations to assess the demand for a global evidence network and understand how funders are supporting evidence use to inform government priorities. A consistent theme in our conversations has been the lack of attention given to policy implementation or translation of policy to action. We also heard from many participants that the specific focus on evidence use in policy implementation is what drew them to apply for and participate in the workshop.
Policy implementation challenges can occur due to a myriad of factors, including unclear policy goals and outcomes; an absence of political support or financial resources; missing or weak evidence on the effectiveness of an intervention; inadequate skills or motivation among public officials tasked with frontline service delivery; and incorrect assumptions about human behavior and local needs. Addressing these implementation challenges requires a variety of evidence: evidence on how to mobilize political and financial support for the policy; evidence on whether the policy has worked elsewhere and under what conditions; evidence on how to enable and incentivize frontline agents to best implement and track the policy; and evidence from local stakeholders to best tailor the policy to their context and needs.
When implementation, along with monitoring and evaluation activities, is not linked to policy design but instead treated as a distinct downstream activity, the incentive to produce evidence in an ongoing and iterative process to inform policy is weak. This puts evidence-informed policymaking at risk: policymakers can only realize the benefits of evidence-informed policymaking when implementation succeeds. We therefore think it is critical for governments to take a systematic and structured approach to using evidence to bridge the gap between policy design and implementation, to achieve better results for the people they represent and serve.
We’re still processing our learnings from the workshop and will share a lot more in the weeks to come, but here are some initial takeaways.
1: Common evidence use challenges persist across diverse contexts
We were not surprised to learn of the many common challenges that workshop participants face in using evidence to inform policy implementation, regardless of the specific policy, sector, or country context. These include the lack of a learning and results-oriented evaluation culture; the difficulties associated with integrating and using data across the multitude of agencies working to address complex social problems; the challenge of turning raw data into useable information; the absence of structured partnerships with the research community and media; and a lack of tools and understanding on not only how to engage with citizens, but importantly on how to use the inputs that they provide to improve policy implementation.
2: A safe space for sharing challenges, experiences, and accomplishments is attractive to policymakers, even those with more advanced evidence use
In our consultations we heard strong interest in peer learning and networking between governments, and we believe this workshop drew so many applicants precisely because peer learning was a central theme. We wanted to test whether policymakers from a diverse set of countries, most of whom had never met before, could connect over common missions and challenges, openly share their successes and failures, and provide real value to each other’s work. The verdict thus far is yes: participants have told us over and over what a great opportunity it was to connect with others in government, and to now have a network of international peers with whom they can discuss ideas and share resources. We confess that we were skeptical that representatives from some countries, especially Mexico, Chile, and South Africa, with their very advanced evaluation systems, would benefit from being in the room as much as others, but participants reiterated that the workshop gave them a lot to think about and apply to their work. Some noted that it was a great opportunity to showcase their country’s learning and growth around evidence use, and that they were interested in more forums providing this platform.
3: Participants want more practical, hands-on tools with immediate applications to their work
Workshop participants were especially keen to use tools, like checklists, behavioral insights tools, and design thinking, that helped them reflect on their own experiences, map out connections, and chart a way forward. This interest signals the potential for future network activities focused on jointly developing model policies, frameworks, or guidelines for evidence production and use, which participants could then adapt to their own contexts.
4: Connecting government and civil society is a valuable function for a network or community of practice
We designed this workshop as a forum primarily for government policymakers to learn from each other. However, one of the most popular sessions was a ‘marketplace of citizen engagement solutions,’ where we invited eight non-governmental organizations from Nairobi to set up booths and showcase their work to participants. The organizations – Africa’s Voices Foundation, Code for Africa, Local Development Research Institute, Map Kibera Trust, Muungano wa Wanavijiji, Open Institute, Twaweza East Africa, and Well Told Story – are each using technology and innovative approaches to collect and analyze citizen perspectives, feedback, and ideas in order to identify social problems, point to improvements in public programs, and spark behavior change and collective action. Participants told us they relished the opportunity to speak with these organizations and learn of new and innovative tools to collaborate with communities to collect data and source solutions, and we heard also from presenters that they appreciated the opportunity to interact with such highly engaged policymakers. Overall, this marketplace taught us that in addition to satisfying demand for peer learning among governments, a network or community of practice focused on evidence use can also be a powerful bridge between government and civil society.
5: A thematic or sector focus could deepen the conversation on institutionalizing evidence use
We did not expect to solve huge, complex social problems in a 2.5-day conference. Rather, we were interested in exploring whether we could have a meaningful conversation about strengthening institutional practices and processes for evidence use in policy implementation, across the different policies and contexts represented in the workshop, and whether this approach would be valuable to policymakers. There was general agreement that a conversation about evidence practices and processes is critical to strengthening evidence use in policy implementation and that there are many lessons to learn across policy areas, but workshop participants also indicated that working groups with a thematic or sector focus could provide deeper, richer insights to support their work. For some, this means a focus on themes like data collection, evaluation capacity, and statistical systems, while others felt that sectoral working groups could help address contextual factors specific to their sectors. As a follow-up to the workshop, we will be conducting a series of short surveys to identify the issues and themes that would be most helpful for policymakers to pursue through a network, and we will continue to shape a network strategy in collaboration with partners.
“Using Evidence to Improve Policy Implementation: A Peer Learning Workshop for Government Policymakers” was an opportunity for a global community of committed evidence champions to share real-world experiences in implementation, discuss common challenges, and collectively shine a spotlight on good practices for using evidence to improve the implementation of policies and generate results for their populations. We heard from participants and partners that the workshop was a resounding success, giving all of us new ideas and questions to take back to our work, connections and partnerships to continue to grow, and the inspiration to continue advancing the use of evidence to get results in government. We know that here at Results for All, the workshop gave us a lot to think about, and we’ll be sharing more insights, reflections, and takeaways in our forthcoming report later this month.