Learning & Evaluation
The Peacebuilding Evaluation Consortium is co-hosting its first Peacebuilding M&E Solutions Forum on Tuesday, 23 October 2018 with the United States Institute of Peace.
We are now accepting proposals for the Peacebuilding M&E Solutions Forum through July 6, 2018!
Appropriate Monitoring and Evaluation for Peacebuilding
Peacebuilders operate in countries where the root causes of violence, poverty, inequality, human rights abuses, and weak institutions are intricately linked. As a result, peacebuilding programs have become increasingly complex and multifaceted. Combined with the life-and-death realities characteristic of conflict settings, this makes measuring the results and impact of peacebuilding programs uniquely challenging. Consequently, peacebuilding evaluation requires appropriate and specialized methodologies that can capture both long-term, multipart change processes and the effectiveness of those programs, while remaining sensitive to ongoing and underlying conflict dynamics.
In the past twenty years, the peacebuilding field has made significant strides in addressing the technical challenges of peacebuilding evaluation. However, much of this progress has been made in individual, isolated organizations and, as a result, the field continues to face substantial obstacles to answering the question:
What brings about lasting and sustainable peace?
Better Evaluation for Better Peacebuilding
In 2010, the Alliance for Peacebuilding (AfP) and the United States Institute of Peace (USIP) recognized that the peacebuilding field would continue to struggle to answer this question unless it collectively addressed the counterproductive monitoring, evaluation, and learning practices and incentives that plagued it. Through a twelve-month process called the Peacebuilding Evaluation Project: A Forum for Donors and Implementers, AfP and USIP began to convene donors, policymakers, evaluation experts, researchers, and peacebuilding practitioners around the unique yet shared challenges of measuring, monitoring, evaluating, and learning in a peacebuilding context.
As the first forum of its kind, the Peacebuilding Evaluation Project provided a neutral and safe space for donors and implementers to honestly discuss the political and technical issues of evaluation. The rich discussions and lessons from these meetings were captured in a USIP special report, Improving Peacebuilding Evaluation: A Whole-of-Field Approach, and an AfP publication, Starting on the Same Page: A Lessons Report.
Less than a year later, the two organizations innovated again by holding the first peacebuilding evaluation evidence summit. Made possible by the Carnegie Corporation of New York, the summit was built around an analysis of nine peacebuilding evaluation frameworks. Panelists included representatives from the U.S. Agency for International Development, the U.S. State Department, the World Bank, the United Nations, the International Development Research Centre, and other major nongovernmental organizations, foundations, and evaluation consultancies from North America, the Middle East, Europe, Asia, and Africa.
Read the first Peacebuilding Evaluation Evidence Summit report
To continue these field-wide solutions, the Alliance for Peacebuilding, CDA Collaborative Learning, Mercy Corps, Search for Common Ground, and USIP created the Peacebuilding Evaluation Consortium. A whole-of-community collaboration, the Consortium develops innovative, cost-effective, and applicable methodologies that address the complex reality of peacebuilding evaluation, shares learning, and fosters collaboration, creating a new infrastructure for peacebuilding evaluation.
DME for Peace: Join a global community of more than 4,000 practitioners, evaluators, and academics focused on how to design, monitor, and evaluate peacebuilding programs. Join the conversation through the popular Thursday Talks, bi-monthly discussions among funders, peacebuilding practitioners, and evaluation experts on important issues in peacebuilding evaluation.
Learn more about:
For more information or to be involved with AfP’s evaluation work, contact Jessica Baumgardner-Zuzik, firstname.lastname@example.org.