Evaluation & Adaptive Learning

Results for Development works with health, education and nutrition practitioners to apply a range of monitoring, evaluation, research and learning methods, and generate timely evidence to help programs become more effective and achieve greater impact at scale.

The Challenge

Funders, policymakers and program managers are constantly looking for ways to design, evaluate and replicate successful development programs. They want answers about what works and, even more importantly, what doesn't, why and how, to inform policy debate, program design and program implementation.

But this requires a commitment to learning, and learning is not as simple as it sounds. Many programs strive to learn from their daily implementation experiences, but they often do so in an ad hoc manner, making it hard to systematically translate those learnings into program design or global knowledge. Programs that use traditional monitoring and evaluation (M&E) approaches may generate invaluable information on whether a program worked, but this information may not be available until the end of a program cycle, and with so many variables at play, it may be unclear why the program worked and whether it can be replicated in other contexts. This leaves little opportunity to act on the findings during design and implementation, or to answer tough development questions.

Our Approach

R4D acts as a strategic learning partner for programs, helping them design rapid interventions, measure progress, analyze results, synthesize learnings and, ultimately, recommend ways to redesign elements of the program based on the evidence gathered. We support program needs with monitoring, evaluation, and structured learning opportunities, and place experimentation and rapid feedback cycles at the heart of program design, pilot phases and scale-up efforts. Each engagement is tailored to the needs of the individual program and involves some or all of the following stages:

  • Understand problems: Diagnosing performance challenges and defining impact goals.
  • Identify solutions: Reviewing existing evidence, researching potential solutions and defining a defensible theory of change for the program or intervention.
  • Design and experiment: Prototyping, piloting and refining solutions based on ongoing and regular qualitative and quantitative feedback.
  • Incorporate learnings: Presenting experiment findings and their implications for program refinement and redesign.

Our engagements involve regular Learning Checks, where we come together with our partners to review findings from the data collected. These give partners the opportunity to reflect on the learning activities, brainstorm and plan how the findings can refine future implementation, and iterate accordingly. There may also be opportunities to share findings with peer programs.

R4D’s approach is unique and grounded in practical implementation experience:

  • We prioritize learning questions where there is the greatest chance for impact.
  • We use the most appropriate methods for the stage and needs of the activity.
  • We focus on making data actionable.
  • We combine analytical and research skills with practical implementation experience.


R4D Digital Communities

We create and support global communities of innovators, funders and policymakers for continuous and iterative learning, knowledge generation, exchange and collaboration. Explore some of our communities: