Adaptive Management for Education in Unstable Settings: What will USAID’s New Guidance Mean for Education Projects?

Following up on our first blog on adaptive management and emergent theories of change for Goal 3 activities, we are highlighting a recent update from Melissa Patsalides (Acting Deputy Director of USAID’s Office of Learning, Evaluation and Research) on innovative policy and guidance developments in the Agency.

“Adaptive management is something many USAID staff and implementing partners have intuitively felt the need for, and sometimes implemented in spite of apparent bureaucratic constraints, for several years now.

The Agency is increasingly working in countries that are unstable and in transition. And even in more stable environments, we cannot always reliably predict how events or circumstances will evolve and impact our programs. As a result, USAID’s traditional management approach, which assumes that we can foresee, with some certainty, how a country or sector will change over time, is inadequate.

We are giving much thought to how adaptive management can support the Agency’s work to achieve more effective development results. It will require strategic plans and project designs as well as procurement processes and budgets that facilitate adaptability. All of this is needed to enable us to respond to new and changing circumstances to get the best results.

A central focus of the upcoming revised ADS guidance on the Program Cycle is adaptive management. Through PPL’s extensive engagement across the Agency to identify how we should change the policy, staff shared issues that hampered their ability to incorporate adaptive management, and discussed ways to address them in the new guidance.”

Melissa identifies key institutional issues being addressed in the new Agency guidance, including procurement mechanisms, engagement with contracting officers, and the use of monitoring and evaluation to prioritize learning and adaptation rather than accountability.

USAID ECCN is working to provide resources, guidance and tools to support adaptive management and an emergent theory of change for education in crisis and conflict environments, and we seek your collaboration in developing and sharing these resources over the coming months.

Melissa challenged USAID staff and implementing partners with the following question:

What have you learned from experiences with adaptive management that could benefit others at USAID and our partnerships as we increasingly use that approach?

What is your response to this innovative guidance? Please share your own ideas and related links, research, cases, and evaluations that exemplify how adaptive management and an emergent theory of change can work for education programming in crisis and conflict.

A wonderful and joyful holiday season to our USAID-ECCN colleagues.  May you enjoy these innovative nuggets from recent developments at USAID.

3 comments on “Adaptive Management for Education in Unstable Settings: What will USAID’s New Guidance Mean for Education Projects?”
  1. Ash Hartwell says:

    From Christopher Maclay and Mercy Corps:

    Response to Melissa’s intro blog
    We’re really excited to hear about this initiative, and fully support the PPL in efforts to explore how USAID can better integrate an adaptive management approach into its work.

    To start with, I would like to re-quote two fantastically important sentences from Melissa’s blog post: “In past years, the Agency has focused on monitoring performance primarily for accountability purposes – to the Congress and other stakeholders. This is important but can eclipse the power of collecting information that informs effective implementation.” At Mercy Corps, we’ve been trying to make sense of the widely differing objectives of M&E, and recognize that there seems to be a spectrum between using M&E to ‘prove’ an intervention versus to ‘improve’ it. While M&E certainly started out as a method to monitor, evaluate, and improve our interventions, as Melissa outlined above, over the past few decades it has almost certainly become a method to prove that we did what we did, and the impact it achieved. That is a noble task in itself – it provides ‘accountability’ that activities and outputs occurred, may minimize risk, and builds an evidence base of best practices – but what it misses is the feedback loop to improve implementation; ‘implementation’ becomes separated from ‘M&E’ or ‘learning’.

    Mercy Corps has recently been working to see how we can best build those feedback loops back into M&E, and into the way we structure teams and programs. Our recent guidance on adaptive management recognizes a few key pillars: Culture, People, Tools, and Enabling Environment (see the full paper here). The Enabling Environment component in particular recognizes the role of the donor. While M&E tools exist in themselves, the tools we deploy, and the way that we deploy them, are often based on donor requirements. When M&E is used only to ‘prove’ what happened, the tools deployed are unlikely to provide information in the way that is needed to inform quick programmatic iterations that ‘improve’ delivery. Notably, in complex contexts with consistently shifting landscapes, we cannot foresee all futures and thus come up with perfect plans in advance, so methods to ‘prove’ that we did what we said we were going to do are unlikely to be suitable. The problem has likely changed, as has the necessary response. What we need is an M&E system that allows us to ‘improve’ and respond to that context.

    Mercy Corps’ recent experience in Liberia on the OFDA-funded Ebola Community Action Platform (ECAP) is a good example of an M&E system built to improve delivery. OFDA did not require extremely detailed activity reporting, but focused on high-level outputs, providing some flexibility at the point of delivery. This meant that we could use the M&E system to provide information on knowledge, attitudes, and practices from across the country (specifically, almost 12,000 surveys per month), allowing for iterations on health messaging and methods of message delivery. This data also gave OFDA confidence that we knew what we were doing, so we could justify adaptations based on reasonable evidence.

    Providing a conducive ‘enabling environment’ is not necessarily easy, though. Many implementers are used to reporting to donors through a narrow range of tools that emphasize counting outputs and limit flexibility to change strategies. They may be uncomfortable proposing alternative M&E models unless explicitly invited to do so. Donors can be – and may indeed need to be – proactive in encouraging this. That’s why we so welcome this blog post and the efforts to explore adaptive management through the ECCN. This approach is a big shift from the typical USAID approach. While it should be lauded, implementers are unlikely to be fully equipped to take it on without proactive messaging, guidance, and collaborative communication between donors and implementers. Let’s be honest: implementers can be scared of donors, and to encourage uptake of such approaches, we need to be reminded that it’s okay.

  2. Medugu Sylvanus Stephen says:

    Adaptive management will provide a unique platform for cultural appropriateness.

    • Ash Hartwell says:

      Thanks for your comment and observation, Stephen – a key element of adaptive management is providing a support system for regularly listening to, and acting on, the explicit and tacit voices and information from those providing front-line services (e.g., teachers) and from the learners themselves. This should make cultural perspectives, relationships, values, and practices an integral part of the ‘project’ or program.
      This is well described in a chapter by Nan Wehipeihana and others, ‘Cultural Responsiveness through Developmental Evaluation: Indigenous Innovations in Sport and Traditional Maori Recreation,’ in M.Q. Patton’s latest book (2016), ‘Developmental Evaluation Exemplars: Principles in Practice.’
      Do you have examples of this practice from your experience?
