Three tough education challenges we’re all grappling with

Molly Jamieson Eberhardt   |   April 6, 2018   |   5 Comments

What’s on my mind after a week at the CIES Annual Conference

The three things I found myself worrying about after the intellectual blitz of CIES had already been on my mind for quite a while. But a week with colleagues from around the globe confirmed that these sticky, complex challenges are facing us all — and that we’re collectively charging ahead and looking for solutions, despite the strain.

First, we worry the evidence and tools we’re generating aren’t getting into the hands of the right people, in the right way, to have real impact.

The amount of time and money we spend generating evidence dwarfs the resources we allocate to making sure it’s useful. How authentic is the demand for the evidence we generate? What information exactly is needed to inform the decisions of the people who will use it? To improve results? When do they need that information, and in what format? This is about more than just thoughtful dissemination.

This is about generating evidence that is needed, that has an audience who is willing to use it, and that is produced at a time and in a fashion that will help that audience make decisions. Of course, supply can sometimes create demand, so we shouldn’t just sit around waiting to be asked to generate evidence. But before we start collecting data, we should have a pretty good sense of who might be able to benefit from it — and we should at least have had a conversation with them about what will make it most useful. It helps when funders are willing to dedicate significant resources to pre-, during- and post-research efforts to engage users of evidence.

It was clear at CIES that post-research consultations (dissemination of findings and recommendations) are very common, consultations both before and after evidence is generated are less common but increasingly the norm, and least common of all is ongoing consultation and collaboration throughout the evidence generation process. The latter is what we need to be aiming for.

For example, our team is working with the International Step by Step Association (ISSA) to interview stakeholders in 15 countries before beginning the process of developing a tool to help policymakers understand the status of and better support the early childhood workforce in their countries. This is a good start, but we should hold ourselves accountable to this kind of consultation not just before and after, but throughout our evidence generation process—we and others are up to the challenge, but it will take significant additional time and effort, and we all need to get more comfortable with that trade-off.

Second, when we develop measurement tools, we struggle to balance practicality with validity.

Learning assessments, disability screening tools, classroom and school checklists: we know these tools need to be valid and reliable, but when they are being used to inform instruction and school management, they also need to be practical for teachers, schools and their supervisors to implement. How can we strike this balance without giving up on formative assessment or rigor? The well-intentioned quest for fit-for-purpose measurement tools has produced a dichotomy between practical tools without proven validity and rigorous tools that are only usable at small scale by trained enumerators. If there were an easy answer to this one, we’d have thought of it by now. Nonetheless, many of us are finding ways to assess whether we can maintain rigor with practical tools (see a study we did in Kenya in 2016 with Uwezo and the Australian Council for Educational Research comparing the validity of Uwezo’s short, easy-to-administer reading test with a more exhaustive assessment of proven validity).

Third, we want to work more closely with government officials to achieve the scale and sustainability we know is needed, but political realities can be challenging.

I heard from dozens of people at CIES about their partnerships with governments at the national and sub-national levels. There is more innovation than ever being fostered within government education systems. But the fragile nature of politics and bureaucracy remains anxiety-producing. Will the commitment of this minister outlast the individual’s term? Will the reform we’re helping to develop be implemented? Will the resources actually be allocated when the time comes to scale up? When sustained progress is so desperately needed, this uncertainty can be hard to accept.

It’s not that working outside of government systems is straightforward and predictable; the challenges are just different. Government officials are working to deliver results within a complex constellation of players and variables — and our work with them is typically only one small star. Nevertheless, I hear general agreement that the risks of partnering with governments are outweighed by the potential benefits, and good advice rang through the halls of CIES — encouraging us to prioritize political economy analysis, understand from partners where there are windows of political opportunity, and engage with officials at multiple levels to ensure sustainability.

The good news is none of these challenges seems to be slowing us down. But as we charge ahead, let’s continue to talk about how we’re grappling with these issues. I hope you’ll share your comments below.

Comments (5)

  1. Palwasha May 14, 2018 @ 12:46 am

    Great piece and advice. Allow me to say that the situation gets even more dire in the case of fragile states in conflict. Both the government and DPs are constantly adjusting and balancing service delivery and quality improvement in an environment beyond their control and restricted with challenges at all levels.

  2. Marco May 10, 2018 @ 11:06 am

    Hi Molly,

    The link to the 2016 Kenyan study seems to be broken. Is there any other way I can get access to it?

    Best,

    1. Kelly Toves May 22, 2018 @ 6:12 pm

      Hi Marco,

      The Center for Education Innovations site was down for a short period for maintenance. However, it’s back up and running. You should be able to access the study here.

      Best,
      Kelly Toves, editor of R4D Insights

  3. Maria S R Tokwani April 17, 2018 @ 7:33 am

    This is true Molly. I have attended these conferences for the past four years and the issues discussed above are true. These are three big research areas which all of us need to research in our respective countries or regions.

    1. Molly Eberhardt April 19, 2018 @ 3:54 pm

      Maria, thanks so much for your comment. I’m sure through your work at World Vision and UNICEF you have run into both obstacles and successes in addressing these challenges. Would love to hear about some of the successes you’ve had in your work in Zimbabwe or elsewhere.


