One of my earliest experiences as a new aid worker in 1998 involved a monitoring visit in northern Liberia. I had gone to check out progress on work to help returning refugees from the civil war rebuild their livelihoods. In particular, I was asked to find out why a project that provided pigs for breeding and onward distribution was not going well in one town, despite being largely successful elsewhere.
The mosque I saw upon entering the village proved to be the big clue to the community’s lack of interest in the gleaming new pigsty.
It was great that we had followed up to explore that anomaly in the monitoring data for the project, but if only more time had been taken at the outset to understand the different contexts and to adapt the work accordingly.
In the 20-plus years since then, I joined Save the Children UK, living overseas or rushing in and out of the latest emergency, and then took a variety of roles elsewhere focused on using evidence to help organisations get better at designing and learning from their work. As my perspective broadened from projects to organisations, I worked with Bond to help its members step up to the challenges of the "results agenda" and with the Independent Commission for Aid Impact, scrutinising the quality of the UK government’s aid spending.
Over time, my understanding of what drove successes and failures in learning grew. I was particularly struck by three things.
First, the risks of focusing too much on the next logframe target and donor report at the expense of discussing and reflecting on whether we really are making the difference we set out to make.
Second, the ability of totally dedicated and committed people to get pulled away from the "non-urgent-but-important" task of learning because of more urgent day-to-day demands.
Third, the importance of leaders who always ask why, focus on the end results and seek understanding of how to do better, rather than blame.
Now I find myself back at Save the Children UK in the newly created role of director of evidence and learning. It's the first time the organisation has had a senior role dedicated to ensuring that everything we do is more evidence-informed and to strengthening our learning culture.
In an organisation that spends more than £300m a year, as part of a global Save the Children movement spending more than $2bn (£1.5bn), using evidence and learning to squeeze more impact for children out of each pound can make a huge difference. This won't be a quick or easy journey, but I'll be guided by two broad principles.
First, make everything about not just what we did, but what happened as a result of what we did, getting feedback, reflecting and changing as necessary. We need to remember that our KPIs, reports, monitoring and so on are materials to pique our curiosity and provoke conversations.
Second, listen to those who use – or should be using – our evidence and what their needs and constraints are. No one actively wants to not learn, and most people in our sector are hugely motivated by the prospect of making a difference. But we need to get smarter in thinking about behaviour and design principles – in addition to things such as methodological rigour – to maximise the chances that evidence really is used to drive improvements.
Throughout this next phase of our learning journey, we’ll share what’s going well and what’s not, and will proudly work with other organisations and donors who we in turn can learn from or who will support us along the way.
Michael O’Donnell is director of evidence and learning at Save the Children UK