We recently gave written and oral evidence to the House of Lords Select Committee on Charities. This was a crucial opportunity to ensure the views of small and medium-sized charities were represented in any discussions about the sector's sustainability and governance.
My oral evidence session focused on measuring impact and value. It seems to me that the bar for the third sector is often set higher than that demanded of other sectors. Small charities in particular face an ill-informed presumption that they’re not good at reporting or measuring their impact.
Why is this the case and how should those organisations that commission services from charities approach impact measurement in their contracts?
The charities that apply to us for funding all have systems for measuring the outcomes they aim to achieve with the people they work with. But although measuring impact is a prerequisite for our funding, we certainly don’t insist on how they do it, so we see different approaches and methods that tell different impact stories.
We ask them what they’ve achieved as part of our grant monitoring and – if they want it – we offer support so they can improve their approach and, ultimately, their service. But our approach is not about imposing accountability or treating charities with suspicion. Our focus is on encouraging a learning culture, where charities are supported to track success and failure, ask for "customer" feedback and act on what they learn to improve service.
Over the years I’ve met countless small charities and I am yet to meet one that isn’t interested in improving impact measurement. But they struggle to find any money to fund formal evaluations, so they focus on measures that relate directly to their day-to-day work. Many use industry-recognised methods, such as the Outcomes Star, that create a different, nuanced conversation with their beneficiaries at the point of "sale". Most public and private businesses would give their eye teeth for that degree of bespoke customer experience.
There is no doubt that, as a sector, we need to be much better at creating and recognising a more nuanced impact picture. In my evidence I cited three very different approaches from charities we fund.
One25 supports sex workers in Bristol and reports very hard outcomes in terms of housing and women stopping sex work.
Landau in Telford supports the integration of people with learning disabilities into mainstream life and reports a qualitative story of soft outcomes – like the young man who travelled on a bus independently for the first time.
Twenty Twenty in Loughborough supports disadvantaged young people into employment and education and has opted for a social return on investment model. It worked with Leicester City Council to demonstrate that every £1 invested generates a return of £5.
All are legitimate but none fits into convenient pigeonholes that can be aggregated up or commissioned down.
And that’s the problem. From speaking with small and medium-sized charities, we hear their experiences of working with commissioners obsessed with standardising how they measure impact across diverse services. Where accountability for local provision ends up being outsourced to a small charity, commissioners frequently insist on a payment-by-results contract with often damaging effects on the charity’s ability to deliver local services.
Payment by results might work for straightforward services such as meals on wheels or hours of domiciliary care, but not for those organisations working on complex issues. It's why effective domestic violence funding focuses on an individual’s holistic needs and works to address multiple underlying issues, rather than viewing domestic violence as a housing issue. And it's why it’s a nonsense to link payment to "unique contacts" – ignoring the fact that these problems are rarely fixed once.
Society can’t afford to have these services, or the impact they have on local communities, trivialised in this way. We need to measure impact through the eyes of the person receiving the service, just as a good business would through the eyes of its customers.
If we – local authorities, commissioners and funders – have done our due diligence correctly, we should have identified the organisation that is best placed to deliver the support individuals need. We need then to trust it to use its personal knowledge and experience of service users to decide the most appropriate impact measures. Good grant-making works that way.
Perhaps the question the House of Lords should be asking of all charities is this: whose impact measures should count? And it should recognise that the story small local charities present will look very different to that of large nationals. Perhaps then peers will see no shortage of great charities, small and large, bettering lives.