Peter Stanford: Measure your impact with a splash of common sense

Detailed evidence can be expensive and unnecessary, says our columnist


It wasn't one of my happier meetings with a potential funder.

A well-known grant-giving trust (to name it risks making this article an argument about how it is currently run, and that would be a distraction) had invited me and a couple of trustees from the Longford Trust to come in and talk about our scholarships scheme.

We ran through how we make awards of financial support and mentoring to young ex-offenders to help continue their rehabilitation by going to university.

We provided case studies stretching back over six years, figures for those we had supported who then got degrees and jobs, and for those who dropped out. Pretty comprehensive, I imagined.

But we were asked: "Where's your impact measurement?" I quoted again the 'outcome' figures for our award holders (which compared well with reoffending rates for released prisoners).

Our interrogator looked irritated. But those are just your figures, she told us. Unless they are independently verified, they have no value. I took this to mean she thought we had made them up but, as politely as I could, I asked how we might 'verify' our impact figures. By working with an academic institution over a number of years, came the reply.

A couple of problems with that: first, we would be spending money we have been given to support scholars on paying academics to tell us what we already know to be true; and, second, it would take an eternity - and we would need more funding to answer the requests that would be piling up at our door from people who wanted to be on the scheme.

I think I must have shown my frustration. What I don't understand, we were told from the other side of the table, is your 'core philosophy'. That's easy, I replied, it is that education promotes opportunity and rehabilitation.

But, she asked, how do you prove it? What studies have you undertaken to prove that education has an impact?

We didn't part friends and we didn't get the grant. Ever since, I have been suspicious of what New Philanthropy Capital has recently referred to as "the explosion of interest" in impact measurement.

The think tank's latest report - which caught my eye because it looked at the pros and cons of impact measurement in the area of youth justice - highlights the struggle middle-sized and smaller charities have in providing data to show they have an impact.

Because they do not have the resources to employ specialist members of staff on impact measurement, they lose out on awards they badly need from grant-giving trusts run by desiccated calculating machines, such as the one we encountered.

Now, before you start writing in to complain, I realise, of course, that charities have to prove they make an impact.

We have a staff member at Aspire - a charity that supports people with spinal cord injuries - who regularly reminds the trustees when we are considering pressing issues in the lives of people with spinal cord injuries: "You believe it to be an issue, and it may well be an issue, but before we set up a programme to tackle it, we have to prove it is an issue."

Fair point, well made. There are too many well-intentioned projects, based on anecdotal evidence or a handful of hard cases that might not be typical, which cost a great deal and don't deliver. So, yes, let's measure our impact carefully and let's demand evidence.

The culture of impact measurement has a value. But, like all ideas of the moment, we need to approach it with caution, lest it become an end in itself or give rise to an industry that supports hundreds and thousands of consultants and academics with money that donors gave to support practical projects.

Balance is the key word. Better still, common sense, which remains one of my favourite phrases when it comes to our sector, and particularly the role of trustees.

We can't be experts in everything, and we cannot justify spending charity funds bringing in experts on everything. We have to use common sense to discern when we really do need a bit of specialist impact assessment and when we don't.

So now that my ruffled feathers are smoothed, I can see that our internal figures require some verification; otherwise I could just be making them up as I go along.

But do we need to commission an 'impact measurement' to work out if education can change lives? We might as well close down the whole schools network here and now.

Peter Stanford is a writer and broadcaster, chair of Aspire and director of the Longford Trust
