Management, Management & leadership, Project management, Strategy, Strategy and planning

Valuing evaluation: ten top tips

Miranda Lewis explains her ten top tips for evaluation.

At its most straightforward, evaluation should be an integral part of any organisational strategy or project cycle: a means of reflecting on experience; of understanding what works and what doesn’t (and why); and of taking learning into planning and delivery. Public and political pressures on charities to be transparent and to provide value for money have increased the focus on evaluation – but this is not necessarily matched by the capacity and resources hard-pressed organisations need to undertake this kind of learning.

Based on m2 consultants’ experience of working with charities and foundations to evaluate their work – and of running evaluation training – we’ve developed ten top tips for anyone wishing to understand or evidence the value of their work.

1. Know what you want to know

Any effective learning process will have a set of questions at its heart. Making sure these are the right questions will mean you get the answers you need. Do you want to understand how your project could run more effectively? Which elements of a particular project matter most to service users? How might a new service need to be adapted for delivery in different areas? The kind of question you are seeking to answer will drive the approach required.

2. Measure what matters

Asking the right questions is important, but so is working out what matters most when it comes to data gathering. Prioritising your ‘must have’ questions over those that would be ‘nice to know’ will help you to focus energy on collecting the key bits of information that will enable you to develop your product or service.

3. Think practical

Good evaluations rely on good data. Will you survey key contacts? Speak to them? Map your stakeholders? Interview service users? Working out how long this will all take, how to do it and how to store the data is critical. An easily overlooked source of data is anecdotal feedback – if a member of staff goes to a meeting and is told that your organisation’s report changed the way something was done, is that being captured? And of course, everything needs to be aligned with GDPR requirements.

4. It’s all in the timing

When do you need to use the findings? Is it about course correction, or demonstrating impact (or both)? A process evaluation (sometimes called a developmental evaluation) sits alongside a project team, feeding back on how the project is running and gathering immediate reactions to the work. This can sometimes feed into a longer-term impact evaluation, or can stand alone – particularly for a pilot or new area of delivery.

5. It’s about the people

Service users, staff, volunteers, trustees and external partners may all have very different perspectives and understandings of how a particular service or project is being delivered, and what the priorities are. Ensuring all their perspectives are included in an evaluation will give you a much more complete picture.

6. Output or outcome?

It’s easy to count the number of anti-malarial bed nets distributed (the output) but much harder to know if they have been used in the right way (the outcome), and therefore how many malaria cases were prevented. Both types of data can be useful, but it is important to be cautious in extrapolating outcomes from outputs.

7. Contribution or attribution?

Services are delivered in a complex environment in which it is very challenging to completely isolate specific effects or firmly attribute changes to a particular intervention. It can be more helpful to think about contribution – in what ways does your service contribute to a broader shift and where does it add most value?

8. What didn’t work?

Particularly when evaluations are funder-led, there can be a sense that it’s important to demonstrate what has worked. But knowing what hasn’t worked, and why, can be even more important when it comes to designing the next stage of a programme, project or intervention. If you have commissioned an external evaluation, make sure you choose people who you are confident will feed back the full picture, including any challenging views or comments.

9. Publish and be damned?

In this transparency-focused era, we believe that wherever possible evaluation findings should be made public to help inform the wider sector – while acknowledging there will be sensitive areas around, say, privacy or brand reputation. Even where findings reveal significant challenges (and what project or service is without challenges?), putting these into the public domain demonstrates an organisation open to learning – particularly if published alongside a plan for addressing the challenges.

10. When good enough is good enough…

In m2’s experience, small organisations can feel under so much pressure to evaluate that they either get bogged down in overly complex evaluations of relatively small projects or get completely put off the whole idea, despite knowing that the learning would be useful. Stripped back to its basics, evaluation is a matter of asking the right questions, collecting the right data, storing this correctly and putting it all together in a way that adds strategic and operational value.
