Five tips to get the most from your in-house evaluation

By Morris Hargreaves McIntyre


Lorna Dennison, senior consultant at Morris Hargreaves McIntyre, shows us some simple ways to stop evaluation from being just a tick-box exercise and turn it into rocket fuel for future improvement.

This year cultural organisations have been forced to do things differently, to rethink their relationships with their audiences and challenge their assumptions of what’s important to them.

In response, we’ve seen an explosion of creativity and experimentation from the performing arts, museums, galleries and beyond. As a sector we’ve rapidly evolved from dumping collections online to creating bespoke cultural offers that embrace the restrictions forced upon us.

But how can you tell whether your hard work is paying off, and why your creative experiments did (or didn’t) work? How can arts organisations learn, build and improve?

Evaluation is sometimes viewed as a box-ticking exercise but, if approached in the right way, it can be rocket fuel for future improvement.

At Morris Hargreaves McIntyre (MHM) we’ve seen a big increase in evaluation briefs but not every idea needs, or has the budget for, a full-scale third-party evaluation project.

If your organisation is planning to evaluate your creative projects in-house, here are MHM’s top tips to get the most out of your efforts:

1. Start thinking about evaluation early

That’s not to say it’s impossible to evaluate after the fact — it happens often (and we’ve certainly done it a few times when that was the only option). But it is always easier, better — and usually cheaper — if you’ve planned it from the beginning.

2. Start by asking — what are you trying to achieve?

Try to look beyond merely recording what happened and instead consider how you can apply the evaluation findings. Did you prove your hypothesis? How can you improve your practice?

But, also, remember to make space for unintended outcomes: allow an opportunity for people to talk about experiences that you might not have anticipated.

3. Take an audience-focused approach

You must understand your audience’s needs when planning your methods. Make sure there is a diverse range of voices in the early stages of design, provide alternatives for access needs, and don’t assume digital is everyone’s preference (or even an option).

In some cases, there is also a duty for co-creation — ensuring evaluation is done with your stakeholders, not to them. For example, for a programme that involves young people as invested participants, they should have a voice in how the evaluation is delivered.

Top tip: balance the ask of the evaluation with how the respondents have been involved. It’s fair to expect participants in a year-long programme to give you 45 minutes of their time to feed back. For one-off event audiences, the ask should be much smaller.

4. Mix your methods to cover a range of outcomes, efficiently

There are many valuable guides and frameworks to help you structure this (we use a strategy tree and method matrix, but there are also theory of change models and others). Do some research to find the best one for you.

Mixing qualitative and quantitative methods is almost always vital (even if it’s just including some open questions in a survey) to ensure you have both robust and rich data.

Think about what options you already have for data collection and piggy-back where you can: adding a couple of questions to an existing survey, sign-up form or discussion will save you effort and duplication.

5. Push for objectivity, challenge your assumptions

The ideal is full independence — an external evaluator (like us) who’ll consider the big picture from an unbiased perspective and draw clear, insight-based conclusions. If that’s not possible, keep notes throughout of your own thoughts and assumptions — write them down early so they don’t influence you later without you realising.

Appoint someone more distant as a critical friend to ask the tough questions and challenge your potential bias. Look at your data with a critical eye, and take care not to infer what isn’t there (particularly with small sample sizes).

Good luck!

Lorna Dennison, senior consultant, Morris Hargreaves McIntyre 

Resource type: Articles | Published: 2021