Measure impact, learn and share
We know the impact that volunteers and volunteering can have. But how best to demonstrate that?
In this section there is guidance on how to plan the evaluation of your volunteer programme and top tips on how to communicate the impact of your volunteers. Get Out Get Active local delivery partners also share their experiences. You’ll hear about the impact of their volunteers and the valuable role they play in reaching the least active people and supporting disabled and non-disabled people to be active together through the Get Out Get Active programme.
You will find top tips on measuring the impact of volunteers, guidance on communicating that impact, and stories to read, watch and listen to about Get Out Get Active volunteers across the UK.
Top Tips on Measuring the Impact of Volunteering
Being able to showcase achievements with a great story is important: it helps you get the best results for volunteers and demonstrates the difference you make to people’s lives. This section sets out some tips to help you evaluate your volunteer programme.
Planning is by far the most important consideration. Research evidence-based planning approaches such as ‘theory of change’ – this process is often used with social interventions to describe the change you seek to achieve and the steps involved in making that change happen. These approaches enable you to connect your activities to your desired outcomes, impact and long-term goals, which is useful for developing a measurement plan. But do be open and collect information in whatever format it comes – flexibility is key and will help you gather information about outcomes and impact you may not have anticipated.
It is equally important to develop a statement of purpose and consider why you are collecting information, because this can inform decisions you make further down the line. For example, if you want to measure impact to support income generation, you may decide to approach measurement in a way that is likely to appeal to commissioners (e.g., using validated, industry-standard scales proven to measure things like wellbeing), whereas if you want to develop a volunteer programme, the questions will be quite different and may be more open and exploratory.
Select the right tools. There are a number of tools to assist you in the learning process. Some are designed to assist you in the planning stage, while others measure ‘distance travelled’ and therefore require baseline assessments and systems for follow-up. It is important to carefully consider in advance the evaluation question as this will profoundly influence which tool you decide to use. When selecting a tool consider the type of intervention, the timing of the evaluation, the skills of assessor(s), the resources available, the audience of the evaluation and the relative importance of the programme and assessment. The Inspiring Impact Resource Hub is a great place to find out about the tools available.
Involve volunteers. Put simply, involving volunteers in impact measurement yields better results. Volunteers are often seen as the subjects of evaluation rather than as agents in it. Volunteer involvement could include crafting evaluation questions, collecting data and reviewing the findings.
Be proportionate. This enables you to focus on the information that is practical and relevant to decision making. Don’t try to collect everything about everyone: you risk overburdening your volunteers and ending up with superfluous information that you never use or that is too difficult to analyse. It is important to think about what is realistic and achievable within the resources and scope of the programme.
Think about all the different types of data you can collect, and where from. Adopt a 360-degree approach, inviting feedback from as many stakeholders and perspectives as possible. This will help to give a rich, detailed and well-rounded picture of impact. Use a combination of quantitative (numbers) and qualitative (stories) research methods, and think about using both open and closed questions: open questions allow respondents to express the issues most pertinent to them, while closed questions let you aggregate data for statistical purposes.
Be objective. Always be open, honest and transparent. Ask the difficult questions: what would have happened anyway, and what would have happened if your programme didn’t exist? There are so many factors that can influence an outcome, and it is important that you can demonstrate your accountability and role. One way to explore attribution is to benchmark against existing data sources or comparable groups and organisations. This helps to contextualise your findings and makes your data more meaningful.
Use evaluation findings to review and improve what you do. Use data as a learning tool. Ultimately, this helps to improve the quality of services for everyone.
Speak to experts and peers. Evaluation can be complex and requires a wide range of skills. Talk to academics and evaluation experts if you can. You may also find it beneficial to speak to organisations similar to you about what they’ve done.
Professional judgement is key. Data collection should never override or replace professional judgement. After all, you are the experts and know your volunteers better than anyone!
Finally, some further points to consider:
- Draw on the existing evidence base and contribute to building it
- You may want to consider sampling techniques
- Understand research ethics (e.g. permission, use of data, right to withhold / withdraw etc.)
- Keep it simple
- Give thought to how you present the evaluation (e.g., data visualisation, a compelling narrative)
- Make it accessible – use innovative techniques such as audio and video
- Consider whether to commission an external evaluation