Grant strategies to build nonprofit capacity
Author: Kate Sunners
Last time I wrote about not letting your data die in a dusty old corner without being analysed. By the same token, the most horror-inducing thought for an evaluator is a Monitoring and Evaluation (M&E) Framework that, once created for an organisation, isn’t regularly used and updated.
Monitoring and Evaluation Frameworks are the overarching why, what and how of assessing an organisation’s or a program’s performance against its objectives. They generally include Key Evaluation Questions (KEQs), a program logic, indicators against which progress is measured, data collection methods and timeframes, and evaluative processes and responsibilities. This means they can become quite large documents!
For all the hard work that goes into an M&E Framework to be put to good use, it’s essential that it becomes a living document - that is, one that gets updated as programs and goals evolve. For this to happen, the organisation needs to take ownership of it, and a specific position or person needs to be responsible for maintaining the document. This sometimes requires some knowledge and capacity building!
We’ve recently been working with an organisation to design and implement a new M&E Framework to replace their ad hoc data collection. It’s exciting to hear they have already made changes to the document to mirror changes in the programmatic focus areas. They have also begun to develop new KEQs and indicators! Additionally, data collection methods are being reviewed with their program delivery staff to ensure survey and interview questions will be understood by participants. This really helps to build capacity and embed evaluation practices at all levels of the organisation.
In the case of M&E Frameworks, building a Frankenstein’s monster, with pieces added on as new programs or goals are added, is a good thing! It’s ALIVE!
Does your organisation have methods in place to capture data on your programs? Great!
What about processes to analyse and use that data?
One of the things we often hear is that organisations have data from surveys, program delivery staff, beneficiary feedback, or even client records, but that it’s sitting in a folder somewhere in a deep dark corner, sad and alone and not providing useful insights to anyone.
It seems that the step many organisations miss is the essential one between collecting the data and sharing the outcomes. Analysing your organisation’s data is absolutely vital to understanding what’s working and what’s not, and to informing efficient resource allocation. Data analysis is the process of bringing together the evidence you have collected and making sense of it, in order to understand how much change has happened, for whom, and why.
Nonprofits are stretched for resources, so knowing how, why, and in what circumstances our programs are most effective is a budget saver, and quite literally a life-saver for beneficiaries.
When you’ve collected a lot of information, it can feel overwhelming and difficult to know where to begin making sense of it all. My number one piece of advice is: don’t wait on analysis! Schedule time as soon as you have all your data collected together, sit down for a large chunk of time, and review what you have.
Don’t put pressure on yourself to pull out reportable insights straight away. Ask questions in order to get an objective picture, and look at the story your data is telling you collectively - trends will emerge. Scribble some notes on what it feels like the data is telling you, then go back and look at the data again to see whether your notes are backed up by evidence. Some of your insights might leave you with more questions - and that’s great! These questions will help guide your data collection next time, or maybe you’ll decide you can answer them through a follow-up interview or survey.
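For organisations that keep survey responses in a spreadsheet or database, even a very small script can support this first-pass review. The sketch below is a minimal illustration in Python, with invented questions, answers, and field names (nothing here reflects a real client dataset) - it simply tallies answers per question so broad patterns stand out before any formal analysis:

```python
from collections import Counter

# Hypothetical survey data: each record is one participant's answer to
# one question. Field names and values are invented for illustration.
responses = [
    {"question": "Did the program help you?", "answer": "yes"},
    {"question": "Did the program help you?", "answer": "yes"},
    {"question": "Did the program help you?", "answer": "unsure"},
    {"question": "Would you recommend it?", "answer": "yes"},
    {"question": "Would you recommend it?", "answer": "no"},
]

def tally_by_question(records):
    """Count answers per question so trends are visible at a glance."""
    tallies = {}
    for record in records:
        counts = tallies.setdefault(record["question"], Counter())
        counts[record["answer"]] += 1
    return tallies

# Print a quick summary, most common answers first.
for question, counts in tally_by_question(responses).items():
    total = sum(counts.values())
    print(question)
    for answer, n in counts.most_common():
        print(f"  {answer}: {n}/{total}")
```

A summary like this is not the analysis itself - it’s the objective picture you take your scribbled notes back to, to check whether what the data "feels like" it’s saying is actually backed by the counts.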
If you don’t have someone within your organisation who is skilled in pulling out evaluative insights from your program data, talk to us. We are happy to help with every stage of evaluation, from design of monitoring and evaluation frameworks, right through to data analysis and reporting.