March 4, 2015

Look before you leap into buying outcomes measurement software

The right software can make outcomes measurement and evaluation a lot easier. But how do you select the best outcomes measurement software for your organisation?

Social purpose organisations and social enterprises are increasingly committed to measuring and evaluating their outcomes in the community. Leading social sector organisations are making investments in information technology platforms, applications and systems to improve the outcomes measurement and evaluation process. This software promises to make outcomes data collection quicker and easier, and outcomes reporting more straightforward and illuminating.

So what do you need to consider before investing in software to measure and report on outcomes?

Four steps for choosing the right outcomes software

With many different products on the market (see below), selecting the right one for your organisation can seem overwhelming. The first three steps come before you look at your outcomes software options; the fourth and final step is investigating and piloting your choice. An example from SVA Consulting’s work with YMCA Victoria demonstrates the steps in practice.

1. Agree your objectives for measuring outcomes

The program logic[1] informs what you need to measure and evaluate to prove and improve your impact. If you want to prove your impact you need to determine who you are proving to, and what evidence they need.

In most cases this will be proving to a funder that your program or service achieves the intended outcomes. Government funders may need a multi-year evaluation using data from a statistically significant sample of participants. A private investor might be most interested in outcomes generated (e.g. additional earnings of previously long-term unemployed person working in a social enterprise) and be satisfied with fewer measures collected from a smaller sample over a shorter period of time.

Better measurement of social impact is one of our key strategic goals… because it helps us with continuous improvement.

It is likely that you will also want to measure and evaluate outcomes to learn how to improve your impact (see Stop proving; start improving!). This involves comparing the outcomes data to the program logic to identify what is and what is not working as expected. For example, if one of the priority outcomes on your program logic was to improve awareness of healthy eating options, yet 95 per cent of your participants had not improved their eating habits, it could indicate the healthy eating module of your program was not working.
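This comparison of outcomes data against the program logic can be done with very simple analysis. The sketch below, in Python, shows one way to flag outcomes where results fall short of expectations; the outcome names, figures and threshold are illustrative assumptions, not data from the article.

```python
# Sketch: flag program-logic outcomes whose indicator results fall short
# of expectations. Outcome names and figures are hypothetical examples.

def flag_underperforming(outcome_results, threshold=0.5):
    """Return outcomes where the share of participants showing
    improvement falls below the threshold."""
    return {
        outcome: improved / total
        for outcome, (improved, total) in outcome_results.items()
        if total and improved / total < threshold
    }

results = {
    "awareness of healthy eating": (2, 40),   # only 5% improved
    "increased confidence": (30, 40),         # 75% improved
}

print(flag_underperforming(results))
# flags "awareness of healthy eating" at 0.05
```

A result like this does not prove the healthy eating module is failing, but it tells you where to look first.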

YMCA Victoria’s Youth Services Innovation Manager, Paul Turner, outlines their objective: “Better measurement of social impact is one of our key strategic goals in Youth Services over the next three years because it helps us with continuous improvement. It allows us to better evaluate our programs to fine tune them and to make sure they are increasingly powerful.”

Based on this objective, the team knew that it needed to collect data that would allow tracking of progress towards targets, and also elucidate what is and is not working about the program.

2. Be clear on your outcomes

Work out the outcomes you want to achieve, and how you plan to generate them. Your impact is the sum of the different outcomes you generate. You may want to increase education levels, employment participation, or mental wellness by working at an individual level with families, groups or communities. Use the program logic approach to identify how your activities lead to the desired outcomes, and the outcomes to focus your measurement and evaluation on (see Finding the Golden Thread: a new approach to articulating program logic statements).

YMCA Victoria The Bridge Project
YMCA Victoria The Bridge Project wanted to identify the most significant changes in participants’ lives brought about by the Skills for Work and Vocational Pathways Program.

SVA Consulting supported YMCA Victoria’s The Bridge Project to start tracking the outcomes of its Skills for Work and Vocational Pathways Program. The YMCA team works with young people at risk of being trapped in a recurring cycle of crime and imprisonment. The team knew that the program was creating significant changes in the lives of the men and women involved; however, it had not pinned down what these changes were and which were most significant. The team shared anecdotes of the broad range of changes that had happened for participants as a result of the program. Some had drastically reduced their cigarette consumption, while others had stopped seeing people who were a negative influence in their lives. Others had signed up for vocational courses.

Using the program logic approach, we narrowed down to five main outcomes that the program was helping participants to achieve: increased confidence and self-belief; healthier lifestyles; improved relationships; more engaged in education or employment; and reaching full potential. These outcomes capture the essence of the changes that the program is generating.

3. Identify what to monitor, and how

The third step is to agree on a realistic plan for how you will demonstrate that the outcomes (identified in the program logic) have been achieved. You will need to think about what you can measure or observe to know how much progress (if any) has been made towards the outcomes. This is also the time to take a reality check and consider: What level of data provision and data entry will your clients put up with? What will your staff be able to fit in? What do you need to report to your funders?

We worked with YMCA to identify two to three indicators per outcome and set targets for the most pertinent indicators.

There were a number of different approaches that the YMCA team could have taken to monitor the five main outcomes. For each outcome we considered whether there was a direct way of indicating if the outcome had been achieved (e.g. attendance at work to indicate that participants were more engaged in education or employment), or an indirect way (e.g. observation of the participant’s conduct at work). The nature of the program (a five-day course spread over one to five weeks) meant we needed to be realistic about the extent of change we could expect to occur within the timeframe of the program overall, and the amount of time available to capture data on these changes.

We worked with YMCA to identify two to three indicators per outcome and set targets for the most pertinent indicators. For example, for the outcome ‘more engaged in education or employment’ the indicators selected were having a current resume, getting a driver’s licence, establishing a daily routine, taking more care with appearance, and education or employment status. All but the last indicator are foundational components of engaging with education and employment, and are feasible for the short program to generate. YMCA aimed for 50 per cent of graduates to increase their level of engagement in education or employment three months post-program, reflected by at least one of the indicators being achieved (e.g. writing a resume).
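A target of this kind, at least one indicator achieved by at least half of graduates, is simple to check once the data is collected. The Python sketch below is a minimal illustration of that arithmetic; the field names and cohort data are hypothetical, not YMCA’s actual records.

```python
# Sketch of indicator-based target tracking: a graduate counts as
# "more engaged" if at least one engagement indicator is achieved
# three months post-program. Indicator names and data are illustrative.

INDICATORS = ["current_resume", "drivers_licence", "daily_routine",
              "care_with_appearance", "in_education_or_employment"]

def target_met(graduates, target=0.5):
    """True if the share of graduates achieving at least one
    indicator meets or exceeds the target."""
    engaged = sum(
        1 for g in graduates if any(g.get(i) for i in INDICATORS)
    )
    return engaged / len(graduates) >= target

cohort = [
    {"current_resume": True},
    {"daily_routine": True, "drivers_licence": True},
    {},  # no indicators achieved
    {"in_education_or_employment": True},
]
print(target_met(cohort))  # 3 of 4 graduates engaged -> True
```

In practice this kind of calculation is exactly what outcomes software automates, producing it per individual, per cohort and per location rather than by hand.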

4. Investigate and pilot software

Once you know who you need to collect data from and what you need to collect, what analysis you need to do with this data, and what reports your audience needs, you are ready to choose and trial the software that best fits. Being clear about these parameters will point you towards the outcomes software that is right for your organisation.

Precious time and resources can be saved by piloting one or more products in a part of your organisation before refining and rolling out more broadly.

There are practicalities you need to consider when weighing up the options (see the Table below):

  • Can the software be integrated with your existing IT systems?
  • Can you invest in the hardware needed to get most value from the software?
  • What is the capacity of the software in relation to the size of your client base?
  • What are the cost implications of increasing use of the software as you grow?
  • Are your staff sufficiently tech-savvy to use the software?
  • Can they get support to use the software effectively?

Precious time and resources can be saved by piloting one or more products in a part of your organisation before refining and rolling out more broadly. You might want to assess if this software is going to meet your specifications off the shelf, or how the software can be modified and implemented to meet your needs.

By working through the first three steps, YMCA Victoria knew it needed software that could demonstrate the five outcomes that the program had created and provide insights into areas for improvement, such as course structure, course duration and targeting of participants. To do this the team required a product that could collect data from participants and staff, and produce regular reports that were quick and easy to understand.

The technology opened up the possibility of collecting lots of data quickly and cheaply.

YMCA chose to pilot the Socialsuite platform. YMCA’s The Bridge Project Manager, Mick Cronin, explains: “During the pilot, we collected data from participants on every day of the seven courses using the Socialsuite platform. This ensured that we captured the immediate outcomes of the programs, and that data collection was comprehensive. Throughout the courses we reviewed the data collected in real time. At the end of the pilot, the team [SVA Consulting, YMCA Victoria and Socialsuite] met to make sense of the evidence collected to start to draw conclusions about how and why the courses made a difference.”

In particular, the team identified differences in the outcomes of participants undertaking the intensive five-day and the weekly five-week courses, and in different locations across the state.

The technology opened up the possibility of collecting lots of data quickly and cheaply. Initially, the team was enthusiastic to collect a wide range of data and was diligent in sticking to the original plan even though it was challenging. After the pilot, the team cut the number of data items back to the essentials for each of the main outcomes and reduced the frequency of collection for some. This made the post-pilot data collection approach more sustainable and targeted.

… we identified that the reports needed to clearly… show the relationship between the outcomes achieved and the targets set for the outcomes.

The pilot reporting was useful in showing what outcomes had been achieved for individuals, course cohorts and in different locations. To assist the team and others to understand the story of the changes generated by the program post-pilot, we identified that the reports needed to clearly spell out the situation for participants pre- and post-program and show the relationship between the outcomes achieved and the targets set for the outcomes.

This collaborative effort won the Social Impact Measurement Network of Australia’s 2014 Excellence in Innovation in the Social Impact Measurement Award.

What software exists for outcomes measurement and reporting?

SVA Consulting has investigated a selection of outcomes measurement software for social purpose organisations. We have trialled them, spoken with company representatives and their clients, and had demonstrations. There are some similarities between the products: all are cloud-based, all generate reports, and most integrate with other systems to import and export data. The main distinctions between the systems are:

  • Who enters data,
  • How it is structured, and
  • What it can do.

The following table shows the diversity of the software available. This is not an exhaustive list. What’s important to remember here is that it is not the number of boxes ticked that matters, but whether the product is the right match for your needs.

 

[Table: outcomes measurement software for social purpose organisations]
Note: Software is updating and improving all the time. This table is current at February 2015.

A further distinction is that some software systems are tied to proprietary outcomes measurement frameworks: the Outcomes Star Online is linked to the Outcomes Star case management tool; and Results Scorecard is linked to the Results Based Accountability methodology. Sinzer and Social Return Intelligence Suite are particularly relevant for organisations that would like to calculate their Social Return on Investment.

Next steps: let’s share and learn!

Outcomes software has the potential to greatly improve any outcomes measurement process, but unless you work through the first three steps, you are unlikely to design the best outcomes framework and therefore implement the best software. Being clear about your purpose for measuring outcomes, what the outcomes are, and what to monitor, is critical.

Outcomes measurement software is an emerging and fast growing area – including the number of options on the market, the capability of the options, and the ways that they are being used. If you have made the leap to using outcomes software we are keen to know what’s working and what’s not working for you. Please share your experiences below or email us at consulting@socialventures.com.au.


Acknowledgement

SVA Consulting would like to thank the following people for their contributions to this article: Paul Turner and Mick Cronin (YMCA Victoria), Greg Simmons (Blackbaud), Dana Fox (Athena), Adam Luecking and Maya Romic (Results Leadership), Jasna Tesevic (Anicha Consulting), Marlon van Dijk and Menko Busch (Sinzer), Clara Ong and Damian Hadja (Socialsuite), Anshula Chowdury (Social Assets Measurement), and Jack North, José Antonio Nuño Pérez and Liam Gage Brown (SVA Consulting interns).


Endnotes

1. Program logic describes the logic of a project or program. It consists of program logic statements which articulate how the activities lead to the consequences that will drive the outcomes that you want to achieve. For more on program logic, see Finding the Golden Thread.
