
How do we know we’re making a difference? Let’s rethink, and invest in, what we’re measuring

When we considered how to evaluate Living Well, we quickly realised the current ways of collecting and evaluating data wouldn’t meet our aspirations to understand how a person had experienced support or a change in their life - or how ‘healthy’ the system itself was in delivering that support.

We also realised that, just as we place people at the heart of deciding what they want from a new system, so too we must place them at the heart of deciding what they think is worth measuring.

Over the past four years, we’ve been creating new systems of community mental health in Living Well sites across the UK, including Edinburgh, Salford, and Tameside and Glossop. So far the programme has supported some 2,500 people who might not otherwise have been helped, and generated over five million pounds of additional investment in local mental health systems.

Inspired by a model developed by Innovation Unit in partnership with Lambeth, South London, Living Well sites have been designing, testing and learning new ways to build different and better systems of support, thanks to funding from The National Lottery Community Fund, the largest funder of community activity in the UK.

Telling the whole story

Bringing together several systems to provide holistic support threw up many problems when it came to measuring impact:

Consistency: each system collected and evaluated data in different ways

Relevance: what systems chose to evaluate didn’t delve deep enough into Living Well’s new ways of working

Context: evaluation reports didn’t always cross-reference other sources of data or come with a narrative to tell the full story

Disruption: the ability to collect data was hampered by pressures on people’s time and by COVID

Infrastructure: investment in the technology needed to collect and analyse data hadn’t been made

Skills and capacity: investment in training, and in the time to collect and analyse data and consider its impact, hadn’t been prioritised

Rather than attempting to use and improve the data currently collected, we decided to co-create a Living Well outcome framework starting from our first principles - working with people who benefit from, and who deliver, the support.

Standard approaches to evaluation focus on counting undesirable outcomes - for example how many people attend A&E - and rarely cross-reference other sources of data to build a bigger picture or indeed come with a narrative to frame the statistics.

We decided that not only would we capture data measuring positive outcomes, we would also listen to people’s experience of the system as a key part of our evaluation.

Our ‘holistic’ framework aimed to tell the whole story of both the impact on the people accessing support and the ‘health’ of the system around those people - combining quantitative data, in-depth feedback and ethnographic stories.

Creating an evaluation framework

Living Well Systems start with the people they are designed to benefit. Our research, conversations and work with people with lived experience, staff and system leaders help us understand the challenges in the current system and co-create our vision for a better mental health system.

We facilitate a collaborative space where stakeholders reflect on the story of their system, what is really happening now, and make an honest assessment of what needs to change.

Across our sites, Collaboratives were set up which included system leaders from multiple sectors, voluntary and statutory practitioners, and people with lived experience and carers. These Collaboratives held the vision on behalf of the wider systems - and crafted high level outcomes and some specific goals that matched their ambition for people’s experience and the impact of the new network of offers.

Lambeth’s ‘Big Three’ high level outcomes are one example.

From the high level outcomes and specific goals, we co-created outcome frameworks - detailed theories of change that started with high level aims, then evaluation outcomes, before finally specifying what data and intelligence would be required to demonstrate success.

Every framework focused on two areas: system level outcomes describing the impact of the new offer on service usage; and person level outcomes that provided insight into people's experience of the Living Well offer and the impact in their lives.

Making data simple but meaningful

In our impact blog, we recognise the difficulties and barriers our sites faced in gathering data and measuring outcomes and quality. We know data collection on the ground is hard to implement, and that analysing and acting on findings takes time.

Our practitioners were often the primary data gatherers on top of their daily work, which, despite best efforts, often led to incomplete data sets. COVID diminished opportunities for data collection, with heavy reliance on email, telephone and even post.

Our new ways of collecting data were underpinned by two principles:

  • limiting the number of questions we asked a person that were for the system’s benefit, not theirs

  • providing only the data that would deepen working teams’ understanding of: who they were supporting and their needs; what they were doing well; and what opportunities existed for practice improvement and system development.

To evaluate impact on people who benefit from support, sites chose from a pool of standardised, co-produced and holistic self-reporting tools - e.g. ReQoL and MANSA - and viewed these results alongside people’s feedback on how they had progressed against their own goals, set at the start of their support and measured with a goal attainment tool.

Such integrated evaluation also allowed people and practitioners to understand a person’s current situation, the areas they might wish to focus on, and their strengths and preferences - and, used at the beginning and end of support, it let people see their own successes and progress.

To evaluate the system - and to hold ourselves to account against our vision - we wanted to move from a numbers game of how many people we helped to truly understand their experience of support.

We balanced our mix of quantitative measures - including demographics, reasons for seeking support, who made the referral and so on - with stories of people’s experiences of support now and previously, eliciting the magic and miserable moments of their time with Living Well, their progress and, most importantly, insight into their strengths and aspirations, recommendations and feedback.

Wide-ranging continuous learning

Another important and intentional feature of Living Well’s evaluation culture was bringing data out of dry performance or contract management meetings with senior system leads and into the view of interested system stakeholders.

We began to build a three-dimensional picture of the system which we shared across stakeholders, helping them view the system as a ‘living’ amalgam of people and their collective endeavour and giving them confidence to design spaces to deal with issues such as responding to inequalities or handling the co-location of staff.

Such an open, transparent, warts-and-all approach to data aimed to give a shared and realistic understanding of the system - not simply to hold people to account, but mainly to inspire continued learning, innovation and improvement.

What have we learned?

As with almost everything in complex system transformation, you are never the finished article - but what have we discovered so far?

System data is more useful for transformation and collaborative commissioning than team or service data alone. As we look ahead and consider what opportunities the development of Integrated Care Systems can offer, one can hope data collection and shared data will become the norm.

Data must be accessible, transparent and understood - warts and all - by multiple stakeholders, including people using services and those with lived experience, to enable system thinking.

If we want to move to true co-commissioning models in the future, we need to demystify data and open up opportunities for transparency and discussion. There are many myths and misconceptions about what happens in organisations or offers - for example, decisions about ineligibility for services.

Holistic data, covering both activity and experience, can uncover practitioner and customer behaviour and unlock understanding of how we as humans drive demand and influence service usage.

Data at the fingertips of practitioners leads to increased understanding of real time opportunities for practice development and how the workforce can be shaped to meet local needs.

Data collection remains difficult. We need to prioritise data that measures what matters to people and bring data collection methods into the 21st century, so that data becomes a valuable resource supporting systems to put people at the heart of everything we do.

We’ve only just begun our journey to capture and lift marginalised voices as a means of uncovering and addressing pervasive inequalities in access, experience and outcomes.

We want to broaden our ambitions for local populations - so we need to start paying attention to multi-level data that describes person-centred outcomes and system level quantitative and qualitative data, and puts these in the context of trends in wider determinants and population health indicators.

When done well, data provides opportunities to make better decisions, to improve outcomes and to make a real difference to people’s lives. We need the skills and capacity to know what to collect, how to analyse and draw insight, how to act on findings - and we need the tech to support our work. Let’s invest.

 

If you’d like to talk to us about Living Well, including developing a programme in your area, then please email lwuk@innovationunit.org and we’d be delighted to start a conversation.

Living Well UK Programme is funded by The National Lottery Community Fund, the largest funder of community activity in the UK.


