How We Did It: Measuring the impact of community-led approaches
Real-time data is a powerful tool for councils. Throughout the life cycle of a project, data can show what works and what doesn’t, leading to better decision making and improved outcomes.
We put Lydia Hutchings and Beth Kilheeney in our ‘How We Did It’ hot seat to hear how the Greater Manchester Better Outcomes Partnership works with young people to prevent homelessness and how the Greater Manchester Combined Authority is evaluating the project.
Interview highlights
What was Greater Manchester experiencing in terms of homelessness?
Beth: Ending rough sleeping and preventing homelessness is one of our key objectives within the Greater Manchester Combined Authority (GMCA). It’s something that the Mayor really prioritises.
The combined authority has invested in a number of high-profile projects which seek to achieve this. What’s quite a challenge, though, is preventing homelessness in the first place – getting upstream so we can get ahead of the problem. And we recognise that young people have a different set of support requirements.
What is Greater Manchester Better Outcomes Partnership (GMBOP)’s role in the project and its evaluation?
Lydia: It was identified alongside GMCA that existing services weren’t set up to specifically target young people. So, our project focuses on support for young people aged 18 to 25 and we work across all 10 boroughs within Greater Manchester.
And my role within the programme is to monitor the performance in real time. I work with the delivery partners to track and monitor all of the data throughout the programme – a young person’s journey from when they first come to our programme to when they hopefully sustain their accommodation at the 12 month point.
And we use that data to identify patterns and trends in real time. That means that we can make delivery as effective as possible and provide that feedback loop continually to our delivery partners to implement changes as and when needed.
What kind of information are you collecting?
Lydia: We collect information right from the point of referral, including information about why that young person is coming to us, where they are staying, why there’s currently a risk, information about their age, gender, sexuality – lots of information to get a picture of young people.
When they are enrolled on our service, we then gather more detailed information about that young person’s situation. When a coach is allocated to someone, they complete an initial support plan to really dig into and identify alongside that young person what it is that they need to stabilise their housing.
They ask more detailed questions about their current working situation, their income, their accommodation, mental or physical health conditions, any dependencies on substances or anything like that.
From that point on, we track all of the information on the young person’s journey – we monitor the communication between them and their coach, the actions that take place, the outcomes achieved, but also what we would call ‘added value’ outcomes. So that might be something that doesn’t fit the criteria of what we consider to be an outcome, but we recognise that it has added value for that young person.
We have a number of data points to track where that person is at each point in their journey, right up until the point of closure, so we also understand where that person has ended up after 12 months.
How did you decide on what you were going to measure?
Lydia: We have initial engagement outcomes to gather a deeper understanding of the people who have come to our service and their situation. Our wider engagement outcomes also serve as check-in points for the participants throughout their journey. We review cases on an ongoing basis to track any changes in their situation and how they are feeling. For example, we have housing outcomes because that’s the crux of what we’re trying to achieve. But we also have self-determined outcomes, which are outcomes that young people prioritise. These can be really flexible and tailored to whatever that young person identifies they need.
The way that we came to these outcomes and decided what we wanted to base the programme on was through the data that we’d collected and what those young people were saying was a priority for them.
How does this data collection intersect with GMCA’s evaluation?
Beth: There are crossovers in places, but we are trying to achieve two slightly separate things. I’m doing an evaluation of the whole project to inform future commissioning and decision making, whereas Lydia and GMBOP are tracking patterns and trends over time.
GMCA is delivering a qualitative longitudinal evaluation. To do this, we speak to young people after they have sustained their accommodation for six months, and again six months after that. In the first round of interviews, we ask them about their experience prior to accessing the service and what led them to be at risk of homelessness. Even though this isn’t about the service itself, it’s really important because it helps us understand how far upstream we can get in terms of prevention and where we need to be intervening.
We ask about the individual’s relationship with their coach and what their experience of having one has been. We talk about the self-determined outcomes that Lydia mentioned, focusing on financial stability, support networks and meaningful activities.
We speak about their perception of the programme: what they thought about it when they were first referred, when a coach contacted them, and how it has impacted them. We can assess what changes have come from their engagement with the service, but how do they feel about it? Has it been a positive experience overall? Are there things that they would have done differently? And, really importantly, we talk about their stability in relation to homelessness. Do they still seem to be at risk?
Six months after that, we talk through any changes since we last met. If they’ve disengaged with their coach at that point, which they’re likely to have done, how have they felt about that? Was that a difficult period of change? And we speak to them about whether the service intervened at the right time. Again, we can do some analysis at our end to understand if we think it did, but for the young person, do they think the timing was right?
How have young people been involved in both the delivery and the evaluation of the project?
Beth: It was really important to us to spend time scoping and planning the evaluation before jumping into it. And we did that internally at the combined authority with qualitative research experts and homelessness experts. But it was also really important to do that within the service. I went along to a coproduction panel and was able to speak to young people who use the service to understand any blind spots that we are likely to have because we don’t have first-hand experience of being part of the service.
I spoke to them about barriers that might stop them engaging in an interview for the evaluation. And I shared the questions with them prior to interviews so they could tell us if something might be a difficult thing to talk about, or suggest that we ask a question in a different way.
I also went along to a peer support session, which is run by GMBOP, with lots of coaches and I did a very similar thing there. I explained the aims and objectives of the evaluation and got their insight as experts of the service on where we were missing things or if there were things they wanted to know from young people that we could integrate.
Lydia: We frequently run coproduction sessions to ensure that we are continually involving the voices of young people in what we’re doing day to day, making sure that we’re utilising their voices in how we’re designing and adapting the service to what they need.
When a young person comes to our service, they are very much in control of how they design the support from their coach, what they are prioritising, the steps that they feel are needed for them to stabilise their accommodation. In that sense, every young person is involved in the design of their support from the very beginning.
What would you say to local authorities who are thinking about doing something similar?
Beth: Getting to grips with the service is really crucial. That might sound quite obvious, but you might not work in it day to day. If I’d gone into the peer support session with the coaches without that basic understanding, it would have really let the session down.
Engaging people to take part in an interview can be difficult. Initially we were advertising the opportunity through coaches. We had a period where we didn’t get a lot of uptake, and then GMBOP sent an email to everybody who was eligible. Just missing out that middleman increased uptake straight away. So if you’re able to, contact young people or your cohort directly.
Lydia: I think the reason we’re really effective with our data is that it is tied to effective processes that our coaches are delivering. And because we’ve got consistency in how we’re tracking all of the outcomes – we’re all using the same platform – we can also be really flexible and make changes to the data and processes as we need to.
The transparency element is also key. All delivery partners are on the same platform, which means that they all have access to the same data. We’ve implemented certain things to make it more accessible for them; we’ve created dashboards, for example, that allow them to quickly see what’s happening within their data.