Learning and recommendations
This section summarises key learning and recommendations for teams and organisations taking forward outcome-focused work.
Clarifying and mapping outcomes
Developing a theory of change and outcome map can be challenging. Breaking the process down into manageable steps really helps. Useful activities include:
- Use a tool like the ISM Behaviour Change Tool to understand the context in which you work and how your service adapts to threats and opportunities.
- Ask everyone involved in the initiative ‘what does success look like to you?’ and use the answers as a starting point for mapping your logic.
- Capture success stories and use them to understand how your initiative contributes to improved outcomes.
Effective outcome maps express what the initiative does, not how it is evidenced. Leave any considerations about what is measurable until after you have clarified your logic.
Because contribution analysis (CA) is a new approach to evaluation, it is important that teams understand it and why it is a robust way to evaluate personal outcomes approaches. It is also important to share this understanding more widely across the organisation, so that strategic leads, information specialists and others are confident in and comfortable with the approach.
Putting the approach into practice requires practitioners and managers to adapt their practice to:
- Capture different kinds of data
- Build in systematic processes of reflection, analysis and learning
Engaging people at an early stage in the process is important and leads to a high level of commitment to the process.
Telling the story of the difference a personal outcomes approach makes involves bringing together qualitative and quantitative information, ideally combining numerical data with more qualitative insights from a sample of records. In Midlothian, a review of 30 case notes out of the 800+ people being supported was enough to develop a clear and consistent picture of what was going on for those supported. This was further supported by insights from practitioners and primary care teams, captured through reflective practice and action learning sessions. The lists below summarise how Midlothian brought together quantitative data with a series of questions asked of its qualitative data to develop a robust and meaningful evidence base for its work (a sketch of how the quantitative measures might be computed follows the lists).
Possible sources of quantitative evidence:
- Number of people receiving the support
- Key demographic information
- Number of conversations
- % of people for whom a personal outcome is identified
- Change in reported confidence between first and last appointment (using a 1-10 scale)
- Change in reported wellbeing (using the WEMWBS scale)
- % of people making progress towards personal outcomes

Questions to ask of qualitative data:
- Was the conversation perceived as good by the person and the practitioner?
- Did the person value the process? What did they like best? Could we do anything better?
- What issues and outcomes were identified?
- Did the person gain knowledge, confidence or skills?
- Has this made a difference to how they live their life? What changes have they made?
- What difference has this made to their life?
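To make the quantitative side of this concrete, the sketch below shows one way a team might compute these headline measures from a simple case-record extract, using Python and pandas. This is a minimal illustration under stated assumptions: the dataframe, column names and sample sizes are all hypothetical and are not drawn from the Midlothian recording system.

```python
# A minimal sketch of computing the quantitative measures listed above.
# All column names and values here are illustrative placeholders.
import pandas as pd

# Hypothetical extract: one row per person supported
records = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "age_band": ["18-34", "35-64", "65+", "35-64"],
    "n_conversations": [2, 3, 1, 4],
    "outcome_identified": [True, True, False, True],
    "confidence_first": [3, 5, 4, 2],   # self-rated, 1-10 scale
    "confidence_last": [6, 7, 4, 5],
    "wemwbs_first": [38, 45, 41, 33],   # WEMWBS wellbeing scores
    "wemwbs_last": [47, 50, 44, 40],
    "progress_towards_outcome": [True, True, False, True],
})

# Headline indicators corresponding to the quantitative list above
summary = {
    "people_supported": records["person_id"].nunique(),
    "total_conversations": int(records["n_conversations"].sum()),
    "pct_outcome_identified": 100 * records["outcome_identified"].mean(),
    "mean_confidence_change": (records["confidence_last"]
                               - records["confidence_first"]).mean(),
    "mean_wemwbs_change": (records["wemwbs_last"]
                           - records["wemwbs_first"]).mean(),
    "pct_making_progress": 100 * records["progress_towards_outcome"].mean(),
}
print(summary)

# Draw a small random sample of case notes for qualitative review,
# mirroring the 30-of-800+ approach described above
sample_for_review = records.sample(n=2, random_state=42)  # n=30 in practice
```

In practice the figures would come from the team's own recording system, and the qualitative questions above would be asked of the sampled case notes rather than computed.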
This process yielded important learning about the use of Patient Reported Outcome Measures (PROMs) in this context. PROMs, such as WEMWBS or IROC, are attractive because they are validated, well recognised and can be persuasive for funders and commissioners of services. However, these tools are only useful for evaluating services if practitioners use and record them consistently, and use of a PROM can fundamentally affect the nature of the conversation and the relationship the practitioner builds with the person. If projects are considering using PROMs as part of their recording, it is important that practitioners are closely involved in the decision and that use of the tool is piloted, to ensure that the data will be consistently recorded and that any distorting effects are carefully weighed against the benefits.
Embed a process of learning, reflection, discussion and adaptation
The process of analysis and learning is the most valuable part of evaluation. There is no point spending time creating outcome maps and developing new data collection processes if the data is not analysed and used for learning and improvement. It is therefore important to build analysis, reflection and learning into the plan for this work and, where possible, to inform these reflections with the data and information gathered. Organisational processes should be adapted to give everyone the time and space to engage in this analysis.