Roundup for December 2019
Supporting the Focus on Dementia Portfolio
Introduction
Welcome to the latest roundup of key learning from our research and evaluation work for quality improvement and implementation in health and social care. In this issue, we focus on the Evidence and Evaluation for Improvement Team’s (EEvIT) work supporting one of the ihub’s portfolios, Focus on Dementia.
Embedding evaluation in design
The Focus on Dementia team routinely involves EEvIT in planning its improvement work, which means we can often contribute at an early stage. A current example is a collaborative seeking to improve outcomes for people with dementia in hospital, whether or not their admission is directly related to dementia.
Being involved early helped EEvIT appreciate the many aspects of this improvement project that the team needs to manage alongside our central interest, evaluation. This has enabled us, together with the team, to tailor our input towards the design and delivery of an evaluation that is appropriate, practical and meaningful to stakeholders.
In the early stages of the project, the team said they valued EEvIT’s input, in particular in helping them understand the process of logic modelling. Our own reflection was that the questions we asked in these sessions, geared to ensuring the team will be able to answer "how did it work, or not work?" type evaluation questions, also helped them gain clarity on the detail of the programme’s design logic. Through this, together we were able to identify aspects that were likely to be key to success, or about which we were uncertain. These aspects are of particular interest to an evaluation seeking to answer questions about success factors and lessons for spread, and to formative evaluation, which allows uncertain aspects to be monitored more closely. Early discussion of evaluation also offered an opportunity to plan for the practicalities of collecting data in a way that minimises additional effort for participants: for instance, where monthly project management calls are already taking place, building in some time for participants to reflect on how the project is going overall.
Evidence briefings
EEvIT evidence briefings are also helping to inform key areas of development. Read more about two of them: one is informing the development of guidance for group work in post-diagnostic services, and the other directly supported practitioners in testing a different approach.
Evidence and complexity: understanding the benefits of coordinated care for dementia
The Focus on Dementia team is also running a collaborative on dementia care coordination, and EEvIT is supporting the team with both evidence and evaluation design. The overarching aim of the project is to take a whole-system approach to achieving coordinated care for a person living with dementia, both at any point in time and over time, from diagnosis through to end of life care.
The Focus on Dementia team is working with one health and social care partnership (Inverclyde) to trial a whole-system approach to achieving this, with the ultimate intention of spreading good practice and learning to other partnerships. The whole system in this case is large, involving care and community services and groups in all sectors, as well as people with dementia. In the Focus on Dementia team’s spirit of co-production, the early stages of this project have seen stakeholders and participants explore issues of considerable breadth and complexity, and this has illustrated the need for a wide-ranging evidence base to support the work going forward.
The design and delivery of care services should be informed by evidence as far as possible, but where there is complexity, peer-reviewed evidence will answer some questions and is unlikely to provide all the answers needed. In that case, decision-makers will also need to draw on local understanding and experience from service providers, care professionals and service users, and consider this alongside peer-reviewed evidence and the reported and evaluated practical experience of others. While these sources can provide a rationale for testing a change, ultimately it is testing in situ that provides the evidence for improved local practice. This is where evaluation can add considerable value in understanding whether, and how, an intervention worked locally to improve care and outcomes.