Co-production and lightbulb moments: A good day as an evaluator

‘Lightbulb moments’ that result from practitioners and researchers working together help to create shared implementation knowledge that is an essential part of understanding what makes an intervention ‘work’. Guest contributor Sarah Brand reflects on how realist research enables co-production and is an enjoyable process of arriving at new and meaningful implementation knowledge.

A key challenge, and delight, as an evaluator working closely with policy and practice in health and social care is finding ways to talk about what works that help to build, and put into practice, co-produced knowledge. A good day in research for me is one that involves lightbulb moments for practitioners and researchers as a result of working together. These moments make grappling with complexity worthwhile when developing and using knowledge.

For me, yesterday was one of those good days. I was catching up with Mel Meindl who heads up a team (Laura Mayhew Manistre, Chloe O’Donnell, and Lorna Stabler) working on one of a package of rapid realist reviews I lead as part of a larger programme of research (What Works Centre for Children’s Social Care: Evidence Store). Mel relayed a conversation she had with a group of practitioners. Mel was using her initial understanding, developed from literature identified in our scoping review, to co-produce knowledge with the practitioners about how an intervention works for different families and in different settings. In the midst of discussion, Mel noticed that her understanding jarred with the practitioners’ understanding, which came from their lived experience of the intervention in practice.

Like any good evaluator, Mel used this as an opportunity to explore. Here's the rub: from the literature about how the intervention works, it seems that practitioners working in a new way improves relationships between practitioners and families. In the discussion it became clear that the practitioners experienced no difference in the way they worked with families in the intervention, but they did find their relationships with families were more positive. Exploring why, both sides had a lightbulb moment: 'Ah!' Mel said (and I paraphrase), 'so you work in the same way with families whether they're in the intervention or not [nods all round], but the wider intervention structure means that you have more time with each family and in less formal settings, which allows a better relationship to develop?' 'Yes!' was the delighted and unanimous reply, 'that's exactly it. We didn't know how to explain it, but that sounds right.'

Making explicit how things work

It may seem obvious, but making explicit the key ways that an intervention works can be a powerful collaborative mechanism. It helps us to transfer learning so that the intervention can be implemented successfully by a new team in a new setting. In this case, not by putting resources into training practitioners to work in more relationship-focused ways, which we might have surmised from the literature alone, but rather into structural changes that enable practitioners to work in ways they know work better for families. This may mean making sure that implementation in the new setting includes removing local organisational constraints that prevent practitioners from having more time with families in less formal settings.

Realist co-production

For me, this story (poetic licence applied) emphasises co-production at its finest. It highlights what I feel realist review and evaluation can add to the partnership between research, practice, and policy in developing and using knowledge. Through repeated cycles of data collection, a developing theory about what works in the intervention, for whom, and under which circumstances guides the conversations that an evaluator chooses to have with different stakeholders (practitioners, managers, families, young people, service-users, policy-makers…). In this way, realist approaches bring together and test out a range of perspectives about the key ingredients that make an intervention work, and about what allows it to work in this ideal way (or not). Everyone an evaluator talks to is valued equally as an expert on at least part of the theory of how the intervention works, and all of their voices are included in that theory.

Bite-size evidence

Having worked with realist approaches for many years now, for me the beauty of the process through which a realist project embraces the complexity faced by evaluators and implementers is this: a programme theory emerges from the simple rules about how an intervention resource (e.g. training, or new ways of working with families) interacts with people's histories and circumstances (realist context-mechanism-outcome configurations; CMOs), and from the complex inter-relatedness between these CMOs. This programme theory is greater than the sum of its parts, and almost always surprising in some way (if it isn't surprising, I'm worried).

Realist approaches break down theories from a range of sources and perspectives about how an intervention works into bite-size and easily comprehensible pieces of theory (CMOs, sometimes expressed for ease as if-then statements, e.g. Pearson et al., 2015). Each explains some part of how the intervention works through its interactions with the internal worlds and circumstances of different people (social workers, families, children). These bite-size theories form a productive basis for discussion with different stakeholders, with research jargon held well at bay. In this way the partnership can hold up, scrutinise, and reflect upon these bite-size theories, and bring them together into a 'programme theory' containing the multiple voices and contexts important in how that intervention works and what aspects of a setting are key in allowing it to. It is this articulation of something beyond what any of the participants or researchers could have produced alone, this emergence of something new and meaningful, that I find continually rewarding.

Dr Sarah L. Brand (@SarahLBrand) works at CASCADE: Children’s Social Care Research and Development Centre, Cardiff University.

Sarah leads a work-package of realist projects within the wider programme of research CASCADE are undertaking as the research partner for the DfE funded What Works Centre for Children’s Social Care.

© UK Implementation Society, 2019

All views expressed are the author's own and not those of the UK Implementation Society.
