April Journal Club Feedback
The new UK-IS Journal Club held its first meeting on 27th April. Hosted by UK-IS Chair Deborah Ghate, the group reviewed 'Lessons from complex interventions to improve health' by Penelope Hawe (2015), Annual Review of Public Health, 36:307-23. The group who attended – a mix of researchers, practitioners and post-graduate students – had a lively and enjoyable discussion, and this blog contains a summary of the take-home messages from the meeting.
The paper's abstract captures its scope: "Complexity—resulting from interactions among many component parts—is a property of both the intervention and the context (or system) into which it is placed. Complexity increases the unpredictability of effects. Complexity invites new approaches to logic modeling, definitions of integrity and means of standardization, and evaluation. New metaphors and terminology are needed to capture the recognition that knowledge generation comes from the hands of practitioners/implementers as much as it comes from those usually playing the role of intervention researcher. Failure to acknowledge this may blind us to the very mechanisms we seek to understand. Researchers in clinical settings are documenting health improvement gains made as a consequence of complex systems thinking. Improvement science in clinical settings has much to offer researchers in population health."
There is so much in this paper that it’s hard to do it justice in this short recap of our discussion – it’s a paper jam-packed with quotable nuggets and “powerful ideas” (to use a term from the paper itself) on a wide range of implementation issues.
The consensus in our mixed discussion group of researchers, practitioners and clinicians in health and social care was that this thought-provoking and stimulating paper should be required reading for all implementation students and professionals. A hugely important paper, it reminds us to remain open, humble and inclusive as well as rigorous in all that we do.
In this beautifully and authoritatively written paper, distinguished Australian author Penelope Hawe strips back the layers to range widely over the challenges of working with complexity commonly encountered in the implementation and improvement field. Along the way, she tackles many of the sacred cows of the narrowly scientific view of evaluation, dissemination and knowledge transfer, noting that the influential terminologies and metaphors that shaped early approaches to closing the so-called science-to-practice gap have often been misleadingly linear and over-simplified. In trying to tame complexity – which Hawe locates in the system as much as in the intervention itself – we may have blinded ourselves to the subtlety of the actual mechanisms we are trying to understand and describe.
Here are some of the take-home messages collated from our very lively discussion:
We appreciated the discussion in the paper contrasting top-down and bottom-up approaches to developing effective and implementable interventions. Hawe stresses the vital importance of a co-creative approach that blends the perspectives of practitioners and clinicians with those of researchers: we always need both, at all stages of the process, if we want to ensure we've captured the implementation realities of intervention in the 'real world'.
Hawe talks in detail about the dangers of privileging intervention form over intervention function when trying to replicate or scale interventions in complex settings. The paper therefore has useful things to say about the constructs of fidelity and contextually sensitive adaptation viewed through the lens of complexity. In the context of evidence-based programs and their spread, it led the group to talk about the importance of finding the 'sweet spot' between fidelity and adaptation as we delicately negotiate that balance between service users, program developers, practitioners, funders, and consultants.
Relatedly, there are some striking examples given in the paper of what happens when those doing the implementation on the ground are empowered to fully unpack not just what they're doing but why. These examples resonated within the group. In particular, practitioner insights are the key to uncovering hidden implementation intentions and underlying core principles of working that in turn help us better identify the 'core components' of effective interventions and services. A less top-down approach empowers staff and buttresses the long-term sustainability of positive change.
However, asking for significant input from practitioners into implementation design and evaluation processes makes a demand on time and energy that can be hard to accommodate on top of everything else: organisational-level support and commitment are required, together with thoughtful oversight to avoid drift away from function.
We also noted that the job of a researcher is often to 'make sense' of complexity, and to distil out clear options and conclusions from the brew of complex systemic processes. Not all triallists, for example, would take kindly to being accused of being blind to complexity. And funders play a role here, often demanding reports and pithy conclusions that partial out complexity in order to clarify policy or practice directions – sometimes at the price of superimposing unwarranted certainty on real ambiguity.
We all (but especially academic authors) need to consider the dangers that can lurk in the terminologies and metaphors we are tempted to reach for when we try to liven up our descriptions of implementation and improvement processes. In particular, we noted the dangers of the narrowly mechanistic ‘pipeline’ metaphor; labelling our own approaches as ‘real’ as if everything else is ‘not real’; the inevitable whiff of condescension that accompanies the language of ‘translation’ and ‘transfer’; the dangers inherent in the linearity implied by terms like ‘logic’ or ‘pathway’, which obscure the actual mess and meandering that characterises the complex world… and so on. Above all, we must not become dogmatic and entrenched in a particular view of ‘how things work’ but must remain constantly open to other perspectives and new insights.
And finally, on the subject of language in particular – to the author's thoughts on terminology we added a brand-new word (for me at least) that seems so aptly to describe much of what we do as implementation professionals that I can't believe I hadn't come across it before! Courtesy of a young canine member who joined the group briefly (the joys of working from home, people), we learned that his name, Coddiwomple, is an English slang word meaning "to travel in a purposeful manner towards a destination not yet known". As one member of the group commented, perhaps we may sometimes need to think about implementation (or implementability) before we think about outcomes. This is something that it hasn't been fashionable to say for many years (think of the constant discourse of 'outcomes-driven'), but it might save us from a lot of false starts, and also help us live with the inevitability of some missteps, wrong turns and dead-ends along the way. Food for thought.
With thanks to Tracey Finch, Annette Boaz, Nick Sevdalis, Tom Jefford, Carrie-Ann Black, Alex Ziemann, Carina Hibberd, and hosted/summarised by Deborah Ghate.
© UK Implementation Society, 2021
All views expressed are the author's own and not those of the UK Implementation Society.