About a bridge and a gap

Two weeks ago, I found myself at the Melbourne medical school in the midst of an estimated 40 GPs. Don’t worry, my visit to Australia has not made me want to change my career dramatically. It just so happened that the medical school hosted a short course on implementation science. Curious as I was about the Normalization Process Theory (NPT) that would be discussed, I was also interested to see how GPs go about implementation processes. As soon as the first PowerPoint slide showed a picture of a gap and a bridge, I knew that, at least in terms of the visual presentation of the concept, we were on the same page.

Special guest Carl May gave the introduction to the NPT model and the toolkit that comes with it. NPT focuses on the work that individuals or groups do to enable an intervention to become so embedded in routine practice that it is normalized (Murray et al., 2010). With this strong focus on the user of the intervention, NPT distinguishes itself from theories and models that consider other determinants to be important for implementation. The model you choose to work with probably has a lot to do with your own conception of implementation, and as the field of implementation grows, more and more models are being introduced to choose from. So many, in fact, that it is hardly possible to know or even have heard of each one of them, let alone to know how effective they are.

It is therefore not surprising that implementation networks are popping up everywhere. There is a need to centralize knowledge on implementation. I am a big fan of networks that combine and share this kind of knowledge. As a matter of fact, I am one of the initiators of the Netherlands Implementation Collaborative and take part in meetings of the European Implementation Collaborative. One of the main difficulties I encounter in these networks, however, is the reticence to name and share specific models of implementation. The reason given is that these networks don’t want to put forward a model, so as not to create the impression that it is the one and only model that should be used. Fair enough, one might think. I am, however, a little worried that we are letting a thousand flowers bloom. What I would like is to establish a database of effective implementation models and instruments, just as the Netherlands Youth Institute has for youth interventions.

We want interventions to be evidence-based, so why do we not demand the same of the implementation models and frameworks we use to implement those interventions? What about evidence-based implementation?

[Image: Sydney Harbour Bridge and Opera House]

Comparing apples with pears?

One of the reasons I am in Australia is to meet with researchers and experts to exchange ideas and knowledge. My first meeting was with Bianca Albers, who moved from Denmark to Melbourne last November, now works at the Parenting Research Centre, and is pursuing a PhD at the University of Melbourne. Over a bowl of pumpkin curry in one of Melbourne’s lovely restaurants, I asked her about her first impressions of the implementation of evidence-based youth care interventions in Australia. As I soon found out, trying to define how ‘Australia’ works in this regard is, unsurprisingly, impossible. Expecting Australia as a whole country to have one way of doing things is like expecting the whole of Europe to act as one. With my interest in treatment integrity, this realisation triggered the question of how one should consider differences between and within countries with respect to the measurement of treatment integrity.

[Image: Australia v Europe]

The next week I was asked to present at a seminar on the topic of Creating effective support systems for establishing and maintaining treatment integrity, at the University of Melbourne. However, at the seminar I found myself in the midst of experts from four different countries, all of whom had an interest in cross-country comparison of the measurement of treatment integrity. We soon found ourselves talking about measurements instead of support systems. Specifically, we discussed the threshold scores of treatment integrity levels, or to put it otherwise, the score at which therapists are considered to deliver a treatment with integrity and positive outcomes can be expected. Developers of a treatment integrity instrument will have defined the threshold score, and some interventions use these scores for the (re)certification of therapists. With many interventions used in different countries, the question arises: are the threshold scores the same in different countries, and should they be?

The threshold score for treatment integrity of Multisystemic Therapy (MST), a youth care intervention spread across many countries, is set at .61, following American norm scores. Therapists in the Netherlands do not always manage to reach this threshold score. There are many ways of trying to explain this. Let’s just say that one of them is now considered to be that the Dutch who rate the treatment integrity of therapists on the MST instrument (the TAM-R) tend to do so differently from the Americans. This consideration implies that there is a country/culture difference in how people score on a Likert scale ranging from ‘not at all’ to ‘very much’, a format that is also commonly used in other treatment integrity instruments.

Without wanting to dismiss the Dutch as passive, I do think they are not very likely to score at the extremes of a Likert scale. We have the saying ‘just act normal, that is weird enough’ for a reason. The Americans, in that respect, might just be more likely to score at the extremes. They either totally love or totally hate things, which is my totally unscientific impression. I am therefore very interested in the outcomes of the study on country/cultural differences in the scoring of the TAM-R. If it turns out that there are significant differences, then in my opinion the set threshold score should not be used in the Netherlands. Not to forget that we might also want to reconsider the international comparisons of treatment integrity scores that have been made up to that point, as they could be flawed.
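
To make that worry a bit more concrete, here is a minimal sketch in Python with entirely made-up numbers: the adherence distribution and the size of the rater shift are my own assumptions, and only the .61 threshold comes from MST. It simply shows how a modest, systematic difference in how raters use the scale can change the share of therapists who clear a fixed threshold, even when the underlying practice is identical.

    # Illustrative only: hypothetical numbers, not real TAM-R data.
    import random

    random.seed(1)
    THRESHOLD = 0.61  # MST adherence threshold based on American norms

    # Hypothetical "true" adherence scores for 1,000 therapists on a 0-1 scale
    true_scores = [min(1.0, max(0.0, random.gauss(0.68, 0.08))) for _ in range(1000)]

    # Assumption: raters in another country score, on average, 0.05 lower
    RATER_SHIFT = 0.05
    rated_scores = [max(0.0, s - RATER_SHIFT) for s in true_scores]

    share_without_shift = sum(s >= THRESHOLD for s in true_scores) / len(true_scores)
    share_with_shift = sum(s >= THRESHOLD for s in rated_scores) / len(rated_scores)

    print(f"Share above .61 without a rater shift: {share_without_shift:.0%}")
    print(f"Share above .61 with a 0.05 downward rater shift: {share_with_shift:.0%}")

In this toy example the very same therapists look considerably less adherent once the rater shift is applied, which is exactly why a norm score established in one country should not be transplanted to another without checking for such differences.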

The question being: is comparing treatment integrity scores across countries like comparing apples with pears on the grounds that they are both fruit?

[Image: Fruit]

Arriving Down Under

Thirty hours on a plane: I could have read some of those articles I’ve been wanting to read, rewritten the introduction of the article I’m working on, or at least analysed some implementation models. But no, all I did was sleep and eat food out of plastic bowls. I realize I am blessed with this ability to cope on planes. So, with no jet lag, I am now down under in Australia and will be here for the next two months.

I am here to meet experts in the implementation field, deliver presentations about my research as a PhD student in the sustainable implementation of youth care interventions, attend a course and a conference, and continue working on some of my articles. Why do all this 16,500 kilometres from home? Methods of and experiences with implementation differ across European countries. Through networks such as the European Implementation Collaborative, I have the opportunity to exchange ideas with experts from these countries on the topic of implementation. By visiting Australia, I have the unique opportunity to take a look behind the scenes outside Europe, gain knowledge and share the knowledge that we have available.

So what will I have in store to share with you about my upcoming time in Melbourne? First of all, I will meet with Bianca Albers. Bianca is a Principal Implementation Specialist at the Parenting Research Centre in Melbourne, a Development Consultant at the Family & Evidence Center in Copenhagen, and is pursuing a PhD in the Department of Social Work at the University of Melbourne with a focus on leadership in the implementation of evidence-based practice and programs. I can’t wait to hear about her thoughts on and experience with implementation.

Bianca has arranged for me to present at the University of Melbourne together with Prof. Aron Shlonsky, Dr. Robyn Mildon and Bianca herself. The seminar will be about Creating effective support systems for establishing and maintaining treatment integrity and will be held on the 13th of August. I am looking forward to the panel discussion and excited to see what topics and questions will come up.

Last but not least, I will attend the short course Implementation Science: Translating evidence into practice, held at the University of Melbourne. In the course we will discuss the normalisation process theory and evaluation methods like cluster randomized trials and the stepped wedge design.

Enough interesting meetings to share on this blog. Of course, if you have questions you would like me to ask the experts here, or topics to put forward, don’t hesitate to let me know!

Are all swans white…until you see a black one? Welcome to my Australia blog!

[Image: Swans]