Editorial

Mind the Gap – Ideas for Making Clinical Research More Relevant for Practitioners and Patients

Max Berg*1, Lea Schemer2, Lukas Kirchner1, Saskia Scholten2

Clinical Psychology in Europe, 2024, Vol. 6(1), Article e12419, https://doi.org/10.32872/cpe.12419

Received: 2023-07-17. Accepted: 2024-03-06. Published (VoR): 2024-03-28.

Handling Editor: Cornelia Weise, Philipps-University of Marburg, Marburg, Germany

*Corresponding author at: University of Marburg, Gutenbergstraße 18, 35032 Marburg, Germany. E-mail: max.berg@uni-marburg.de

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Randomized controlled trials (RCTs) are widely considered to be the gold standard for demonstrating efficacy in psychotherapy research. However, the clinical utility of “typical” RCTs for establishing routine care therapies has been a topic of long-standing debate in our field (Persons & Silberschatz, 1998). “Typical” refers to a study with a small to moderate sample size that targets a disorder according to a standardized diagnostic manual and is often waiting-list controlled (Carey & Stiles, 2016). Practitioners frequently criticize the external validity of such RCTs (Gyani et al., 2015; Safran et al., 2011). In qualitative investigations, therapists describe the “unrepresentativeness of RCTs” as a reason why they do not regard clinical research as an important foundation for their everyday decision making (Gyani et al., 2015). A review suggested that the perceived “inflexibility” of manuals may also contribute to many practitioners’ lack of interest, and that therapists question whether the “standardized instructions” provided in manuals are useful for their heterogeneous clinical cases (Speers et al., 2022).

Similarly, from a methodological point of view, inferences from group-level research to intra-individual variability have been challenged (Fisher et al., 2018). Furthermore, the substantial heterogeneity in treatment effects suggests that even when patients with the same diagnosis receive the same treatment from the same therapist, they respond differently (Herzog & Kaiser, 2022). Given these methodological challenges and the skepticism of therapists, we argue that the criticism of clinical science should be taken seriously. In this editorial, we present five ideas for improving psychotherapy research and addressing the research-practice gap.

Five Ideas for Psychotherapy Research

Idea One: Focus on Transdiagnostic Mechanisms

Diagnoses in clinical psychology typically do not represent homogeneous entities, and comorbidity rates between “different” disorders are commonly high (Rief et al., 2023). For example, different patients exhibit largely heterogeneous symptom dynamics in depression, and novel clinical research is starting to acknowledge this (Fried et al., 2023). We also know from a large body of research that pathological mechanisms are not limited to a single disorder but oftentimes pose transdiagnostic problems (Dalgleish et al., 2020). Transdiagnostic mechanisms include (but are not limited to) dysfunctional expectations with aberrant belief updating (Kirchner et al., 2022), social impairments (Lehmann et al., 2019), and reward insensitivity and its interplay with stress dysregulation (Martin-Soelch, 2023). Given the potential of transdiagnostic mechanisms, it seems worthwhile to allocate treatment based on them rather than solely on diagnoses. For example, patients who exhibit a high tendency for repetitive negative thinking could be assigned to focused therapies that target this mechanism, regardless of whether they have been diagnosed with depression, generalized anxiety disorder, or both.

Idea Two: Dismantle Treatment Protocols

A plethora of therapeutic techniques exist to target (transdiagnostic) mechanisms (Schaeuffele et al., 2021). Yet, we know little about their isolated effects because treatment manuals, which oftentimes contain overlapping strategies, are evaluated as a treatment package. Such “evidence-based black boxes” are effective for treating numerous mental disorders, but their respective effect sizes and response rates remain moderate (Ormel et al., 2022). Future research should dismantle treatment protocols and evaluate the effect of specific techniques. Applying dismantled techniques instead of full treatment protocols might be closer to clinical practice anyway, where the implementation of complex procedures is limited by time, comorbidity patterns, and financial resources. The dismantling of treatment packages may also necessitate a departure from traditional therapy orientations. Competence-oriented frameworks (Rief, 2021) and process-based therapy (Moskow et al., 2023) are two approaches that could promote a more “toolbox-oriented” way of thinking.

Idea Three: Monitor Individual Trajectories With Sufficient Resolution

In clinical research and practice, diagnostic instruments are usually administered at only a few time points (e.g., before and after treatment). To date, few projects collect intensive longitudinal data (i.e., session-by-session data or ecological momentary assessments) in clinical trials and routine care settings (Lutz et al., 2022). Such data would make it possible to monitor individual trajectories, compare patients to similar cases, and provide computerized treatment suggestions, while reducing therapist biases in outcome estimation (Lutz et al., 2022). m-Path and Shiny apps are digital implementations of such efforts (Mestdagh et al., 2023). However, the mere availability of appropriate tools does not mean they are used frequently. Barriers, particularly regarding usability and knowledge of digital technologies, can make it difficult for clinicians to adopt digital innovations. It is therefore vital that the curricula of psychology students expand data literacy and computer science training from purely scientific towards practice-oriented use.
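To illustrate how such monitoring data could be used, the following Python sketch compares one patient’s session-by-session symptom scores with the average trajectory of similar archived cases. It is a minimal sketch under assumptions introduced purely for the example (Euclidean similarity on intake variables, a one-standard-deviation corridor, synthetic data); it is not the m-Path platform or the feedback systems described by Lutz et al. (2022).

```python
# Minimal sketch: compare one patient's session-by-session symptom scores
# with the average trajectory of the k most similar archived cases.
# All names, the similarity criterion (intake scores), and the data are
# illustrative assumptions, not the cited feedback systems.
import numpy as np

def expected_trajectory(new_intake, archive_intake, archive_courses, k=5):
    """Mean symptom course of the k archived cases whose intake profiles
    are closest (Euclidean distance) to the new patient's profile."""
    distances = np.linalg.norm(archive_intake - new_intake, axis=1)
    nearest = np.argsort(distances)[:k]
    return archive_courses[nearest].mean(axis=0)

# Illustrative data: 200 archived patients, 3 intake variables, 12 sessions
rng = np.random.default_rng(0)
archive_intake = rng.normal(size=(200, 3))
archive_courses = 20 + np.cumsum(rng.normal(-0.3, 1.0, size=(200, 12)), axis=1)

new_intake = rng.normal(size=3)
reference = expected_trajectory(new_intake, archive_intake, archive_courses)

# Flag sessions where the observed course lies clearly above the reference
# corridor (here: more than one archive standard deviation above the mean).
observed = 20 + np.cumsum(rng.normal(-0.1, 1.0, size=12))
corridor = archive_courses.std(axis=0)
off_track = observed > reference + corridor
print("Sessions flagged as off track:", np.where(off_track)[0] + 1)
```

In practice, such reference trajectories would be built from large routine-care archives and would inform, rather than replace, clinical judgment.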

Idea Four: Use Causal Inference Methods for Routine Care Data

Large psychopathology data sets exist in routine care, but we need to sample and process them in a way that allows for causal inference. It has been suggested that alternatives to RCTs can be developed for estimating the causal effect of a given treatment on an outcome. In addition to established approaches such as propensity score matching (Lee & Little, 2017), single-case experimental designs can serve as an idiographic alternative to RCTs. These designs employ an experimental manipulation that compares a patient's individual response at different time points (e.g., during treatment delivery versus waiting periods). Single-case experimental designs have the potential to empower practitioners to become scientist-practitioners of their own clinical practice (Kazdin, 2019).
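As a concrete illustration of the propensity-score idea, the following Python sketch models the probability of receiving treatment from baseline covariates and then matches each treated case to the untreated case with the most similar score. The variable names and the synthetic data are assumptions made for the example; this is a sketch of the general technique, not the procedure detailed by Lee and Little (2017).

```python
# Minimal propensity-score matching sketch (illustrative; not the
# procedure detailed by Lee & Little, 2017). Assumes a routine-care
# data set with baseline covariates X, a binary treatment indicator,
# and a continuous outcome such as a post-treatment symptom score.
import numpy as np
from sklearn.linear_model import LogisticRegression

def matched_effect(X, treated, outcome):
    """Average treatment effect on the treated, pairing each treated case
    with the untreated case closest in estimated propensity score."""
    # Step 1: model the probability of receiving treatment from covariates
    scores = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    treated_idx = np.where(treated == 1)[0]
    control_idx = np.where(treated == 0)[0]
    # Step 2: nearest-neighbour matching on the propensity score
    diffs = [outcome[i]
             - outcome[control_idx[np.argmin(np.abs(scores[control_idx] - scores[i]))]]
             for i in treated_idx]
    return float(np.mean(diffs))

# Illustrative synthetic data: lower outcome scores indicate improvement
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
treated = (rng.random(500) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
outcome = X @ np.array([0.5, 0.2, 0.0, 0.1]) - 2.0 * treated + rng.normal(size=500)
print("Estimated treatment effect:", round(matched_effect(X, treated, outcome), 2))
```

The matching step mimics randomization after the fact: within matched pairs, treated and untreated cases had a comparable probability of receiving the treatment, so outcome differences are more plausibly attributable to the treatment itself.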

“Synthetic waitlists” can also bring causal inference into psychotherapy research (Kaiser et al., 2023). Here, machine learning algorithms select patients from waiting lists based on their multidimensional similarity to a given patient under treatment. This, in turn, makes it possible to estimate the probability that a specific patient would have reached a certain outcome without receiving therapy. If this probability is low, a significant part of the improvement can be attributed to the treatment. Utilizing synthetic waitlists allows us to harvest routine data as an additional source of information and to estimate the effect of therapeutic strategies under realistic, everyday conditions.
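The underlying logic can be sketched in a few lines of Python: match a treated patient to the most similar archived waiting-list cases and take the share of those cases that improved without therapy as an estimate of the counterfactual probability. The matching rule, the variable names, and the synthetic data below are illustrative assumptions, not the algorithm of Kaiser et al. (2023).

```python
# Rough sketch of the synthetic-waitlist logic (illustrative only; not the
# algorithm of Kaiser et al., 2023). A treated patient's baseline features
# are matched to archived waiting-list cases, and the matched cases'
# outcomes approximate the counterfactual course without therapy.
import numpy as np

def improvement_probability_untreated(patient_baseline, waitlist_baseline,
                                      waitlist_improved, k=10):
    """Share of the k most similar waiting-list cases that improved without
    treatment (the improvement criterion is assumed to be defined upstream)."""
    distances = np.linalg.norm(waitlist_baseline - patient_baseline, axis=1)
    nearest = np.argsort(distances)[:k]
    return float(waitlist_improved[nearest].mean())

# Illustrative data: 300 archived waiting-list cases, 5 baseline features
rng = np.random.default_rng(2)
waitlist_baseline = rng.normal(size=(300, 5))
waitlist_improved = (rng.random(300) < 0.2).astype(float)  # ~20% remit untreated

patient_baseline = rng.normal(size=5)
p_untreated = improvement_probability_untreated(
    patient_baseline, waitlist_baseline, waitlist_improved)

# If the patient improved under therapy while similar untreated cases rarely
# did, the improvement is more plausibly attributable to the treatment.
print(f"Estimated probability of improvement without treatment: {p_untreated:.2f}")
```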

Idea Five: Utilize the Expertise of Practitioners and the Lived Experience of Patients

Participatory science actively engages various stakeholders throughout the entire research process (Slattery et al., 2020). For example, patients should be involved as experts in the development of clinically relevant research questions (Birnie et al., 2019), the optimization of treatment manuals (Schemer et al., 2023), and the planning of upcoming research projects (Slattery et al., 2020). Similarly, practitioners can be involved to facilitate the clinical usefulness of technological advances or to find ways to overcome practical barriers to implementing an effective therapeutic strategy. Here, we face the challenge of involving practitioners who are distant from, or even skeptical about, clinical research. Yet, a participatory approach would help to address the research-practice gap by involving groups with a vital self-interest in practically relevant and effective clinical science.

Conclusion

In conclusion, addressing the research-practice gap requires a shift towards dismantling treatment packages and evaluating the effects of specific therapeutic techniques on better-operationalized transdiagnostic mechanisms. Monitoring individual trajectories and using innovative methods of causal inference can provide valuable insights into therapy effectiveness, at the individual level where needed. Finally, the active involvement of non-scientists can create research that is interesting and engaging for different stakeholders.

Funding

The post-doc position of Max Berg was funded by the PsyChange initiative (funding number: 56040018) of the Hessian Ministry of Science and Arts when this editorial was written. This work was also supported by the DYNAMIC center, funded by the LOEWE program of the Hessian Ministry of Science and Arts (grant number: LOEWE1/16/519/03/09.001(0009)/98).

Acknowledgments

The PsyChange initiative of the Hessian Ministry of Science and Arts organized think tanks, symposia, and expert interviews with the goal of conducting meta-science to improve psychological treatments. This work was endorsed and made possible by the initiative, and Max Berg wants to thank all PsyChange members for their ongoing support.

Competing Interests

All authors declare no competing interests, financial or otherwise. The ideas expressed in this editorial are based solely on the authors’ shared personal opinions.

Preprint Disclosure

A preprint can be found at the PsyArXiv server: https://doi.org/10.31234/osf.io/2qvhy

References

  • Birnie, K. A., Dib, K., Ouellette, C., Dib, M. A., Nelson, K., Pahtayken, D., Baerg, K., Chorney, J., Forgeron, P., Lamontagne, C., Noel, M., Poulin, P., & Stinson, J. (2019). Partnering for pain: A priority setting partnership to identify patient-oriented research priorities for pediatric chronic pain in Canada. Canadian Medical Association Journal Open, 7(4), E654-E664. https://doi.org/10.9778/cmajo.20190060

  • Carey, T. A., & Stiles, W. B. (2016). Some problems with randomized controlled trials and some viable alternatives. Clinical Psychology & Psychotherapy, 23(1), 87-95. https://doi.org/10.1002/cpp.1942

  • Dalgleish, T., Black, M., Johnston, D., & Bevan, A. (2020). Transdiagnostic approaches to mental health problems: Current status and future directions. Journal of Consulting and Clinical Psychology, 88(3), 179-195. https://doi.org/10.1037/ccp0000482

  • Fisher, A. J., Medaglia, J. D., & Jeronimus, B. F. (2018). Lack of group-to-individual generalizability is a threat to human subjects research. Proceedings of the National Academy of Sciences of the United States of America, 115(27), E6106-E6115. https://doi.org/10.1073/pnas.1711978115

  • Fried, E. I., Proppert, R. K. K., & Rieble, C. L. (2023). Building an early warning system for depression: Rationale, objectives, and methods of the WARN-D study. Clinical Psychology in Europe, 5(3), Article e10075. https://doi.org/10.32872/cpe.10075

  • Gyani, A., Shafran, R., Rose, S., & Lee, M. J. (2015). A qualitative investigation of therapists’ attitudes towards research: Horses for courses? Behavioural and Cognitive Psychotherapy, 43(4), 436-448. https://doi.org/10.1017/S1352465813001069

  • Herzog, P., & Kaiser, T. (2022). Is it worth it to personalize the treatment of PTSD? – A variance-ratio meta-analysis and estimation of treatment effect heterogeneity in RCTs of PTSD. Journal of Anxiety Disorders, 91, Article 102611. https://doi.org/10.1016/j.janxdis.2022.102611

  • Kaiser, T., Brakemeier, E. L., & Herzog, P. (2023). What if we wait? Using synthetic waiting lists to estimate treatment effects in routine outcome data. Psychotherapy Research, 33(8), 1043-1057. https://doi.org/10.1080/10503307.2023.2182241

  • Kazdin, A. E. (2019). Single-case experimental designs: Evaluating interventions in research and clinical practice. Behaviour Research and Therapy, 117, 3-17. https://doi.org/10.1016/j.brat.2018.11.015

  • Kirchner, L., Eckert, A., Berg, M., Endres, D., Straube, B., & Rief, W. (2022). Better safe than sorry? An active inference approach to biased social inference in depression. PsyArXiv Preprints. https://doi.org/10.31234/osf.io/bp9re

  • Lee, J., & Little, T. D. (2017). A practical guide to propensity score analysis for applied clinical research. Behaviour Research and Therapy, 98, 76-90. https://doi.org/10.1016/j.brat.2017.01.005

  • Lehmann, K., Maliske, L., Böckler, A., & Kanske, P. (2019). Social impairments in mental disorders: Recent developments in studying the mechanisms of interactive behavior. Clinical Psychology in Europe, 1(2), Article e33143. https://doi.org/10.32872/cpe.v1i2.33143

  • Lutz, W., Schwartz, B., & Delgadillo, J. (2022). Measurement-based and data-informed psychological therapy. Annual Review of Clinical Psychology, 18(1), 71-98. https://doi.org/10.1146/annurev-clinpsy-071720-014821

  • Martin-Soelch, C. (2023). The (neuro)-science behind resilience: A focus on stress and reward. Clinical Psychology in Europe, 5(1), Article e11567. https://doi.org/10.32872/cpe.11567

  • Mestdagh, M., Verdonck, S., Piot, M., Niemeijer, K., Kilani, G., Tuerlinckx, F., Kuppens, P., & Dejonckheere, E. (2023). m-Path: An easy-to-use and highly tailorable platform for ecological momentary assessment and intervention in behavioral research and clinical practice [Technology and code]. Frontiers in Digital Health, 5, Article 1182175. https://doi.org/10.3389/fdgth.2023.1182175

  • Moskow, D. M., Ong, C. W., Hayes, S. C., & Hofmann, S. G. (2023). Process-based therapy: A personalized approach to treatment. Journal of Experimental Psychopathology, 14(1), Article 20438087231152848. https://doi.org/10.1177/20438087231152848

  • Ormel, J., Hollon, S. D., Kessler, R. C., Cuijpers, P., & Monroe, S. M. (2022). More treatment but no less depression: The treatment-prevalence paradox. Clinical Psychology Review, 91, Article 102111. https://doi.org/10.1016/j.cpr.2021.102111

  • Persons, J. B., & Silberschatz, G. (1998). Are results of randomized controlled trials useful to psychotherapists? Journal of Consulting and Clinical Psychology, 66(1), 126-135. https://doi.org/10.1037/0022-006X.66.1.126

  • Rief, W. (2021). Moving from tradition-based to competence-based psychotherapy. BMJ Mental Health, 24(3), 115-120. https://doi.org/10.1136/ebmental-2020-300219

  • Rief, W., Hofmann, S. G., Berg, M., Forbes, M. K., Pizzagalli, D. A., Zimmermann, J., Fried, E., & Reed, G. M. (2023). Do we need a novel framework for classifying psychopathology? A discussion paper. Clinical Psychology in Europe, 5(4), Article e11699. https://doi.org/10.32872/cpe.11699

  • Safran, J. D., Abreu, I., Ogilvie, J., & DeMaria, A. (2011). Does psychotherapy research influence the clinical practice of researcher–clinicians? Clinical Psychology: Science and Practice, 18(4), 357-371. https://doi.org/10.1111/j.1468-2850.2011.01267.x

  • Schaeuffele, C., Schulz, A., Knaevelsrud, C., Renneberg, B., & Boettcher, J. (2021). CBT at the crossroads: The rise of transdiagnostic treatments. International Journal of Cognitive Therapy, 14(1), 86-113. https://doi.org/10.1007/s41811-020-00095-2

  • Schemer, L., Hess, C. W., Van Orden, A. R., Birnie, K. A., Harrison, L. E., Glombiewski, J. A., & Simons, L. E. (2023). Enhancing exposure treatment for youths with chronic pain: Co-design and qualitative approach. Journal of Participatory Medicine, 15, Article e41292. https://doi.org/10.2196/41292

  • Slattery, P., Saeri, A. K., & Bragge, P. (2020). Research co-design in health: A rapid overview of reviews. Health Research Policy and Systems, 18(1), Article 17. https://doi.org/10.1186/s12961-020-0528-9

  • Speers, A. J. H., Bhullar, N., Cosh, S., & Wootton, B. M. (2022). Correlates of therapist drift in psychological practice: A systematic review of therapist characteristics. Clinical Psychology Review, 93, Article 102132. https://doi.org/10.1016/j.cpr.2022.102132