Health involves complex interactions requiring complexity science techniques, say Ana Fernandez and colleagues from the University of Sydney and the University of Newcastle, Australia. They argue that randomised controlled trials (RCTs) can be useful in the discovery phase but are inadequate for implementation. Implementing change in healthcare needs to take into account not only the evidence, but also clinical expert knowledge, the context and, not least, patients’ values.
In their review of the development of EBM, they note the irony that the scientific community adopted EBM on the basis of the authoritative knowledge of experts such as David Sackett, not on evidence. They outline the growing promotion of EBM in the medical journals JAMA and the BMJ, while noting The Lancet’s critical stance on EBM.
They suggest that EBM has benefited from the Matthew effect, which refers to excessive cross-citation among its proponents to raise the credibility of a viewpoint. This led to an ‘invisible college’, which they define as “a group of scientists or professionals who may live in separate locations, but attend the same conferences, publish in the same journals, and invite each other to give keynote lectures to share the same ideas.” The Cochrane Collaboration has morphed into a ‘visible college’, they suggest.
ICU Management spoke to Prof. Luis Salvador-Carulla (pictured), professor of Disability and Mental Health at the Faculty of Health Sciences and Head of the Mental Health Policy Unit at the Brain and Mind Centre (University of Sydney, Australia) about the team’s paper. When asked if the pendulum has swung too far in the direction of doctrinaire EBM, Prof. Salvador-Carulla noted that this is the main point. He explained: “In a metaphoric way, the traditional hierarchical pyramid of scientific knowledge, with RCTs at the top, commonly used for grading evidence in EBM, should be replaced by a ‘Greek temple’ where the columns represent the different sources of scientific knowledge that provide different and meaningful contributions to the overall knowledge base.”
The paper arose from the work of the discussion group “Dialogues on Complexity and Health Systems” (DOCAHS) at the University of Sydney, which was set up in 2013 to address key issues in the analysis of healthcare within the complex adaptive systems (CAS) framework. Prof. Salvador-Carulla explained that as discussions progressed, it became clear that the EBM model could hardly be considered the only, or even the main, source of scientific knowledge for understanding, guiding and modelling health systems. Previous outputs included a conceptual paper on frames in healthcare and one on the framing of scientific knowledge, which laid the groundwork for analysing the caveats of EBM as a “paradigm” of scientific knowledge in healthcare.
The paper’s authors note that in recent years even hard-core EBM researchers have argued for a rebirth of the movement, in particular in relation to shared decision making with patients and expert judgement. High-quality evidence does not always imply strong recommendations, the authors note; indeed, strong recommendations can arise from low-quality evidence. Assessment of the quality of evidence therefore needs to be kept separate from the strength of recommendations.
The writers declare that “it is important to consider that scientific knowledge is divided in three major areas: discovery, corroboration and implementation, and that the information of one domain cannot be directly applied to the others.” Prof. Salvador-Carulla noted that the developers of EBM assumed that improving the knowledge-base of discovery/corroboration by using systematic reviews and meta-analysis would be automatically translated into implementation.
- Discovery and corroboration - the key criterion is internal validity. Instrument - explanatory RCTs, which aspire to remove variability.
- Implementation - the key criterion is the degree of external validity of the results: their applicability to the local context and the acceptability of the intervention(s) to the patient. Instrument - pragmatic controlled trials, including randomised controlled trials, which embrace variability as the norm.
The authors suggest that the emphasis on internal validity has “contributed to the failure of EBM, as recommendations - being based on experimental designs where variables and confounders are controlled (RCTs) - often fail to be translatable into practice because the research context does not reflect real-world clinical practice/reality.” Of course, RCTs are not set up to evaluate real-world outcomes. Pragmatic controlled trials, which are conducted under usual conditions, offer practitioners considerable freedom in deciding how to apply the intervention being tested.
They conclude: “Most likely, EBM grew too fat to effectively incorporate its original propositions: evidence, expert knowledge and patients’ preferences.” Evidence is context sensitive, and both global and local evidence need to be combined in developing usable recommendations for clinical decision making. They recommend that local evidence be combined with expert knowledge. Local evidence includes modifying factors in specific settings, the magnitude of needs, patient values, costs and resource availability. Expert knowledge differs from expert opinion: it is the implicit knowledge that professionals have that helps them to better understand local conditions.
Information coming from explanatory RCTs has to be complemented and contrasted with information from pragmatic RCTs evaluating effectiveness in routine practice. This implies some loss of ‘internal validity’ and an increase in the uncertainty of the results, but ‘gains in representativeness’.
Prof. Salvador-Carulla told ICU Management: “The massive influence of the EBM paradigm in the development of clinical guidelines and in fixing the standards of good practice has to be revised. The point is not to replace EBM, but to contextualise it, and to understand that other sources of evidence (eg, observational data, cohort studies, large longitudinal surveys, administrative big data) play a very relevant role in implementation. In addition, other sources of scientific knowledge apart from evidence, such as expert knowledge, play a fundamental role in innovation, implementation and practical use of the medical knowledge base. Indeed, EBM was about implementing research into practice, so external validity should be prioritised. Consequently, we need to discuss whether RCTs are the best designs if the important principle is external validity.”
The paper in Health Research Policy and Systems is one in a series being developed by “Dialogues on Complexity and Health Systems” (DOCAHS) at the University of Sydney on issues related to complexity and health systems.
Managing Editor, ICU Management