According to a new study, radiologist quality assurance (QA) conducted in line with tumour board workflow can enable efficient assessment of radiologist performance. The findings reveal that nonradiologist providers commonly reported discordant interpretations (30% of cases).

The study, published in the Journal of the American College of Radiology, states: "QA conducted in line with existing tumour boards allows assessment of radiologist performance while minimising biases intrinsic to unblinded peer-to-peer evaluations. The multidisciplinary nature of tumour boards probably improves the determination of what discordances are 'clinically significant' and bypasses issues intrinsic to intradepartmental politics."

Traditionally, radiologist QA has not involved members outside the radiology department. However, given the central role nonradiologist providers play in therapeutic decision making, and their potentially lesser interest in internal radiology politics, such providers may be better positioned to assess radiologist performance. Nonradiologist providers staffing tumour boards are well positioned to identify clinically significant discordances in radiologist reporting because they are actively using the imaging information to inform immediate management decisions.

There is a growing body of literature suggesting that radiologist discordance is underreported by random retrospective peer review and that more robust QA processes are needed. In the current study, researchers hypothesised that nonradiologists could be used to efficiently reduce bias in the assessment of radiologist performance. A hepatobiliary tumour board was chosen because of the known interrater variability for major findings of hepatocellular carcinoma and the important role that imaging plays in therapeutic decision making for patients with liver tumours.

Institutional review board approval was obtained for this HIPAA-compliant prospective QA effort. Consecutive patients with CT or MR imaging reviewed at one hepatobiliary tumour board between February 2016 and October 2016 (n = 265) were included. All presentations were assigned prospective anonymous QA scores by an experienced nonradiologist hepatobiliary provider based on contemporaneous comparison of the imaging interpretation at a tumour board and the original interpretation(s): concordant, minor discordance, major discordance. Major discordance was defined as a discrepancy that may affect clinical management. Minor discordance was defined as a discrepancy unlikely to affect clinical management. All discordances and predicted management changes were retrospectively confirmed by the liver tumour programme medical director. Logistic regression analyses were performed to determine what factors best predict discordant reporting.

Data analyses showed that about one-third (30% [79 of 265]) of reports were assigned a discordance, including 51 (19%) minor and 28 (11%) major discordances. The most common discordances related to mass size (41% [32 of 79]), tumour stage and extent (24% [19 of 79]), and assigned LI-RADS v2014 score (22% [17 of 79]). One radiologist had 11.8-fold greater odds of discordance (P = .002); the nine other radiologists had similar odds (P = .10-.99). Radiologists presenting their own studies had 4.5-fold lower odds of discordance (P = .006).
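The proportions above can be verified with a short calculation. The counts (51 minor and 28 major discordances out of 265 reports) come from the study itself; the 2x2 table used to illustrate the odds-ratio computation is hypothetical and only shows how a figure such as the reported 11.8-fold odds would be derived, not the study's actual data.

```python
# Discordance counts reported in the study
total = 265
minor = 51
major = 28
discordant = minor + major  # 79

print(f"Any discordance: {discordant / total:.0%}")  # 30%
print(f"Minor: {minor / total:.0%}")                 # 19%
print(f"Major: {major / total:.0%}")                 # 11%

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = discordant with factor,    b = concordant with factor,
    c = discordant without factor, d = concordant without factor."""
    return (a / b) / (c / d)

# Hypothetical counts (NOT from the study) illustrating how an
# 11.8-fold odds ratio for one reader would arise:
print(round(odds_ratio(10, 10, 20, 236), 1))  # 11.8
```

A logistic regression, as used in the study, generalises this single-factor odds ratio to several predictors at once (e.g. reader identity and whether the radiologist presented their own study).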

Commonly, QA processes assume that the second reader is the reference standard and the initial reader is the object under study. According to researchers, the majority of discordances observed in this study were deemed to be minor and to not affect patient management. 

"However, a substantial minority (8% [21 of 265]) were considered to affect patient management, and therefore deeply exploring who, or, more importantly what, was correct (the initial or final interpretation) is probably warranted," the researchers noted.   

