A recent study tested whether artificial intelligence could pass the rapid reporting element of the Fellowship of the Royal College of Radiologists (FRCR) examination, which radiologists in the UK must pass before completing their training.
The aim of the study was to test whether an AI candidate could pass the rapid reporting examination and outperform the human radiologists taking it. To pass, candidates must interpret 30 radiographs and report at least 90% of them correctly. The AI scored over 90% on only two of the ten mock examinations. However, the study noted that this was a challenging set of rapid reporting examinations, with human radiologists passing, on average, four of the ten.
Despite these results, the AI candidate achieved relatively high accuracy given the complexity of the cases, with an overall sensitivity of 83.6%, specificity of 75.2%, and accuracy of 79.5%.
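These three figures are the standard metrics derived from a confusion matrix of correct and incorrect calls. A minimal sketch of how they are computed (the counts below are illustrative placeholders, not the study's data):

```python
# Standard confusion-matrix metrics of the kind reported in the study.
# tp/fp/tn/fn = true/false positives and negatives; example counts only.
def metrics(tp: int, fp: int, tn: int, fn: int):
    sensitivity = tp / (tp + fn)            # true positive rate
    specificity = tn / (tn + fp)            # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = metrics(tp=80, fp=10, tn=30, fn=20)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  accuracy={acc:.1%}")
```

Sensitivity here measures how often an abnormality is caught when present, while specificity measures how often a normal radiograph is correctly cleared.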
The AI candidate was, however, correct in its diagnosis in 50% of the cases that radiologists failed, particularly on radiographs of the hands and feet. This was perhaps because these radiographs include many bones and joints, which human readers can find tedious to evaluate.
However, as the researchers added, “the artificial intelligence candidate would still need further training to achieve the same level of performance and skill of an average recently FRCR qualified radiologist, particularly in the identification of subtle musculoskeletal abnormalities”.
Overall, on the cases it interpreted correctly, the AI performed at a level close to that of a radiologist. However, further training is advised for the cases the AI candidate regarded as non-interpretable, including abdominal radiographs and those of the axial skeleton.
Image Credit: iStock