Radiology residency programs traditionally assess residents using tools focused on the Medical Expert CanMEDS role, such as Objective Structured Clinical Examinations (OSCEs), mock oral exams, and multiple-choice in-service examinations. These assessments typically rely on a single static image, or a limited number of them, presented in isolation from the clinical setting. In a study published in Academic Radiology, researchers designed an Emergency Radiology Simulator aimed at rating the span of competencies required across Medical Expert and NonMedical Expert roles.
The researchers assessed resident performance in a simulated Emergency Radiology environment, including the ability to prioritise cases, follow protocol, identify pathological findings by reviewing complete cases, create preliminary reports, and complete a hand-over document at the end of the exam. The simulated environment mirrors the activity typically performed during a resident's after-hours on-call shift. Performance on the simulation was examined to determine whether residents demonstrated the expected abilities in the Medical Expert role and in NonMedical Expert roles such as Communicator, Health Advocate, and Leader.
The results of traditional evaluations often do not represent a resident's competence, or lack thereof, in clinical practice. As training transitions to competency-based residency education, it is increasingly important that training programs have the means to adequately and accurately assess a resident's clinical competence in a way that reflects real-world performance. Artificial representations of real-world processes, more commonly known as simulations, can help reach this objective.
Materials and Methods
An online simulator with typical emergency cases was administered in October 2015 to Post Graduate Year (PGY) 2–5 residents in Radiology. Residents provided preliminary reports, which were graded for style and content. The simulation also included prioritisation, protocoling, counselling, and handover exercises geared to assess NonMedical Expert roles.
Forty-eight residents participated in the simulation: 11 PGY-2, 17 PGY-3, 13 PGY-4, and 7 PGY-5. There was a significant difference in performance between PGY-2 residents and more senior residents in the Medical Expert role (findings, diagnosis, recommendations, and clinical relevance of reports). No differences in performance between PGY levels were seen in the NonMedical Expert roles (prioritisation, protocoling, counselling, and handover).
The research team's findings suggest that simulation provides an opportunity to assess radiology resident performance across multiple domains. PGY-2 residents performed worse in the Medical Expert domains, although performance did not significantly vary between the other years. This may suggest that competence in Emergency Radiology is achieved early in residency, possibly reflecting the emphasis placed on developing on-call skills during the PGY-2 year. The simulator could be extended to other areas of Radiology to assess its ability to discriminate performance in other subspecialties.