Automatically sending clinical data algorithmically generated from the preliminary radiology report has the potential to assist radiologists’ interpretation, improve communication with referring physicians and benefit patient care, according to Anna Ellermeier, MD, radiology resident at the Alpert Medical School of Brown University. Ellermeier presented a report on testing of automated delivery of clinical and laboratory follow-up data generated from the preliminary report at this week’s Radiological Society of North America (RSNA) Annual Scientific Meeting in Chicago.

The RadPath system uses an algorithm to automatically email interpreting radiologists with follow-up data on pathology, cytology, endoscopy and operative notes. The system includes a speech recognition macro, built around three unique characters, as part of the procedure-related voice recognition templates. The macro can also be invoked at any time by a radiologist to request follow-up on a non-procedure-related report (i.e. an intentional request). The institutional interface engine detects the special characters in the radiology report and extracts that information. The system stores a copy of the radiology report and is programmed to “listen” for data crossing the interface. It compares the radiology report’s medical record number (MRN) to those in other data interfaces: pathology and cytology (CoPath) and surgery and endoscopy (SoftMed). If a match is detected, it retains a copy of the follow-up report and sends a Health Insurance Portability and Accountability Act (HIPAA)-compliant email to the radiologist who initiated the search.
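As a minimal Python sketch of the matching logic described above, the code below stores a pending request when the special characters are detected, listens for follow-up messages crossing the interface, and emails the requesting radiologist on an MRN match. The data structures, field names (mrn, source, report_text) and the notify_radiologist helper are illustrative assumptions, not RadPath’s actual implementation.

    from dataclasses import dataclass

    @dataclass
    class PendingRequest:
        """A follow-up request created when the interface engine detects the macro's special characters."""
        mrn: str
        radiologist_email: str
        radiology_report: str

    # Stored copies of radiology reports awaiting follow-up, keyed by MRN.
    pending_requests: dict[str, list[PendingRequest]] = {}

    def register_request(req: PendingRequest) -> None:
        """Keep a copy of the radiology report so later follow-up data can be matched."""
        pending_requests.setdefault(req.mrn, []).append(req)

    def notify_radiologist(req: PendingRequest, source: str, followup_report: str) -> None:
        """Stand-in for the HIPAA-compliant email sent via the institutional mail system."""
        print(f"To: {req.radiologist_email}")
        print(f"Subject: Follow-up ({source}) for MRN {req.mrn}")
        print(followup_report)

    def on_followup_message(mrn: str, source: str, report_text: str) -> None:
        """Called for each pathology/cytology (CoPath) or surgery/endoscopy (SoftMed)
        message crossing the interface; compares MRNs and emails on a match."""
        for req in pending_requests.get(mrn, []):
            notify_radiologist(req, source, report_text)

    # Example: a dictated macro triggers a request, and a later pathology
    # report with the same MRN generates an email to the radiologist.
    register_request(PendingRequest("12345", "radiologist@example.edu",
                                    "CT abdomen: findings consistent with acute appendicitis."))
    on_followup_message("12345", "CoPath", "Surgical pathology: acute appendicitis confirmed.")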

The information is sent only via the secure institutional email system. Since the system was initiated in April 2012, nearly 14,000 emails have been sent.

The study evaluated the type and utility of follow-up data sent in RadPath emails. Radiology residents manually assessed one month of emails to determine whether the email content was relevant to the original dictation: did the follow-up email answer the question posed by the request? If it did, it was considered concordant; if not, discordant. For example, an email containing an operating room (OR) report for removal of an inflamed appendix would be considered concordant with a radiology report of acute appendicitis, while a cytology report from a thyroid biopsy would be deemed discordant. An email might contain more than one follow-up report and was considered concordant if any report answered the posed question.

Of 268 unique radiology reports that generated emails, the concordance rate was 92% (246/268). Concordance was higher, however, for automatic requests (96%) than for intentional requests. A total of 418 messages were sent, with a concordance rate of 90% (378/418). Concordance varied by category of content, from 94% for pathology and 91% for cytology to 75% for endoscopy. Limitations include irrelevant follow-up emails and occasional duplicate emails.
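The concordance rates above are simply the concordant counts divided by the totals; the short Python snippet below, using only the counts quoted in the study, reproduces the reported 92% and 90% figures.

    # Recompute the reported concordance rates from the counts quoted above.
    counts = {
        "unique reports": (246, 268),   # concordant / total
        "total messages": (378, 418),
    }
    for label, (concordant, total) in counts.items():
        print(f"{label}: {concordant}/{total} = {concordant / total:.0%}")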

Ellermeier concluded that this kind of automated delivery of high-yield clinical follow-up information could be extended easily. In radiology, for example, an MRI report might include follow-up of an indeterminate lesion by CT. It also has applications in other disciplines: a cardiologist performing echocardiography for valve disease could receive OR findings, for example. The approach also offers more efficient communication with referring physicians and, ultimately, better patient care.

Claire Pillar
Managing Editor, HealthManagement


