HealthManagement, Volume 15 - Issue 2, 2015

ECRI’s Top 10 Health Technology Hazards for 2015 (ECRI 2014) lists “dose creep” at no. 7: a pattern of radiation exposure levels (ie, dose) being increased by clinicians over time in an attempt to achieve better image quality in diagnostic radiography. Although dose creep is unlikely to result in immediate harm, it is an insidious problem that can have long-term consequences and that, over time, can affect many patients. Fortunately, tools are now becoming available to help healthcare facilities combat this hazard.


In many ways, dose creep is an unintended consequence of the transition from film to digital detectors in diagnostic radiography.


With any imaging technology that uses ionising radiation, exposures to higher doses are associated with greater risks to the patient (eg, an increased long-term risk of developing cancer). Thus, standard practice specifies that technologists use a dose that is “as low as reasonably achievable” (ALARA) to acquire the desired diagnostic information. In other words, the dose should be neither higher nor lower than is necessary to obtain a diagnostic quality image. 


In film-based radiography, exposing the patient to radiation levels that were too high or too low carried a built-in penalty: The resulting film would be unusable (either overexposed or underexposed). Thus, wide variations from the optimal exposure parameters would be noticed. 


Digital detectors, by comparison, are more forgiving. Because they have a much wider dynamic range than film, they can tolerate a significantly wider range of exposure parameters and still return a usable image. One advantage of this wider dynamic range is that it reduces the likelihood that an imaging exam will need to be repeated—which would expose the patient to additional radiation—if a higher- or lower-than-optimal exposure is used.


One downside, however, is that the wider dynamic range creates an environment in which radiographic technologists can adjust exposure parameters away from the recommended levels—sometimes making changes little by little over time—without there being an obvious indication of the change. That is, deviating from the recommended exposure would not typically be evident by looking at the resulting digital image.


In fact, with digital detectors, the quality of the image generally improves as the dose increases. Thus, there is a natural tendency to nudge the dose higher to get better-quality images. Repeated adjustments in this manner over time can lead to the use of exposure factors that vary substantially from the “usual” exposures for a given study, without users being aware that dose levels have crept upward. The consequence is that patients may routinely be exposed to unnecessarily high levels of ionising radiation during exams. While any increase in dose for a single exam is likely to have a negligible effect, the cumulative effect on patients subjected to multiple studies during the course of their treatment—particularly neonatal patients—can become significant.


With digital imaging, the only objective way to identify whether the optimal exposure factors are being used consistently (ie, for all studies or in all care areas) is to review the exposure indicators provided by the imaging system. Previously, the practice of comparing exposure indicators across imaging systems or care areas was complicated by the lack of a standardised approach: Each imaging system manufacturer defined its own numerical indication of the radiographic exposure to estimate the dose delivered to the detector. Now, however, manufacturers are increasingly adopting the standardised exposure index (EI), established by International Electrotechnical Commission (IEC) standard 62494-1. This means healthcare facilities can begin using the EI (on appropriately equipped systems) to track the exposure factors that are used and to identify trends that might indicate variation from the optimal values. 
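

Alongside the standardised EI, the IEC 62494-1 / AAPM TG-116 framework defines a target exposure index (EI_T) for each study type and a deviation index, DI = 10 × log10(EI / EI_T): a DI of 0 means the detector exposure matched the target, each +1 is roughly a 26% increase, and +3 is roughly double. The short sketch below simply illustrates that relationship; the function name and example values are illustrative only, not part of the standard.

    # Deviation index (DI) defined alongside the standardised EI
    # (IEC 62494-1 / AAPM TG-116): DI = 10 * log10(EI / EI_T).
    # 0 = on target; each +1 is roughly a 26% increase in detector
    # exposure; +3 is roughly double the intended exposure.
    import math

    def deviation_index(ei, ei_target):
        """Return the deviation index for one exposure."""
        return 10.0 * math.log10(ei / ei_target)

    # Example: an exposure index of 500 against a target of 400
    print(round(deviation_index(500, 400), 2))  # about +0.97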


Newer imaging systems are now beginning to incorporate EI capabilities. And for existing digital radiography systems, it may be possible to add this capability through a software upgrade. In addition, software tools are becoming available to facilitate the tracking of EI values. To make effective use of the EI, radiology managers, possibly in consultation with medical physicists, will need to define acceptable values for specific studies and patient types, track the variation, and find ways to efficiently identify poor practice.
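

As an illustration of how that tracking might work in practice, the sketch below flags a stream of exams when the median deviation index of the most recent exposures sits persistently above target. The window size and threshold are placeholder values that a facility and its medical physicists would need to set; they are not figures from ECRI or from the standard.

    from statistics import median

    # Illustrative dose-creep check: flag a study type when the median
    # deviation index (DI) of the most recent exams stays high.
    # WINDOW and DRIFT_THRESHOLD are placeholders, not recommended values.
    WINDOW = 30            # number of most recent exams to summarise
    DRIFT_THRESHOLD = 1.0  # median DI above this suggests systematic overexposure

    def flag_dose_creep(di_values, window=WINDOW, threshold=DRIFT_THRESHOLD):
        """Return True if the recent median DI suggests upward drift."""
        if len(di_values) < window:
            return False  # not enough exams to judge a trend
        return median(di_values[-window:]) > threshold

    # Example: exams drifting from on-target (DI near 0) to DI near +1.5
    history = [0.1, -0.2, 0.3] * 10 + [1.4, 1.6, 1.5] * 12
    print(flag_dose_creep(history))  # True once the recent window sits high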


 


Recommendations


• If your digital diagnostic radiography systems are not already equipped to use the standardised EI—as developed by the International Electrotechnical Commission (IEC 62494-1) and the American Association of Physicists in Medicine (AAPM TG-116) and as implemented by device manufacturers—investigate whether a software upgrade is available to add this capability. For new equipment purchases, incorporate EI capabilities into your request for proposal.


• After it has been incorporated into your imaging systems, use the EI to estimate the exposure on the detector and, from that, the patient dose.


• Take the steps necessary to display EI values to radiographic technologists as part of their routine workflow. This may require a software upgrade or configuration change.


• Install software tools that automatically import and analyse EI data (see the sketch following this list for one possible approach).


• Define responsibilities for tracking and analysing the EI data for the whole department.


• Work toward defining acceptable EI values and ranges for commonly performed radiography studies.
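

To give a concrete sense of the “import and analyse” and “acceptable ranges” recommendations above, the following sketch reads a CSV export of EI records and reports, for each study type, the median EI and the number of exams falling outside a locally defined acceptable range. The file name, column names and ranges are hypothetical placeholders; real targets must be set per detector and protocol by the facility and its medical physicists.

    import csv
    from collections import defaultdict
    from statistics import median

    # Hypothetical acceptable EI ranges per study type; placeholders only.
    ACCEPTABLE_EI = {
        "chest_pa": (250, 500),
        "abdomen_ap": (300, 600),
    }

    def summarise_ei(csv_path):
        """Print median EI and out-of-range count for each study type."""
        by_study = defaultdict(list)
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):  # expects 'study' and 'ei' columns
                by_study[row["study"]].append(float(row["ei"]))
        for study, values in sorted(by_study.items()):
            med = median(values)
            if study not in ACCEPTABLE_EI:
                print(f"{study}: median EI {med:.0f} (no range defined, n={len(values)})")
                continue
            low, high = ACCEPTABLE_EI[study]
            outside = sum(1 for v in values if not (low <= v <= high))
            print(f"{study}: median EI {med:.0f}, "
                  f"{outside}/{len(values)} exams outside {low}-{high}")

    # Usage, assuming a hypothetical export from the dose-tracking software:
    # summarise_ei("ei_export.csv")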


Source: ECRI (2014) Top 10 health technology hazards. Health Devices, November. Available from: https://www.ecri.org/Resources/Whitepapers_and_reports/Top_Ten_Technology_Hazards_2015.pdf