AI to enhance, not replace, radiologists' jobs
Rapid advances in artificial intelligence, particularly deep learning, are a major factor behind improvements in medical image analysis. Some deep learning models can identify pathologies in radiological images, such as bone fractures and potentially cancerous lesions, in some cases more reliably than an average radiologist.
AI's increasingly important role in radiology has led to speculation that it might one day replace human radiologists. Some students entering medical school are reportedly having second thoughts about specialising in radiology because they fear the job will cease to exist. While there are substantial benefits, in terms of efficiency and productivity, to be gained from integrating AI with radiological practice, this does not mean the role of the radiologist will become redundant, according to an article in Harvard Business Review.
One of the authors, Keith J. Dreyer, is a radiologist and artificial intelligence researcher; the other, Thomas H. Davenport, is an IT professor who has researched the impact of AI on jobs for several years. They say that the great majority of radiologists will continue to have jobs in the decades to come — jobs that will be altered and enhanced by AI. "As one blog post put it, the only radiologists whose jobs may be threatened are the ones who refuse to work with AI," the authors write.
The authors highlight several reasons why radiologists won’t be disappearing from the labour force, despite the increasing use of AI in radiology:
• Radiologists do more than read and interpret images. Like other current AI systems, radiology AI performs single, narrow tasks: deep learning models are trained for specific image recognition problems (such as nodule detection on chest CT or haemorrhage on brain MRI). "But thousands of such narrow detection tasks are necessary to fully identify all potential findings in medical images, and only a few of these can be done by AI today," the authors say. Furthermore, image interpretation is only one set of tasks that radiologists perform. They also consult with other physicians on diagnoses and treatment, treat diseases (for example, providing local ablative therapies), perform image-guided medical interventions (interventional radiology), discuss procedures and results with patients, and carry out many other activities.
• Clinical processes for employing AI-based image work are a long way from being ready for daily use. Different imaging technology vendors and deep learning algorithms focus on different aspects of the use cases they address, according to the authors. Even among deep learning-based nodule detectors approved by the FDA, the foci differ: the probability of a lesion, the probability of cancer, a nodule's features, or its location. "These distinct foci would make it very difficult to embed deep learning systems into current clinical practice," the authors note.
• Deep learning algorithms for image recognition must be trained on "labelled data." In radiology, this means images from patients who have received a definitive diagnosis of cancer, a broken bone, or another pathology. In other types of image recognition where deep learning has achieved high levels of success, it has been trained on millions of labelled images, such as cat photos on the internet. "But there is no aggregated repository of radiology images, labelled or otherwise. They are owned by vendors, hospitals and physicians, imaging facilities, and patients, and collecting and labelling them to accumulate a critical mass for AI training will be challenging and time-consuming," the authors explain.
• Changes will be required in medical regulation and health insurance for automated image analysis to take off. Who’s responsible, for example, if a machine misdiagnoses a cancer case — the physician, the hospital, the imaging technology vendor, or the data scientist who created the algorithm? And will health care payers reimburse for an AI diagnosis as a single set of eyes, or as a second set in combination with a human radiologist? "All these issues need to be worked out, and it’s unlikely that progress will happen as fast as deep learning research in the lab does," the authors say.
Source: Harvard Business Review
Image Credit: Pixabay
Published on : Mon, 2 Apr 2018