Chest X-rays are the routine 'go-to' test used as the first step in medical protocols to help diagnose issues affecting the lungs, heart, bones and soft tissues. Chest X-rays account for roughly 40 percent of all diagnostic medical imaging performed globally. The sheer volume of these exams creates significant backlogs at healthcare facilities and long wait times for patients, who in some cases may wait as long as 30 days for a report. Researchers have created an artificial intelligence system that reduced the time for critical findings to receive an expert opinion to less than 3 days, compared with the roughly 11-day average currently experienced in practice, according to a new study published in the journal Radiology.
The researchers from WMG at the University of Warwick developed an artificial intelligence (AI) system using deep-learning algorithms that can successfully identify and prioritise chest X-rays containing abnormal critical findings. The system could dramatically cut the time it takes for abnormal chest X-rays with critical findings to receive an expert radiologist opinion, reducing the average wait from 11 days to less than 3 days. This could potentially eliminate the backlog of exams in many facilities and help deliver urgent care faster to patients who need it, according to the study published in the journal Radiology.
The research team, led by Professor Giovanni Montana, Chair in Data Science in WMG at the University of Warwick, found that normal chest radiographs were detected with a positive predictive value of 73% and a negative predictive value of 99%, and at a speed that meant abnormal radiographs with critical findings could be prioritised for an expert radiologist opinion much sooner than under usual practice.
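To make the two statistics concrete, the snippet below shows how positive and negative predictive values are computed from a confusion matrix. The counts are invented purely for illustration (chosen so the values land near the reported 73% and 99%) and are not the study's actual numbers.

```python
def predictive_values(tp, fp, tn, fn):
    """Return (PPV, NPV) for a binary classifier.

    Here the "positive" class is a *normal* radiograph, matching the
    study's framing: PPV is the fraction of predicted-normal exams
    that are truly normal; NPV is the fraction of predicted-abnormal
    exams that are truly abnormal.
    """
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical counts for illustration only:
ppv, npv = predictive_values(tp=730, fp=270, tn=9900, fn=100)
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 73%, NPV = 99%
```

The asymmetry matters clinically: a 99% NPV means an exam the system flags as abnormal very rarely turns out to be a missed normal, which is what makes it safe to push flagged exams to the front of the queue.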
"Artificial intelligence led reporting of imaging could be a valuable tool to improve department workflow and workforce efficiency. The increasing clinical demands on radiology departments worldwide has challenged current service delivery models, particularly in publicly-funded healthcare systems.
"It is no longer feasible for many radiology departments, at their current staffing levels, to report all acquired plain radiographs in a timely manner, leading to large backlogs of unreported studies. In the United Kingdom, it is estimated that at any time there are over 300,000 radiographs waiting over 30 days for reporting.
"The results of this research show that alternative models of care, such as computer vision algorithms, could be used to greatly reduce delays in identifying and acting on abnormal X-rays, particularly for chest radiographs, which account for 40% of all diagnostic imaging performed worldwide. The application of these technologies also extends to many other imaging modalities, including MRI and CT," said Professor Giovanni Montana in a statement.
The AI system successfully differentiated abnormal from normal chest X-rays with high accuracy. In the simulations performed, critical findings received an expert radiologist opinion in an average of 2.7 days using the AI system, significantly faster than the 11.2-day average under current practice.
"Currently there are no systematic and automated ways to triage chest X-rays and bring those with critical and urgent findings to the top of the reporting pile," said in a statement study co-author Dr. Giovanni Montana, formerly of King's College London and currently at the University of Warwick in Coventry, England.
The team, led by study co-author Dr. Montana from WMG at the University of Warwick and working with Guy's and St Thomas' NHS Hospitals, used a dataset of 470,388 adult chest X-rays to develop an AI system capable of identifying radiological abnormalities in the X-rays in real time and classifying how quickly these exams should be reported on by a radiologist. The images were stripped of all identifying and sensitive information before use to protect patients' privacy.
The researchers built the AI system by developing and validating a Natural Language Processing (NLP) algorithm, an important element of the AI system that extracts labels from written text, with the ability to read a radiological report, understand the findings mentioned by the reporting radiologist, and automatically categorise the priority level of the exam. For each X-ray, the system required a list of labels indicating which specific abnormalities were visible on the image. The researchers then applied this algorithm to the historical exams to generate a large volume of training examples, teaching the AI system which visual patterns in X-rays were indicative of each urgency level.
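The report-to-label step can be sketched as follows. This is a deliberately naive, rule-based stand-in for the study's NLP algorithm (which, as described below, parses sentence structure rather than matching keywords); the finding names, priority mapping, and negation handling here are hypothetical, for illustration only.

```python
import re

# Hypothetical finding -> priority mapping, invented for illustration.
FINDING_PRIORITY = {
    "pneumothorax": "critical",
    "consolidation": "urgent",
    "cardiomegaly": "non-urgent",
}

def label_report(report: str) -> tuple[list[str], str]:
    """Extract finding labels from a report and assign an overall priority."""
    text = report.lower()
    # Keep findings that appear and are not negated (very naive negation check).
    findings = [
        f for f in FINDING_PRIORITY
        if re.search(rf"\b{f}\b", text)
        and not re.search(rf"\bno (?:\w+ )?{f}\b", text)
    ]
    # Overall priority is the most urgent level among the findings.
    for level in ("critical", "urgent", "non-urgent"):
        if any(FINDING_PRIORITY[f] == level for f in findings):
            return findings, level
    return findings, "normal"

print(label_report("Large right-sided pneumothorax. No consolidation."))
# (['pneumothorax'], 'critical')
```

A real system needs far more than this: negation and uncertainty in radiology reports are notoriously varied ("no evidence of", "cannot exclude"), which is why the study's NLP infers sentence structure instead of matching strings.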
"The NLP goes well beyond pattern matching," said Dr. Montana. "It uses AI techniques to infer the structure of each written sentence; for instance, it identifies the presence of clinical findings and body locations and their relationships. The development of the NLP system for labelling chest X-rays at scale was a critical milestone in our study."
The NLP analysed the radiologic reports to prioritise images as critical, urgent, non-urgent or normal. A computer vision system using deep learning was then trained on the labelled X-ray images to determine clinical priority from the images' appearance alone. The researchers then tested the system's prioritisation performance in a simulation using an independent set of 15,887 images.
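The effect that the simulation measures can be illustrated with a toy queue model. This is not the study's methodology: it simply compares first-come-first-served reporting against priority-ordered reporting, using queue position as a crude proxy for wait time. The class mix and all numbers are invented.

```python
import random

PRIORITY_RANK = {"critical": 0, "urgent": 1, "non-urgent": 2, "normal": 3}

def mean_critical_wait(exams, triage: bool) -> float:
    """Average queue position (a proxy for wait time) of critical exams."""
    if triage:
        # Report by priority, breaking ties by arrival order.
        order = sorted(range(len(exams)),
                       key=lambda i: (PRIORITY_RANK[exams[i]], i))
    else:
        # First-come-first-served: report in arrival order.
        order = list(range(len(exams)))
    position = {exam: pos for pos, exam in enumerate(order)}
    waits = [position[i] for i, p in enumerate(exams) if p == "critical"]
    return sum(waits) / len(waits)

random.seed(0)
exams = random.choices(
    ["critical", "urgent", "non-urgent", "normal"],
    weights=[1, 2, 3, 14], k=1000,
)
print(f"FCFS wait:   {mean_critical_wait(exams, triage=False):.0f}")
print(f"Triage wait: {mean_critical_wait(exams, triage=True):.0f}")
```

With triage, critical exams wait only behind other critical exams, so their average position collapses from roughly the middle of the whole queue to the front of it; this is the mechanism behind the 11.2-day to 2.7-day reduction the study reports.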
"The initial results reported here are exciting as they demonstrate that an AI system can be successfully trained using a very large database of routinely acquired radiologic data," Dr. Montana said. "With further clinical validation, this technology is expected to reduce a radiologist's workload by a significant amount by detecting all the normal exams so more time can be spent on those requiring more attention."
The researchers report that they plan to expand their research to incorporate a much larger sample size and to deploy more complex algorithms to further enhance the system's performance. Their future research goals include a multi-centre study to prospectively assess the performance of the triaging software.
"A major milestone for this research will consist in the automated generation of sentences describing the radiologic abnormalities seen in the images," Dr. Montana said in a statement published. "This seems an achievable objective given the current AI technology."
The research team noted that the historical radiographs in the dataset they used were formally reported by one of 276 different reporters, including board-certified radiologists, trainee radiologists and accredited reporting radiographers. The reports and images used in the study were anonymised prior to modelling and therefore did not reveal any referral information or patient-identifying data.