Self-Supervised AI Overcomes a Significant Limitation in Clinical AI Design


A team of biomedical informaticists and computer scientists from Harvard and Stanford has developed a new machine learning system that can detect diseases on chest X-rays without requiring human annotations to learn.

 

AI models must be trained on relevant imaging data before they can learn to detect disease in medical images. This annotation process, however, is expensive and demands a significant amount of clinicians' time. To label a chest X-ray dataset, for example, expert radiologists must examine hundreds of thousands of X-ray images and explicitly tag each one with the conditions detected.

 

The model, known as CheXzero, removes these time and cost hurdles for AI developers because it can effectively skip the image labeling process.

 

Instead, the new model is self-supervised: it learns to detect diseases on chest X-rays independently by relying on the accompanying clinical reports, without the need for hand-labeled data.

 

As Pranav Rajpurkar, PhD, assistant professor of biomedical informatics in the Blavatnik Institute at HMS, explains, “with CheXzero, one can simply feed the model a chest X-ray and corresponding radiology report, and it will learn that the image and the text in the report should be considered as similar—in other words, it learns to match chest X-rays with their accompanying report”.
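The matching described in the quote is the core of a contrastive image-text objective: embeddings of an X-ray and its own report are pulled together, while mismatched pairs are pushed apart. A minimal sketch of that idea follows; the function names, batch shapes, and temperature value here are illustrative assumptions, not taken from the CheXzero codebase.

```python
import numpy as np

def log_softmax(x, axis):
    # Numerically stable log-softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def contrastive_loss(image_emb, text_emb, temperature=0.07):
    """Symmetric contrastive loss over a batch of (X-ray, report) pairs.

    Row i of `image_emb` and row i of `text_emb` come from the same study,
    so the diagonal of the similarity matrix holds the true matches.
    """
    # L2-normalise so dot products become cosine similarities
    image_emb = image_emb / np.linalg.norm(image_emb, axis=1, keepdims=True)
    text_emb = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)

    # Entry (i, j): similarity of X-ray i to report j, scaled by temperature
    logits = image_emb @ text_emb.T / temperature

    n = logits.shape[0]
    diag = np.arange(n)
    # Cross-entropy with the matching report (or X-ray) as the target,
    # averaged over both directions: image-to-text and text-to-image
    loss_i2t = -log_softmax(logits, axis=1)[diag, diag].mean()
    loss_t2i = -log_softmax(logits, axis=0)[diag, diag].mean()
    return (loss_i2t + loss_t2i) / 2
```

Training on this loss alone needs no disease labels: the pairing of each scan with its written report is the supervisory signal, which is why the approach is described as self-supervised.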

 

Until recently, AI models have relied on large volumes of annotated data to achieve high performance. Now that the next generation of medical AI models can learn from text independently, developing them could become far less labor-intensive.

 

In a study published in Nature Biomedical Engineering, the researchers evaluated CheXzero against three other self-supervised AI tools, and it outperformed all of them.

 

The researchers are hopeful that this approach could be applied to imaging modalities beyond X-rays, such as CT scans, MRIs, and echocardiograms.

 

Source: Harvard Medical School


Image Credit: iStock


References:

 

Tiu E et al. (2022) Expert-level detection of pathologies from unannotated chest X-ray images via self-supervised learning. Nature Biomedical Engineering. 


Published on : Tue, 4 Oct 2022



