CheXzero: Self-learning AI that detects pathologies from unannotated X-rays

Researchers have developed CheXzero, a self-learning AI model that can learn to detect abnormalities from unlabeled chest x-ray images. Read on to learn more.

The use of AI in medical imaging interpretation has grown rapidly in recent years, as AI enables faster interpretation and helps reduce radiologist burnout. However, training an AI model to assist radiologists is a daunting task, because the training datasets must be manually annotated with labels. This further increases the burden on radiologists and clinicians.

To address this issue, researchers have developed a self-learning AI model called CheXzero that learns from unannotated chest x-rays and builds upon them. With results on par with radiologist interpretation, experts believe it to be a significant advancement in clinical AI design. 

Let’s learn more about the CheXzero model.

What is CheXzero? 

CheXzero is a self-learning AI model developed by researchers at Harvard Medical School and Stanford University that detects abnormalities in chest x-rays. The AI model learns from natural language descriptions of chest x-rays without the need for labelled data. 

A recent study published by researchers in Nature Biomedical Engineering showed the CheXzero model to be as effective as human radiologists at spotting abnormalities on chest x-rays.

CheXzero development, training and testing

CheXzero: a self-learning AI model for chest x-ray interpretation
Image source: https://doi.org/10.1038/s41551-022-00936-9

The CheXzero model was trained on publicly available datasets comprising more than 227,000 clinical notes and 377,000 chest x-rays. It was then tested against two distinct datasets of chest x-rays and associated notes gathered from two different institutions. The test datasets were deliberately drawn from institutions in different countries, in part to guard against bias.

Model performance remained comparable even when the clinical notes used different wording. Across this diversity of reports, CheXzero spotted abnormalities on chest x-rays as accurately as human radiologists.

“We’re living in the early days of the next-generation medical AI models that can perform flexible tasks by directly learning from text. Up until now, most AI models have relied on manual annotation of vast amounts of data—to the tune of 100,000 images—to achieve high performance. Our method needs no such disease-specific annotations.”

– Pranav Rajpurkar, study lead investigator and assistant professor of biomedical informatics at the Blavatnik Institute at HMS

The researchers have made the CheXzero model publicly available, with the aim of enabling self-learning AI models that interpret other types of imaging and detect abnormalities in them. Open-sourcing the model allows other researchers and startups to build on it, which could ultimately lead to better health outcomes.

How does CheXzero learn?

CheXzero is self-supervised, meaning it can learn independently from unannotated training data. It takes as input a chest x-ray together with the English-language radiology report associated with it. The model then discovers on its own how to match each chest x-ray with its corresponding report, figuring out which concepts in the unstructured text correspond to visual patterns in the image.
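Conceptually, this matching step resembles CLIP-style contrastive learning: an image encoder and a text encoder are trained so that each x-ray's embedding lines up with the embedding of its own report rather than with any other report in the batch. The sketch below illustrates that objective with toy NumPy embeddings; the function name, dimensions, and temperature value are illustrative, not taken from the CheXzero codebase.

```python
import numpy as np

def clip_style_loss(image_embs, text_embs, temperature=0.07):
    # Normalise both sets of embeddings onto the unit sphere.
    img = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    # Pairwise similarities; matching x-ray/report pairs sit on the diagonal.
    logits = img @ txt.T / temperature
    # Cross-entropy pushing each image toward its own report (rows)
    # and each report toward its own image (columns), then averaged.
    log_p_rows = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    log_p_cols = logits - np.log(np.exp(logits).sum(axis=0, keepdims=True))
    return float(-(np.diag(log_p_rows).mean() + np.diag(log_p_cols).mean()) / 2)

# Toy paired embeddings: each "x-ray" is a noisy copy of its "report".
rng = np.random.default_rng(0)
reports = rng.normal(size=(4, 64))
xrays = reports + 0.05 * rng.normal(size=(4, 64))

matched = clip_style_loss(xrays, reports)          # pairs aligned on the diagonal: low loss
scrambled = clip_style_loss(xrays, reports[::-1])  # pairs deliberately mismatched: higher loss
print(matched, scrambled)
```

Minimising this loss is what lets the model learn which visual patterns go with which phrases, without anyone ever labelling a single disease.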

The need for self-learning AI

Given the upsurge in health issues, shortages of radiologists and the extensive workloads on those who remain, the need for AI assistance in imaging interpretation has grown exponentially. But to train conventional AI models, doctors, clinicians, and radiologists must examine hundreds of thousands of x-ray images one at a time, individually annotating each with the conditions diagnosed.

While modern AI models have attempted to ease this labelling problem by learning from unlabeled data during a ‘pre-training’ stage, they still need fine-tuning on labelled data to attain high performance. That annotation burden frequently contributes to doctor burnout, increasing the need for self-learning AI models like CheXzero.
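CheXzero's way around label-hungry fine-tuning is zero-shot prediction: because images and text share one embedding space, a pathology can be scored by comparing an x-ray's embedding against a positive prompt and a negative prompt, with no disease-specific labels involved. The toy sketch below illustrates the idea with random stand-in embeddings; the prompt wording, function, and dimensions are hypothetical, not the model's actual API.

```python
import numpy as np

def zero_shot_probability(image_emb, pos_text_emb, neg_text_emb):
    # Cosine similarity of the image against each prompt, then a
    # two-way softmax: the result reads as P(pathology present).
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    sims = np.array([cos(image_emb, pos_text_emb), cos(image_emb, neg_text_emb)])
    weights = np.exp(sims)
    return float(weights[0] / weights.sum())

# Random stand-ins for the text encoder's prompt embeddings.
rng = np.random.default_rng(0)
pos_prompt = rng.normal(size=128)   # e.g. "pleural effusion"
neg_prompt = rng.normal(size=128)   # e.g. "no pleural effusion"
# An x-ray embedding that leans toward the positive concept.
xray = 0.9 * pos_prompt + 0.1 * rng.normal(size=128)

p = zero_shot_probability(xray, pos_prompt, neg_prompt)
print(round(p, 3))  # probability the pathology is present
```

Because the "classifier" is just a pair of text prompts, adding a new pathology needs only a new phrase, not a newly annotated dataset.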

Is self-learning AI the future of medical AI?

Self-learning AI models like CheXzero can speed up interpretation and reduce doctor burnout. During testing, CheXzero recognised diseases that human clinicians had not expressly marked, demonstrating its efficiency and accuracy. Based on the findings of the research, the method could extend beyond x-rays to other imaging modalities, including CT scans, MRIs, and echocardiograms.

Therefore, researchers believe self-learning AI will be a major part of medical AI in the near future.
