CXRMate: Leveraging Longitudinal Data and a Semantic Similarity Reward for Chest X-Ray Report Generation
Paper: https://doi.org/10.1016/j.imu.2024.101585, arXiv: https://arxiv.org/abs/2307.09758
@article{NICOLSON2024101585,
title = {Longitudinal data and a semantic similarity reward for chest X-ray report generation},
journal = {Informatics in Medicine Unlocked},
volume = {50},
pages = {101585},
year = {2024},
issn = {2352-9148},
doi = {10.1016/j.imu.2024.101585},
url = {https://www.sciencedirect.com/science/article/pii/S2352914824001424},
author = {Aaron Nicolson and Jason Dowling and Douglas Anderson and Bevan Koopman},
}
CXRMate is a longitudinal, multi-image CXR report generation encoder-to-decoder model that conditions report generation on the report from the patient's previous study, if available. The CXRMate checkpoint trained on MIMIC-CXR is available on the Hugging Face Hub: https://huggingface.co/aehrc/cxrmate.
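As a toy illustration of this longitudinal conditioning (not the repository's actual API or tokenization), the previous study's report can be prepended to the decoder input so that generation is conditioned on it; the special token names below are hypothetical placeholders:

```python
# Toy sketch of longitudinal prompting: condition the decoder on the
# previous study's report when one exists. Token names are hypothetical.
def build_decoder_prompt(previous_report, bos="[BOS]",
                         prev_start="[PMT]", prev_end="[PMT-SEP]"):
    """Build the decoder prompt, conditioning on the previous report if available."""
    if previous_report is None:
        return bos  # first study for this patient: no conditioning
    return f"{prev_start}{previous_report}{prev_end}{bos}"

print(build_decoder_prompt(None))
print(build_decoder_prompt("No acute cardiopulmonary process."))
```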
Generated reports for the single-image, multi-image, and longitudinal, multi-image CXR report generators (the longitudinal generator prompted with either the radiologist or the generated previous reports) are located in the generated_reports directory.
- Longitudinal, multi-image CXR report generation with SCST & CXR-BERT reward and generated previous reports: https://huggingface.co/aehrc/cxrmate
- Longitudinal, multi-image CXR report generation with SCST & CXR-BERT reward and radiologist previous reports: https://huggingface.co/aehrc/cxrmate-tf
- Longitudinal, multi-image CXR report generation with TF: https://huggingface.co/aehrc/cxrmate-tf
- Multi-image CXR report generation with TF: https://huggingface.co/aehrc/cxrmate-multi-tf
- Single-image CXR report generation with TF: https://huggingface.co/aehrc/cxrmate-single-tf

SCST: Self-Critical Sequence Training; TF: Teacher Forcing.
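To make the SCST objective concrete: the reward of a sampled report minus the reward of the greedy (test-time) decode serves as the advantage, so only samples that beat the baseline are reinforced. The sketch below substitutes a simple token-overlap F1 for the actual CXR-BERT semantic similarity reward; it is an illustration of the technique, not the repository's implementation:

```python
def toy_reward(candidate, reference):
    """Stand-in for the CXR-BERT semantic similarity reward:
    token-overlap F1 between a candidate and a reference report."""
    c, r = set(candidate.split()), set(reference.split())
    overlap = len(c & r)
    if overlap == 0:
        return 0.0
    p, rec = overlap / len(c), overlap / len(r)
    return 2 * p * rec / (p + rec)

def scst_loss(sample_log_prob, sampled_report, greedy_report, reference):
    """SCST policy-gradient loss for one report. The greedy decode acts as
    the baseline; with a positive advantage, minimizing the loss increases
    the sampled report's log-probability."""
    advantage = toy_reward(sampled_report, reference) - toy_reward(greedy_report, reference)
    return -advantage * sample_log_prob

ref = "no acute cardiopulmonary process"
# Sampled report closer to the reference than the greedy decode -> positive advantage.
loss = scst_loss(-3.2, "no acute process", "normal study", ref)
```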
Notebook examples for the models can be found in the examples directory.
- The MIMIC-CXR-JPG dataset is available at: https://physionet.org/content/mimic-cxr-jpg/2.0.0/
After cloning the repository, install the required packages in a virtual environment.
The required packages are located in requirements.txt:
python -m venv --system-site-packages venv
source venv/bin/activate
python -m pip install --upgrade pip
python -m pip install --upgrade -r requirements.txt --no-cache-dir

The model configuration for each task can be found in its config directory, e.g. config/test_huggingface/longitudinal_gen_prompt_cxr-bert.yaml. To run testing:

dlhpcstarter -t cxrmate_hf -c config/test_huggingface/longitudinal_gen_prompt_cxr-bert.yaml --stages_module tools.stages --test

See dlhpcstarter==0.1.4 for more options.
Note:
- Data will be saved in the experiment directory (exp_dir in the configuration file).
- See https://github.com/MIT-LCP/mimic-cxr/tree/master/txt to extract the sections from the reports.
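The official scripts in the MIT-LCP/mimic-cxr repository handle the full range of report layouts; as a minimal sketch of the idea, the findings and impression sections of a raw MIMIC-CXR report can be pulled out by splitting on the capitalized section headers (the regex and example report below are illustrative, not the official extraction logic):

```python
import re

# Minimal sketch: split a raw MIMIC-CXR report on its section headers.
# The official MIT-LCP/mimic-cxr scripts are considerably more robust.
SECTION_RE = re.compile(r"^\s*(FINDINGS|IMPRESSION):", re.MULTILINE)

def extract_sections(report):
    """Return a dict mapping section name -> section text."""
    sections = {}
    matches = list(SECTION_RE.finditer(report))
    for i, m in enumerate(matches):
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(report)
        sections[m.group(1)] = report[start:end].strip()
    return sections

report = """EXAMINATION: Chest radiograph.

FINDINGS: The lungs are clear. No pleural effusion.

IMPRESSION: No acute cardiopulmonary process."""
```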
To train with teacher forcing:
dlhpcstarter -t cxrmate -c config/train/longitudinal_gt_prompt_tf.yaml --stages_module tools.stages --train
The model can then be tested with the --test flag:
dlhpcstarter -t cxrmate -c config/train/longitudinal_gt_prompt_tf.yaml --stages_module tools.stages --test
To then train with Self-Critical Sequence Training (SCST) with the CXR-BERT reward:
- Copy the path to the checkpoint from the exp_dir for the configuration above, then paste it in the configuration for SCST as warm_start_ckpt_path, then:
- dlhpcstarter -t mimic_cxr -c config/train/longitudinal_gen_prompt_cxr-bert.yaml --stages_module tools.stages --train
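A hypothetical fragment of the SCST configuration showing where the warm-start checkpoint goes; only exp_dir and warm_start_ckpt_path are named in this README, and the paths below are illustrative placeholders:

```yaml
# Sketch of config/train/longitudinal_gen_prompt_cxr-bert.yaml (illustrative).
exp_dir: /path/to/experiments
# Checkpoint produced by the teacher-forcing run above (placeholder path):
warm_start_ckpt_path: /path/to/experiments/longitudinal_gt_prompt_tf/last.ckpt
```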
Note:
- See dlhpcstarter==0.1.4 for more options.
- See https://github.com/MIT-LCP/mimic-cxr/tree/master/txt to extract the sections from the reports.
If you need help, or if there are any issues, please open an issue on the repository and we will get back to you as soon as possible.