Pragmatic Radiology Report Generation
November 28, 2023 · Entered Twilight · ML4H@NeurIPS
Repo contents: CXRMetric, README.md, eval_requirements.txt, evaluate.py, finetune.py, finetune.sh, format_llama_input.py, image_model, pragmatic_llama_inference.py, prompts, report_cleaning.py, requirements.txt, utils_finetune.py
Authors
Dang Nguyen, Chacha Chen, He He, Chenhao Tan
arXiv ID
2311.17154
Category
cs.CL: Computation & Language
Cross-listed
cs.AI, cs.CY, cs.LG
Citations
11
Venue
ML4H@NeurIPS
Repository
https://github.com/ChicagoHAI/llm_radiology
⭐ 9
Last Checked
2 months ago
Abstract
When pneumonia is not found on a chest X-ray, should the report describe this negative observation or omit it? We argue that this question cannot be answered from the X-ray alone and requires a pragmatic perspective, which captures the communicative goal that radiology reports serve between radiologists and patients. However, the standard image-to-text formulation for radiology report generation fails to incorporate such pragmatic intents. Following this pragmatic perspective, we demonstrate that the indication, which describes why a patient comes for an X-ray, drives the mentions of negative observations and introduce indications as additional input to report generation. With respect to the output, we develop a framework to identify uninferable information from the image as a source of model hallucinations, and limit them by cleaning groundtruth reports. Finally, we use indications and cleaned groundtruth reports to develop pragmatic models, and show that they outperform existing methods not only in new pragmatics-inspired metrics (+4.3 Negative F1) but also in standard metrics (+6.3 Positive F1 and +11.0 BLEU-2).
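The reported metrics are mention-level: Positive F1 scores agreement on positive findings, while Negative F1 scores agreement on explicitly negated ones (e.g., "no pneumonia"), which the paper argues are driven by the indication. As a rough illustration, here is a minimal Python sketch of such mention-level F1 scores, assuming CheXbert-style per-report observation labels; the label scheme and names are assumptions for illustration, not the repository's actual evaluate.py implementation.

from typing import Dict, List

def mention_f1(refs: List[Dict[str, str]], preds: List[Dict[str, str]], target: str) -> float:
    """Micro-averaged F1 over mentions of `target` ("positive" or "negative")."""
    tp = fp = fn = 0
    for ref, pred in zip(refs, preds):
        for obs in set(ref) | set(pred):  # observations mentioned in either report
            in_ref = ref.get(obs) == target
            in_pred = pred.get(obs) == target
            tp += in_ref and in_pred
            fp += in_pred and not in_ref
            fn += in_ref and not in_pred
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

# Hypothetical labels: the reference rules out pneumonia and reports edema;
# the generated report only rules out pneumonia.
refs = [{"pneumonia": "negative", "edema": "positive"}]
preds = [{"pneumonia": "negative"}]
print(mention_f1(refs, preds, "positive"))  # 0.0 -- the edema finding was missed
print(mention_f1(refs, preds, "negative"))  # 1.0 -- the negation was reproduced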
Similar Papers
In the same crypt: Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. · 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. · 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. · 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. · 👻 Ghosted