CoCoLIT: ControlNet-Conditioned Latent Image Translation for MRI to Amyloid PET Synthesis
Alec Sargood*,
Lemuel Puglisi*,
James Cole,
Neil Oxtoby,
Daniele Ravì†,
Daniel C. Alexander†
* Joint first authors,
† Joint senior authors
- 2025-10: Our paper CoCoLIT has been accepted at AAAI 2026 (17% acceptance rate).
This repository requires Python 3.10 and PyTorch 2.0 or later. To install the latest version, run:
```
pip install cocolit
```

After installing the package, you can convert a T1-weighted MRI to a Florbetapir SUVR map by running:
```
mri2pet --i /path/to/t1.nii.gz --o /path/to/output.nii.gz
```

To replicate the results presented in the paper, include the `--m 64` flag.
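For batch conversion, the CLI above can be wrapped in a short script. The sketch below assumes the `mri2pet` entry point and flags shown above; `build_mri2pet_cmd`, the `t1_scans/` and `suvr_maps/` directories are hypothetical names introduced here for illustration, not part of the package.

```python
import subprocess
from pathlib import Path

def build_mri2pet_cmd(t1_path, out_path, reproduce_paper=False):
    # Assemble the mri2pet invocation shown above (hypothetical wrapper,
    # not part of the cocolit package itself).
    cmd = ["mri2pet", "--i", str(t1_path), "--o", str(out_path)]
    if reproduce_paper:
        # The --m 64 flag replicates the paper's reported results.
        cmd += ["--m", "64"]
    return cmd

# Convert every T1-weighted scan in a directory (assumed layout).
for t1 in sorted(Path("t1_scans").glob("*.nii.gz")):
    cmd = build_mri2pet_cmd(t1, Path("suvr_maps") / t1.name, reproduce_paper=True)
    # subprocess.run(cmd, check=True)  # uncomment to execute
```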
Note
If you wish to benchmark your MRI-to-PET model against CoCoLIT but cannot retrain our models due to limited compute or time, please email us; we will provide the exact training splits used in this work so you can compare directly against our pretrained models.
To reproduce the experiments reported in the paper, please follow the reproducibility guide.
This software is not intended for clinical use. The code is not available for commercial applications. For commercial inquiries, please contact the corresponding authors.
Arxiv Preprint:

```bibtex
@article{sargood2025cocolit,
  title={CoCoLIT: ControlNet-Conditioned Latent Image Translation for MRI to Amyloid PET Synthesis},
  author={Sargood, Alec and Puglisi, Lemuel and Cole, James H and Oxtoby, Neil P and Rav{\`\i}, Daniele and Alexander, Daniel C},
  journal={arXiv preprint arXiv:2508.01292},
  year={2025}
}
```

For any inquiries, please contact:
- Alec Sargood: alec.sargood.23@ucl.ac.uk
- Lemuel Puglisi: lemuel.puglisi@phd.unict.it
