
RadiologyNET Foundation Models

Welcome to the official repository of RadiologyNET foundation models.

RadiologyNET is a large-scale, pseudo-labelled medical imaging dataset, comprising over 1.9 million DICOM-derived images spanning various anatomical regions and imaging modalities (MR, CT, CR, RF, XA). The dataset was used to pretrain several widely used neural network architectures, which were then evaluated across multiple downstream tasks. The models were pretrained using PyTorch.

These pretrained models are made publicly available to support further research and development in medical transfer learning.

Contents:

  1. Evaluation and Findings
  2. Usage
  3. Download
  4. Challenges
  5. Notes & Limitations
  6. Citation

Evaluation and Findings

RadiologyNET models were benchmarked against ImageNet-pretrained and randomly initialised (Baseline) models on five publicly available medical datasets. Key findings:

  • When training resources were unrestricted, RadiologyNET and ImageNet models achieved comparable performance.
  • RadiologyNET showed advantages under resource-limited conditions (e.g., early training stages, small datasets).
  • Multi-modality pretraining generally yielded better generalisation than single-modality alternatives, but this depended on the intra-domain variability of each modality. Where a single modality was sufficiently diverse, there was no significant benefit from incorporating other modalities into the pretraining dataset.
  • High-quality manual labelling (e.g. RadImageNet) remains the gold standard.

Fig. 1: Validation performance during the first 10 epochs on the Brain Tumor MRI dataset. Averaged across five runs.

Fig. 2: Validation performance during the first 10 epochs on the COVID-19 dataset. Averaged across five runs.


Usage

The Example.ipynb notebook in the ./usage/ directory demonstrates how to load the weights into your model. We have prepared a function called transfer_weights_to_model(), which searches the layer namespace and transfers matching weights. The benefit of using this function over simply loading a state dict is that it transfers weights even when the network topologies do not match exactly, or when the weights should be loaded only partially (for example, loading ResNet50 into a U-Net-ResNet50 topology).
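The exact implementation of transfer_weights_to_model() lives in ./usage/models.py; the underlying idea of name- and shape-based matching can be sketched as follows. This is a hypothetical illustration using plain dictionaries in place of PyTorch state dicts (the function transfer_matching_weights and the dict layout are assumptions for this sketch, not the repository's API):

```python
# Hypothetical sketch of name/shape-based partial weight transfer.
# Real state dicts map parameter names to tensors; plain dicts with a
# "shape" tuple stand in here so the example needs no external libraries.

def transfer_matching_weights(source_state, target_state):
    """Copy entries whose name exists in the target with the same shape."""
    transferred, skipped = [], []
    for name, weight in source_state.items():
        if name in target_state and target_state[name]["shape"] == weight["shape"]:
            target_state[name]["value"] = weight["value"]
            transferred.append(name)
        else:
            skipped.append(name)
    return transferred, skipped

# Pretrained source and a target whose classification head differs in size.
source = {
    "conv1.weight": {"shape": (64, 3, 7, 7), "value": "pretrained"},
    "fc.weight": {"shape": (1000, 512), "value": "pretrained"},
}
target = {
    "conv1.weight": {"shape": (64, 3, 7, 7), "value": "random"},
    "fc.weight": {"shape": (10, 512), "value": "random"},  # different head size
}

moved, left = transfer_matching_weights(source, target)
print(moved)  # ['conv1.weight'] -- only the matching layer is transferred
print(left)   # ['fc.weight'] -- shape mismatch, left at its random init
```

This mirrors why partial transfer is useful in practice: backbone layers with matching names and shapes receive pretrained weights, while task-specific heads (whose shapes depend on the number of classes) keep their fresh initialisation.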

Loading the RadiologyNET weights can be performed in three steps. For example, if using MobileNetV3Large:

import models  # from the ./usage/ directory

NUM_CLASSES = 10  # number of output classes in your downstream task
RADIOLOGYNET_WEIGHTS = 'TL_Weights/MobileNetV3Large.pth'
model = models.MobileNetV3Large(pretrained=False, number_of_classes=NUM_CLASSES)
models.transfer_weights_to_model(path=RADIOLOGYNET_WEIGHTS, target_model=model, device='cpu')

Step 1

Download the RadiologyNET weights and place them into the ./usage/TL_Weights/ directory. The download links can be found in the Download section below.

RADIOLOGYNET_WEIGHTS = 'TL_Weights/MobileNetV3Large.pth'  # path to the downloaded RadiologyNET weights

Step 2

Instantiate your desired model. The ./usage/models.py script contains the topologies tested with RadiologyNET. For example, to instantiate MobileNetV3Large:

import models  # loads the models.py script found in the ./usage/ directory
model = models.MobileNetV3Large(pretrained=False, number_of_classes=10)  # instantiate MobileNetV3Large from the implementation in models.py

Step 3

Transfer the weights.

models.transfer_weights_to_model(path=RADIOLOGYNET_WEIGHTS, target_model=model, device='cpu')

For more information, refer to the ./usage/ directory.

Download

The models (PyTorch weights) are packaged into .tar.gz archives and are available at the following download links:

Model             File Size
DenseNet121       74 MiB     Download
EfficientNetB3    114 MiB    Download
EfficientNetB4    189 MiB    Download
InceptionV3       229 MiB    Download
MobileNetV3Large  45 MiB     Download
MobileNetV3Small  17 MiB     Download
ResNet18          117 MiB    Download
ResNet34          223 MiB    Download
ResNet50          247 MiB    Download
VGG16             1.21 GiB   Download

Challenges

The following challenges have been used to evaluate the performance of RadiologyNET foundation models. Brain Tumor MRI is described in detail in the ./datasets/ folder, with a MinimalWorkingSample.ipynb available for easier reproducibility.

Notes & Limitations

  • RadiologyNET labels were generated using an unsupervised clustering of image, DICOM, and diagnosis text features.
  • The current version is based on a single-institution dataset. Contributions of multi-centre datasets may be included in future iterations.
  • For more details, we refer the reader to the full publication in Scientific Reports.

Citation

If you use these models, please cite (BibTeX):

@article{Napravnik2025,
  title = {Lessons learned from RadiologyNET foundation models for transfer learning in medical radiology},
  volume = {15},
  ISSN = {2045-2322},
  url = {http://dx.doi.org/10.1038/s41598-025-05009-w},
  DOI = {10.1038/s41598-025-05009-w},
  number = {1},
  journal = {Scientific Reports},
  publisher = {Springer Science and Business Media LLC},
  author = {Napravnik, Mateja and Hržić, Franko and Urschler, Martin and Miletić, Damir and Štajduhar, Ivan},
  year = {2025},
  month = jul 
}

@article{Napravnik2024,
  title = {Building RadiologyNET: an unsupervised approach to annotating a large-scale multimodal medical database},
  volume = {17},
  ISSN = {1756-0381},
  url = {http://dx.doi.org/10.1186/s13040-024-00373-1},
  DOI = {10.1186/s13040-024-00373-1},
  number = {1},
  journal = {BioData Mining},
  publisher = {Springer Science and Business Media LLC},
  author = {Napravnik, Mateja and Hržić, Franko and Tschauner, Sebastian and Štajduhar, Ivan},
  year = {2024},
  month = jul 
}

📄 Reference:
Napravnik M, Hržić F, Urschler M, Miletić D, Štajduhar I.
Lessons learned from RadiologyNET foundation models for transfer learning in medical radiology.
Scientific Reports 15, 21622 (2025).
https://doi.org/10.1038/s41598-025-05009-w
