A JAX replication of generative thermodynamic computing[^1][^2], which uses Langevin dynamics for visual generation. This repository provides MNIST digit synthesis as a minimal working example.
Standard diffusion models use neural networks for denoising. Here, denoising is done by Langevin dynamics of a physical system with trained couplings — no neural network at inference time.
Training maximizes the probability that the system generates the reverse of noising trajectories, which is equivalent to minimizing heat dissipation. In hardware, this could be realized by a physical thermodynamic device whose intrinsic thermal fluctuations drive generation directly.
Noising (image → noise):
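A minimal sketch of the forward process as overdamped Langevin dynamics, assuming a simple confining potential (the paper's exact protocol may differ):

$$
x_{t+\Delta t} = x_t - \mu \,\nabla U(x_t)\,\Delta t + \sqrt{2 \mu k_B T \,\Delta t}\;\eta_t, \qquad \eta_t \sim \mathcal{N}(0, I)
$$

With a quadratic potential such as $U(x) = \tfrac{1}{2}\lVert x \rVert^2$, repeated steps relax an image into Gaussian noise.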
Denoising & generation (noise → image):
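Denoising runs the same kind of dynamics on the learned energy. The form below is a sketch assuming an energy $E_\theta(v, h)$ over visible units $v$ (pixels) and hidden units $h$ coupled by the trained parameters $\theta$; the paper's exact dynamics may differ:

$$
v_{t+\Delta t} = v_t - \mu \,\partial_v E_\theta(v_t, h_t)\,\Delta t + \sqrt{2 \mu k_B T \,\Delta t}\;\eta_t^v
$$

$$
h_{t+\Delta t} = h_t - \mu \,\partial_h E_\theta(v_t, h_t)\,\Delta t + \sqrt{2 \mu k_B T \,\Delta t}\;\eta_t^h
$$

For intuition, a single Euler–Maruyama Langevin step in JAX might look like the following generic sketch (not this repository's implementation; see the `generative_langevin.whitelam_2026` modules for the real one):

```python
import jax
import jax.numpy as jnp

def langevin_step(key, x, grad_energy, mu=1.0, temperature=1.0, dt=1e-2):
    """One Euler-Maruyama step of overdamped Langevin dynamics:
    drift down the energy gradient plus thermal noise."""
    noise = jax.random.normal(key, x.shape)
    drift = -mu * grad_energy(x) * dt
    diffusion = jnp.sqrt(2.0 * mu * temperature * dt) * noise
    return x + drift + diffusion

# Example: with a quadratic energy this is the noising direction,
# relaxing any input toward an isotropic Gaussian.
grad_U = jax.grad(lambda x: 0.5 * jnp.sum(x**2))
key = jax.random.PRNGKey(0)
x = jnp.ones((28 * 28,))
for _ in range(1_000):
    key, subkey = jax.random.split(key)
    x = langevin_step(subkey, x, grad_U)
```

The same update rule, applied to the learned energy instead of the quadratic one, performs denoising.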
This project uses uv for environment management.
```bash
uv sync
```

Verify the JAX backend:

```bash
uv run python -c "import jax; print(jax.__version__); print(jax.default_backend())"
```

Download MNIST:

```bash
uv run download-mnist                          # saves to data/mnist.npz
uv run download-mnist --out path/to/mnist.npz  # custom path
```

Train a model and generate figures:

```bash
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo
```

By default, the model trains on digits (0, 1, 2) with 512 hidden units, matching the paper's setup.
Override any config field with `--set key=value` (repeatable). Values are parsed as JSON.
```bash
# More training trajectories for better samples
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo \
  --set n_training_trajectories=1000

# Set the random seed for reproducibility (default is 42)
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo --seed 123

# Higher DPI for publication-quality figures
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo --set fig_dpi=600
```

The default model (512 hidden units) works well for three digit classes. To train on all ten digits (0-9), increase model capacity proportionally:
```bash
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/full_mnist \
  --set n_h=2048 \
  --set 'train_digits=[0,1,2,3,4,5,6,7,8,9]' \
  --set n_training_trajectories=1000
```

Scaling considerations:

- Training time scales with `n_h²` (hidden-hidden couplings)
- Memory scales with `n_v × n_h` (visible-hidden couplings)
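For example, under this quadratic scaling, going from the default `n_h=512` to `n_h=2048` multiplies the number of hidden-hidden couplings by $(2048/512)^2 = 16$, so expect a correspondingly longer training run.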
To regenerate figures from saved parameters without retraining:
```bash
uv run whitelam-2026 \
  --mnist data/mnist.npz \
  --params outputs/demo/params_learned.npz \
  --out outputs/demo_rerender
```

Each run produces:
| File | Description |
|---|---|
| `fig1.png`, `fig2.png` | Composite figures |
| `fig1a_noising.png` | Noising trajectory |
| `fig1b_training_set.png` | Training digits |
| `fig2a_denoising_trajectories.png` | Denoising trajectories |
| `fig2b_samples.png` | Generated samples |
| `fig2c_receptive_fields.png` | Learned hidden unit couplings |
| `params_learned.npz` | Trained model parameters |
| `config.json` | Configuration used |
| `metrics.json` | Heat dissipation metrics |
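The saved artifacts are plain `.npz` and JSON files, so they can be inspected without the CLI. A short sketch (the paths assume the `outputs/demo` run above):

```python
import json
import numpy as np

# List the coupling arrays stored by training
params = np.load("outputs/demo/params_learned.npz")
print(params.files)

# Heat dissipation metrics recorded for the run
with open("outputs/demo/metrics.json") as f:
    print(json.load(f))
```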
Run tests:
```bash
uv run pytest -v
```

Use as a library:

```python
from generative_langevin.whitelam_2026.config import Whitelam2026Config
from generative_langevin.whitelam_2026.model import init_params
from generative_langevin.whitelam_2026.train import train_many_noising_trajectories
from generative_langevin.whitelam_2026.sample import run_denoising_trajectory
```
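A hypothetical end-to-end sketch of how these pieces might fit together; the call signatures below are assumptions, so check the module docstrings for the actual ones:

```python
import jax

# Assumed signatures -- illustrative only
config = Whitelam2026Config()  # defaults: digits (0, 1, 2), 512 hidden units
key = jax.random.PRNGKey(0)
params = init_params(config, key)
params = train_many_noising_trajectories(params, config, key)
samples = run_denoising_trajectory(params, config, key)
```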
[^1]: S. Whitelam, "Generative Thermodynamic Computing," Physical Review Letters 136(3):037101, 2026. https://doi.org/10.1103/kwyy-1xln
[^2]: https://github.com/swhitelam/generative_thermodynamic_computing

