fliingelephant/GenerativeThermodynamicComputing

A JAX replication of generative thermodynamic computing [1][2], which uses Langevin dynamics for visual generation. This repository provides MNIST digit synthesis as a minimal working example.

How it works

Standard diffusion models use neural networks for denoising. Here, denoising is done by Langevin dynamics of a physical system with trained couplings — no neural network at inference time.

Training maximizes the probability that the system generates the reverse of the noising trajectories, which is equivalent to minimizing heat dissipation. In hardware, this could be $>10^{10}\times$ more energy-efficient than digital computation.
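The repository's actual integrator and energy function live in the model code; as a rough sketch of the mechanism only, an overdamped Langevin step looks like the following (the names langevin_step, energy_fn, dt, and beta are illustrative, not the package API):

import jax
import jax.numpy as jnp

def langevin_step(key, x, energy_fn, dt, beta):
    # One overdamped Langevin (Euler-Maruyama) step:
    #   x' = x - dt * grad E(x) + sqrt(2*dt/beta) * eta,   eta ~ N(0, I)
    grad_e = jax.grad(energy_fn)(x)            # energy_fn must return a scalar
    noise = jax.random.normal(key, x.shape)
    return x - dt * grad_e + jnp.sqrt(2.0 * dt / beta) * noise

# Toy check with a quadratic energy; samples relax toward exp(-beta * E):
energy = lambda x: 0.5 * jnp.sum(x ** 2)
x, key = jnp.ones(4), jax.random.PRNGKey(0)
for _ in range(100):
    key, subkey = jax.random.split(key)
    x = langevin_step(subkey, x, energy, dt=1e-2, beta=1.0)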

Noising (image → noise):

[Figure: noising trajectory]

Denoising & generation (noise → image):

[Figure: denoising trajectories and generated samples]

Installation

This project uses uv for environment management.

uv sync

Verify the JAX backend:

uv run python -c "import jax; print(jax.__version__); print(jax.default_backend())"

Data

Download MNIST:

uv run download-mnist            # saves to data/mnist.npz
uv run download-mnist --out path/to/mnist.npz  # custom path

Quick Start

Train a model and generate figures:

uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo

By default, the model trains on digits (0, 1, 2) with 512 hidden units, matching the paper's setup.

Configuration

Override any config field with --set key=value (repeatable). Values are parsed as JSON.

# More training trajectories for better samples
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo \
  --set n_training_trajectories=1000

# Set random seed for reproducibility (default is 42)
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo --seed 123

# Higher DPI for publication-quality figures
uv run whitelam-2026 --mnist data/mnist.npz --out outputs/demo --set fig_dpi=600

Scaling to All 10 Digits

The default model (512 hidden units) works well for three digit classes. To train on all ten digits (0–9), increase the model capacity accordingly:

uv run whitelam-2026 --mnist data/mnist.npz --out outputs/full_mnist \
  --set n_h=2048 \
  --set 'train_digits=[0,1,2,3,4,5,6,7,8,9]' \
  --set n_training_trajectories=1000

Scaling considerations (rough counts worked out below):

  • Training time scales with n_h² (hidden-hidden couplings)
  • Memory scales with n_v × n_h (visible-hidden couplings)
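
To make these counts concrete (assuming n_v = 784 for the 28×28 MNIST pixels): moving from n_h = 512 to n_h = 2048 grows the hidden-hidden coupling count from 512² ≈ 2.6×10⁵ to 2048² ≈ 4.2×10⁶, a 16× increase, while the visible-hidden couplings grow 4×, from 784×512 ≈ 4.0×10⁵ to 784×2048 ≈ 1.6×10⁶.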

Re-rendering Figures

To regenerate figures from saved parameters without retraining:

uv run whitelam-2026 \
  --mnist data/mnist.npz \
  --params outputs/demo/params_learned.npz \
  --out outputs/demo_rerender

Outputs

Each run produces:

File                                Description
fig1.png, fig2.png                  Composite figures
fig1a_noising.png                   Noising trajectory
fig1b_training_set.png              Training digits
fig2a_denoising_trajectories.png    Denoising trajectories
fig2b_samples.png                   Generated samples
fig2c_receptive_fields.png          Learned hidden-unit couplings
params_learned.npz                  Trained model parameters
config.json                         Configuration used
metrics.json                        Heat-dissipation metrics
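
The .npz outputs can be inspected with NumPy; the array names inside params_learned.npz are whatever the training script chose, so list them rather than assuming any:

import numpy as np

params = np.load("outputs/demo/params_learned.npz")
for name in params.files:
    print(name, params[name].shape, params[name].dtype)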

Development

Run tests:

uv run pytest -v

Use as a library:

from generative_langevin.whitelam_2026.config import Whitelam2026Config
from generative_langevin.whitelam_2026.model import init_params
from generative_langevin.whitelam_2026.train import train_many_noising_trajectories
from generative_langevin.whitelam_2026.sample import run_denoising_trajectory
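
The call signatures aren't documented here, so the end-to-end sketch below is hypothetical: every argument name and call order is an assumption to check against the module docstrings, not the package's documented API.

import jax

from generative_langevin.whitelam_2026.config import Whitelam2026Config
from generative_langevin.whitelam_2026.model import init_params
from generative_langevin.whitelam_2026.train import train_many_noising_trajectories
from generative_langevin.whitelam_2026.sample import run_denoising_trajectory

# Hypothetical composition -- argument names/orders are assumptions, not the documented API.
config = Whitelam2026Config()         # assumed to construct the paper-default config
key = jax.random.PRNGKey(0)
params = init_params(key, config)     # assumed (key, config) signature
# Training and sampling, with `images` loaded from data/mnist.npz:
# params = train_many_noising_trajectories(key, params, images, config)
# sample = run_denoising_trajectory(key, params, config)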

References

  1. S. Whitelam. Generative Thermodynamic Computing. Physical Review Letters 136(3):037101, 2026. https://doi.org/10.1103/kwyy-1xln

  2. Original implementation: https://github.com/swhitelam/generative_thermodynamic_computing
