A simpler alternative to this proxy is to set `OPENAI_API_BASE_URLS` directly, if you are running OpenWebUI via Docker:
```yaml
# docker-compose.yml
environment:
  - OPENAI_API_BASE_URLS=https://api.openai.com/v1;http://model-runner.docker.internal/engines/v1
```
See #1 for more details.
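OpenWebUI treats this value as a semicolon-separated list of base URLs. A quick illustrative sketch of how such a value breaks down into individual endpoints (this is not OpenWebUI's actual parsing code, just a demonstration of the format):

```python
# Illustrative only: how a semicolon-separated OPENAI_API_BASE_URLS value
# splits into individual OpenAI-compatible base URLs.
import os

os.environ["OPENAI_API_BASE_URLS"] = (
    "https://api.openai.com/v1;http://model-runner.docker.internal/engines/v1"
)

base_urls = [
    u.strip()
    for u in os.environ["OPENAI_API_BASE_URLS"].split(";")
    if u.strip()
]
for url in base_urls:
    print(url)
```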
This is a reverse proxy for the Docker Model Runner API that lets OpenWebUI access models installed via Docker Model Runner.
Its primary purpose is to add CORS support, but it also smooths over a few other API inconsistencies; whether those workarounds remain necessary may depend on further testing and configuration.
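To show the idea behind a CORS-adding reverse proxy, here is a minimal sketch in Python. It is not this repo's actual implementation; the upstream URL and port are assumptions based on the setup described below:

```python
# Minimal sketch of a CORS-adding reverse proxy. NOT the repo's actual
# implementation -- just an illustration: answer preflight (OPTIONS)
# requests locally, forward everything else upstream, and attach CORS
# headers to every response.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Assumed upstream: the Docker Model Runner API from inside Docker.
UPSTREAM = "http://model-runner.docker.internal/engines"

CORS_HEADERS = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "Authorization, Content-Type",
}

class ProxyHandler(BaseHTTPRequestHandler):
    def do_OPTIONS(self):
        # CORS preflight: reply directly, no upstream round-trip needed.
        self.send_response(204)
        for k, v in CORS_HEADERS.items():
            self.send_header(k, v)
        self.end_headers()

    def do_GET(self):
        # Forward the request upstream and mirror the response back,
        # adding the CORS headers the browser expects.
        with urlopen(Request(UPSTREAM + self.path)) as resp:
            body = resp.read()
            status = resp.status
            ctype = resp.headers.get("Content-Type", "application/json")
        self.send_response(status)
        self.send_header("Content-Type", ctype)
        for k, v in CORS_HEADERS.items():
            self.send_header(k, v)
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("0.0.0.0", 8081), ProxyHandler).serve_forever()
```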
- Currently only lightly tested on macOS 14.1 (Sonoma) with Docker Desktop 4.40.0 and OpenWebUI v0.6.5
- Docker and Docker Desktop installed and running
- Docker Compose
- Docker Model Runner extension
- OpenWebUI installed and running
Make sure "Enable host-side TCP support" is checked in Docker Desktop's settings.
If you haven't already:
```shell
# install a model via Docker Model Runner
docker model pull ai/gemma3

# clone this repo
git clone git@github.com:kilgore5/docker-model-runner-api-proxy.git
cd docker-model-runner-api-proxy

# start the proxy
docker-compose up -d

# see models installed via Docker Model Runner
curl -s http://localhost:8081/v1/models
```

Example response:

```json
{"object":"list","data":[{"id":"ai/gemma3","object":"model","created":1742979452,"owned_by":"docker"}]}
```

Finally, add http://localhost:8081/v1 in OpenWebUI under Settings -> Connections -> Manage Direct Connections.
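If you want to work with the models list programmatically, the response follows the OpenAI-style `/v1/models` shape. A small sketch of extracting the model ids (the sample JSON from above is embedded so the snippet runs offline; in practice you would fetch it from the proxy):

```python
# Extract model ids from an OpenAI-style /v1/models response.
# The sample payload is the one returned by the proxy above.
import json

sample = (
    '{"object":"list","data":[{"id":"ai/gemma3","object":"model",'
    '"created":1742979452,"owned_by":"docker"}]}'
)

models = [m["id"] for m in json.loads(sample)["data"]]
print(models)  # → ['ai/gemma3']
```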