
@robinnarsinghranabhat

Problem

It feels like the docs should explicitly tell users to add the openai: prefix when using Ollama embedding models.

I got the following error for another model:

OPENAI_API_KEY="ollama" \
OPENAI_API_BASE="http://localhost:11434/v1" \
DOCS_MCP_EMBEDDING_MODEL="embeddinggemma:latest" \
npx @arabold/docs-mcp-server@latest
⚠️  No credentials found for embeddinggemma embedding provider. Vector search is disabled.
   Only full-text search will be available. To enable vector search, please configure the required
   environment variables for embeddinggemma or choose a different provider.
   See README.md for configuration options or run with --help for more details.
🚀 Grounded Docs available at http://127.0.0.1:6280
   • Web interface: http://127.0.0.1:6280
   • MCP endpoints: http://127.0.0.1:6280/mcp, http://127.0.0.1:6280/sse
   • Embeddings: disabled (full text search only)

Fix

DOCS_MCP_EMBEDDING_MODEL="openai:embeddinggemma:latest" \
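Putting that together with the original invocation, the full command would look something like this (same environment variables as above, only the model string changes):

OPENAI_API_KEY="ollama" \
OPENAI_API_BASE="http://localhost:11434/v1" \
DOCS_MCP_EMBEDDING_MODEL="openai:embeddinggemma:latest" \
npx @arabold/docs-mcp-server@latest

The openai: prefix routes the model through the OpenAI-compatible provider, which is what Ollama's /v1 endpoint speaks. Without it, the part before the first colon ("embeddinggemma") appears to be parsed as the provider name, which would explain the "No credentials found for embeddinggemma embedding provider" warning above.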

Similar Issue
