Docs MCP Server solves the problem of AI hallucinations and outdated knowledge by providing a personal, always-current documentation index for your AI coding assistant. It fetches official docs from websites, GitHub, npm, PyPI, and local files, so your AI can query documentation for the exact library versions you are using.
The open-source alternative to Context7, Nia, and Ref.Tools.
- ✅ Up-to-Date Context: Fetches documentation directly from official sources on demand.
- 🎯 Version-Specific: Queries target the exact library versions in your project.
- 💡 Reduces Hallucinations: Grounds LLMs in real documentation.
- 🔒 Private & Local: Runs entirely on your machine; your code never leaves your network.
- 🧩 Broad Compatibility: Works with any MCP-compatible client (Claude, Cline, etc.).
- 📁 Multiple Sources: Index websites, GitHub repositories, local folders, and zip archives.
- 📄 Rich File Support: Processes HTML, Markdown, PDF, Word (.docx), Excel, PowerPoint, and source code.
1. Start the server (requires Node.js 20+):

   ```sh
   npx @arabold/docs-mcp-server@latest
   ```

2. Open the Web UI at http://localhost:6280 to add documentation.
3. Connect your AI client by adding this to your MCP settings (e.g., claude_desktop_config.json):

   ```json
   {
     "mcpServers": {
       "docs-mcp-server": {
         "type": "sse",
         "url": "http://localhost:6280/sse"
       }
     }
   }
   ```

   See Connecting Clients for VS Code (Cline, Roo) and other setup options.
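If your client launches MCP servers itself rather than connecting over SSE, the server can also run in embedded (stdio) mode. A minimal sketch of such an entry, assuming the default command speaks stdio when spawned by the client (see Installation and Connecting Clients for the exact options):

```json
{
  "mcpServers": {
    "docs-mcp-server": {
      "command": "npx",
      "args": ["@arabold/docs-mcp-server@latest"]
    }
  }
}
```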
Alternative: Run with Docker
```sh
docker run --rm \
  -v docs-mcp-data:/data \
  -v docs-mcp-config:/config \
  -p 6280:6280 \
  ghcr.io/arabold/docs-mcp-server:latest \
  --protocol http --host 0.0.0.0 --port 6280
```
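To enable the embedding-based search described below in the Docker setup, the API key can be passed into the container with Docker's standard `-e` flag (a sketch, reusing the `OPENAI_API_KEY` variable shown in the embeddings example below):

```sh
docker run --rm \
  -e OPENAI_API_KEY="sk-proj-..." \
  -v docs-mcp-data:/data \
  -v docs-mcp-config:/config \
  -p 6280:6280 \
  ghcr.io/arabold/docs-mcp-server:latest \
  --protocol http --host 0.0.0.0 --port 6280
```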
Using an embedding model is optional but dramatically improves search quality by enabling semantic vector search.

Example: Enable OpenAI Embeddings

```sh
OPENAI_API_KEY="sk-proj-..." npx @arabold/docs-mcp-server@latest
```

See Embedding Models for configuring Ollama, Gemini, Azure, and others.
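For a fully local setup, Ollama's OpenAI-compatible endpoint is one option. A hedged sketch only: the `DOCS_MCP_EMBEDDING_MODEL` and `OPENAI_API_BASE` variable names used here are assumptions, not confirmed by this page; see Embedding Models for the authoritative configuration.

```sh
# Assumes Ollama is running on its default port and the model has been
# pulled with `ollama pull nomic-embed-text`.
OPENAI_API_KEY="ollama" \
OPENAI_API_BASE="http://localhost:11434/v1" \
DOCS_MCP_EMBEDDING_MODEL="nomic-embed-text" \
npx @arabold/docs-mcp-server@latest
```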
- Installation: Detailed setup guides for Docker, Node.js (npx), and Embedded mode.
- Connecting Clients: How to connect Claude, VS Code (Cline/Roo), and other MCP clients.
- Basic Usage: Using the Web UI, CLI, and scraping local files.
- Configuration: Full reference for config files and environment variables.
- Embedding Models: Configure OpenAI, Ollama, Gemini, and other providers.
- Deployment Modes: Standalone vs. Distributed (Docker Compose).
- Authentication: Securing your server with OAuth2/OIDC.
- Telemetry: Privacy-first usage data collection.
- Architecture: Deep dive into the system design.
We welcome contributions! Please see CONTRIBUTING.md for development guidelines and setup instructions.
This project is licensed under the MIT License. See LICENSE for details.
