Self-hosted Claude Code workspace with multi-provider routing, sandboxed execution, and a full web IDE.
Join the Discord server.
- Claude Code as the execution harness, exposed through a self-hosted web UI
- One workflow across Anthropic, OpenAI, GitHub Copilot, OpenRouter, and custom Anthropic-compatible endpoints
- Anthropic Bridge routing for non-Anthropic providers while preserving Claude Code behavior
- Isolated sandbox backends (Docker, E2B, Modal, host)
- Extension surface: MCP servers, skills, agents, slash commands, prompts, and marketplace plugins
- Provider switching with shared working context
React/Vite Frontend
-> FastAPI Backend
-> PostgreSQL + Redis (web/docker mode)
-> SQLite + in-memory cache/pubsub (desktop mode)
-> Sandbox runtime (Docker/E2B/Modal/Host)
-> Claude Code CLI + claude-agent-sdk
Claudex runs chats through claude-agent-sdk, which drives the Claude Code CLI in the selected sandbox. This keeps Claude Code-native behavior for tools, session flow, permission modes, and MCP orchestration.
For OpenAI, OpenRouter, and Copilot providers, Claudex starts anthropic-bridge inside the sandbox and routes Claude Code requests through:
- `ANTHROPIC_BASE_URL=http://127.0.0.1:3456`
- provider-specific auth secrets such as `OPENROUTER_API_KEY` and `GITHUB_COPILOT_TOKEN`
- provider-scoped model IDs like `openai/gpt-5.2-codex`, `openrouter/moonshotai/kimi-k2.5`, `copilot/gpt-5.2-codex`
Claudex UI
-> Claude Agent SDK + Claude Code CLI
-> Anthropic-compatible request shape
-> Anthropic Bridge (OpenAI/OpenRouter/Copilot)
-> Target provider model
For Anthropic providers, Claudex uses your Claude auth token directly. For custom providers, Claudex calls your configured Anthropic-compatible base_url.
- Claude Code-native chat execution through `claude-agent-sdk`
- Anthropic Bridge provider routing with provider-scoped models (`openai/*`, `openrouter/*`, `copilot/*`)
- Multi-sandbox runtime (Docker/E2B/Modal/Host)
- MCP + custom skills/agents/commands + plugin marketplace
- Checkpoint restore and chat forking from any prior message state
- Streaming architecture with resumable SSE events and explicit cancellation
- Built-in recurring task scheduler (in-process async, no worker service)
- Docker + Docker Compose
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d
Open http://localhost:3000.
docker compose -p claudex-web -f docker-compose.yml down
docker compose -p claudex-web -f docker-compose.yml logs -f
Desktop mode uses Tauri with a bundled Python backend sidecar on localhost:8081, with local SQLite storage.
- Apple Silicon DMG: Claudex_0.1.0_aarch64.dmg
When running in desktop mode:
- Tauri hosts the frontend in a native macOS window
- the sidecar backend process serves the API on port `8081`
- desktop mode uses local SQLite plus an in-memory cache/pubsub (no Postgres/Redis dependency required)
Tauri Desktop App
-> React frontend (.env.desktop)
-> bundled backend sidecar (localhost:8081)
-> local SQLite database
Requirements:
- Node.js
- Rust
Dev workflow:
cd frontend
npm install
npm run desktop:dev
Build (unsigned dev):
cd frontend
npm run desktop:build
App bundle output:
frontend/src-tauri/target/release/bundle/macos/Claudex.app
Desktop troubleshooting:
- Backend unavailable: wait for sidecar startup to finish
- Database errors: verify local app data directory permissions
- Port conflict: free port `8081` if already in use
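For the port-conflict case, a quick stdlib check can confirm whether something is already listening; only the port number `8081` comes from the docs above, the helper itself is illustrative:

```python
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    # connect_ex returns 0 when a connection succeeds, i.e. something
    # is already listening on that port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) != 0

if not port_is_free(8081):
    print("Port 8081 is busy -- stop the conflicting process "
          "before launching Claudex desktop.")
```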
Configure providers in Settings -> Providers.
- `anthropic`: paste the token from `claude setup-token`
- `openai`: authenticate with the OpenAI device flow in the UI
- `copilot`: authenticate with the GitHub device flow in the UI
- `openrouter`: add an OpenRouter API key and model IDs
- `custom`: set an Anthropic-compatible `base_url`, token, and model IDs
- OpenAI/Codex: `gpt-5.2-codex`, `gpt-5.2`, `gpt-5.3-codex`
- OpenRouter catalog examples: `moonshotai/kimi-k2.5`, `minimax/minimax-m2.1`, `google/gemini-3-pro-preview`
- Custom gateways: models like `GLM-5`, `M2.5`, or private org-specific endpoints (depends on your backend's compatibility)
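Provider-scoped IDs like these can be split into a routing decision with a single parse; the routing table below is inferred from the provider list above, not taken from Claudex's source:

```python
# Providers that are routed through Anthropic Bridge (assumed set).
BRIDGED_PROVIDERS = {"openai", "openrouter", "copilot"}

def parse_model_id(model_id: str) -> tuple[str, str]:
    # Split on the first "/" only: OpenRouter models keep their own
    # org prefix, e.g. "openrouter/moonshotai/kimi-k2.5".
    prefix, _, rest = model_id.partition("/")
    if prefix in BRIDGED_PROVIDERS and rest:
        return prefix, rest          # goes through the bridge
    return "anthropic", model_id     # direct or custom Anthropic-compatible

print(parse_model_id("openrouter/moonshotai/kimi-k2.5"))
# ('openrouter', 'moonshotai/kimi-k2.5')
```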
Switching providers does not require a new workflow:
- Same sandbox filesystem/workdir
- Same `.claude` resources (skills, agents, commands)
- Same MCP configuration in Claudex
- Same chat-level execution flow
This is the main value of using Claude Code as the harness while changing inference providers behind Anthropic Bridge.
- Frontend: `3000`
- Backend API: `8080`
- PostgreSQL: `5432`
- Redis: `6379`
- VNC: `5900`
- VNC Web: `6080`
- OpenVSCode server: `8765`
- API docs: http://localhost:8080/api/v1/docs
- Admin panel: http://localhost:8080/admin
- Liveness endpoint: `GET /health`
- Readiness endpoint: `GET /api/v1/readyz`
  - web mode checks database + Redis
  - desktop mode checks database (SQLite) only
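The mode-dependent readiness behavior can be sketched like this; the probe functions are injected stand-ins, not Claudex's real checks:

```python
def readyz(mode: str, db_ok, redis_ok=None) -> dict:
    # Desktop mode probes only the SQLite database; web mode probes
    # Postgres and Redis, matching the endpoint behavior above.
    checks = {"database": db_ok()}
    if mode == "web":
        checks["redis"] = redis_ok()
    status = "ok" if all(checks.values()) else "unavailable"
    return {"status": status, "checks": checks}

print(readyz("desktop", db_ok=lambda: True))
# {'status': 'ok', 'checks': {'database': True}}
```

Keeping liveness (`/health`) dependency-free while readiness checks backing services is what lets an orchestrator restart a hung process without flapping on a transient Redis outage.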
- VPS/Coolify guide: docs/coolify-installation-guide.md
- Production setup serves the frontend at `/` and the API under `/api/*`
- Frontend: React 19, TypeScript, Vite, TailwindCSS, Zustand, React Query
- Backend: FastAPI, SQLAlchemy, Redis, PostgreSQL/SQLite, Granian
- Runtime: Claude Code CLI, claude-agent-sdk, anthropic-bridge
Apache 2.0. See LICENSE.
- Anthropic Claude Code SDK: docs.anthropic.com/s/claude-code-sdk
- Anthropic Bridge package: pypi.org/project/anthropic-bridge
- OpenAI Codex CLI sign-in: help.openai.com/en/articles/11381614
- OpenRouter API keys: openrouter.ai/docs/api-keys
- GitHub Copilot plans: github.com/features/copilot/plans

