Your own Claude Code UI, local/e2b/modal sandbox, in-browser VS Code, terminal, multi-provider support (Anthropic, OpenAI, GitHub Copilot, OpenRouter), custom skills, and MCP servers.


Claudex

Self-hosted Claude Code workspace with multi-provider routing, sandboxed execution, and a full web IDE.

License: Apache 2.0 · Python 3.13 · React 19 · FastAPI · Discord

Community

Join the Discord server.

Why Claudex

  • Claude Code as the execution harness, exposed through a self-hosted web UI
  • One workflow across Anthropic, OpenAI, GitHub Copilot, OpenRouter, and custom Anthropic-compatible endpoints
  • Anthropic Bridge routing for non-Anthropic providers while preserving Claude Code behavior
  • Isolated sandbox backends (Docker, E2B, Modal, host)
  • Extension surface: MCP servers, skills, agents, slash commands, prompts, and marketplace plugins
  • Provider switching with shared working context

Core Architecture

React/Vite Frontend
  -> FastAPI Backend
  -> PostgreSQL + Redis (web/docker mode)
  -> SQLite + in-memory cache/pubsub (desktop mode)
  -> Sandbox runtime (Docker/E2B/Modal/Host)
  -> Claude Code CLI + claude-agent-sdk

Claude Code harness

Claudex runs chats through claude-agent-sdk, which drives the Claude Code CLI in the selected sandbox. This keeps Claude Code-native behavior for tools, session flow, permission modes, and MCP orchestration.

Anthropic Bridge for non-Anthropic providers

For OpenAI, OpenRouter, and Copilot providers, Claudex starts anthropic-bridge inside the sandbox and routes Claude Code requests through:

  • ANTHROPIC_BASE_URL=http://127.0.0.1:3456
  • provider-specific auth secrets such as OPENROUTER_API_KEY and GITHUB_COPILOT_TOKEN
  • provider-scoped model IDs like openai/gpt-5.2-codex, openrouter/moonshotai/kimi-k2.5, copilot/gpt-5.2-codex

Claudex UI
  -> Claude Agent SDK + Claude Code CLI
  -> Anthropic-compatible request shape
  -> Anthropic Bridge (OpenAI/OpenRouter/Copilot)
  -> Target provider model
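In environment terms, the routing above can be sketched as configuration inside the sandbox. The first two variables are taken from the list above; ANTHROPIC_MODEL is shown as an illustrative way to pin a provider-scoped model, and the key value is a placeholder:

```shell
# Illustrative sketch of bridge-mode environment wiring; not a verbatim
# copy of what Claudex sets. The API key below is a placeholder.
export ANTHROPIC_BASE_URL="http://127.0.0.1:3456"         # local Anthropic Bridge
export OPENROUTER_API_KEY="sk-or-your-key-here"           # provider auth secret (placeholder)
export ANTHROPIC_MODEL="openrouter/moonshotai/kimi-k2.5"  # provider-scoped model ID
```

With this in place, Claude Code talks to the local bridge as if it were the Anthropic API, and the bridge forwards to the target provider.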

For Anthropic providers, Claudex uses your Claude auth token directly. For custom providers, Claudex calls your configured Anthropic-compatible base_url.

Key Features

  • Claude Code-native chat execution through claude-agent-sdk
  • Anthropic Bridge provider routing with provider-scoped models (openai/*, openrouter/*, copilot/*)
  • Multi-sandbox runtime (Docker/E2B/Modal/Host)
  • MCP + custom skills/agents/commands + plugin marketplace
  • Checkpoint restore and chat forking from any prior message state
  • Streaming architecture with resumable SSE events and explicit cancellation
  • Built-in recurring task scheduler (in-process async, no worker service)

Quick Start (Web)

Requirements

  • Docker + Docker Compose

Start

git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d

Open http://localhost:3000.

Stop and logs

docker compose -p claudex-web -f docker-compose.yml down
docker compose -p claudex-web -f docker-compose.yml logs -f
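The compose invocations above all repeat the same project and file flags; a small wrapper function (a local convenience, not something the repo ships) shortens them:

```shell
# Hypothetical convenience wrapper, not part of Claudex itself:
# forwards any subcommand to the claudex-web compose project.
claudex_web() {
  docker compose -p claudex-web -f docker-compose.yml "$@"
}

# Then: claudex_web up -d, claudex_web logs -f, claudex_web down
```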

Desktop (macOS)

Desktop mode uses Tauri with a bundled Python backend sidecar on localhost:8081 and local SQLite storage.

Download prebuilt app

How it works

When running in desktop mode:

  • Tauri hosts the frontend in a native macOS window
  • The sidecar backend process serves the API on port 8081
  • Storage is local SQLite plus in-memory cache/pubsub (no Postgres/Redis dependency)

Tauri Desktop App
  -> React frontend (.env.desktop)
  -> bundled backend sidecar (localhost:8081)
  -> local SQLite database

Build and run from source

Requirements:

  • Node.js
  • Rust

Dev workflow:

cd frontend
npm install
npm run desktop:dev

Build (unsigned dev):

cd frontend
npm run desktop:build

App bundle output:

  • frontend/src-tauri/target/release/bundle/macos/Claudex.app

Desktop troubleshooting:

  • Backend unavailable: wait for sidecar startup to finish
  • Database errors: verify local app data directory permissions
  • Port conflict: free port 8081 if already in use

Provider Setup

Configure providers in Settings -> Providers.

  • anthropic: paste token from claude setup-token
  • openai: authenticate with OpenAI device flow in UI
  • copilot: authenticate with GitHub device flow in UI
  • openrouter: add OpenRouter API key and model IDs
  • custom: set Anthropic-compatible base_url, token, and model IDs

Model examples

  • OpenAI/Codex: gpt-5.2-codex, gpt-5.2, gpt-5.3-codex
  • OpenRouter catalog examples: moonshotai/kimi-k2.5, minimax/minimax-m2.1, google/gemini-3-pro-preview
  • Custom gateways: models such as GLM-5, M2.5, or private org-specific model IDs (availability depends on your gateway's Anthropic compatibility)
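Provider-scoped IDs are a provider prefix plus the upstream model path, so the two parts can be read off with plain parameter expansion. This is an illustration of the naming scheme, not Claudex's actual parsing code:

```shell
# Split a provider-scoped model ID into provider prefix and upstream model.
# Purely illustrative; Claudex's own parsing may differ.
model="openrouter/moonshotai/kimi-k2.5"
provider="${model%%/*}"    # before the first slash  -> openrouter
upstream="${model#*/}"     # after the first slash   -> moonshotai/kimi-k2.5
echo "$provider routes to $upstream"
```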

Shared Working Context

Switching providers does not require a new workflow:

  • Same sandbox filesystem/workdir
  • Same .claude resources (skills, agents, commands)
  • Same MCP configuration in Claudex
  • Same chat-level execution flow

This is the main value of using Claude Code as the harness while changing inference providers behind Anthropic Bridge.

Services and Ports (Web)

  • Frontend: 3000
  • Backend API: 8080
  • PostgreSQL: 5432
  • Redis: 6379
  • VNC: 5900
  • VNC Web: 6080
  • OpenVSCode server: 8765
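Before bringing the stack up, it can help to confirm none of these ports are already bound. A preflight sketch, assuming nc (netcat) is available, as it is on most Linux/macOS setups:

```shell
# Preflight check (illustrative): report any web-mode port already in use.
# Adjust the port list if you have remapped services.
busy=""
for port in 3000 8080 5432 6379 5900 6080 8765; do
  if nc -z 127.0.0.1 "$port" 2>/dev/null; then
    busy="$busy $port"
  fi
done
echo "ports already in use:${busy:- none}"
```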

API and Admin

Health and Ops

  • Liveness endpoint: GET /health
  • Readiness endpoint: GET /api/v1/readyz
    • web mode checks database + Redis
    • desktop mode checks database (SQLite) only

Deployment

Screenshots

Chat interface and agent workflow screenshots.

Tech Stack

  • Frontend: React 19, TypeScript, Vite, TailwindCSS, Zustand, React Query
  • Backend: FastAPI, SQLAlchemy, Redis, PostgreSQL/SQLite, Granian
  • Runtime: Claude Code CLI, claude-agent-sdk, anthropic-bridge

License

Apache 2.0. See LICENSE.
