Long-term Shared Context Between LLMs
Dex Bridge captures, stores, and enables semantic search across conversations from multiple AI assistants (ChatGPT, Claude) using a local vector database. It creates a persistent memory layer that lets different LLMs access your conversation history through the Model Context Protocol (MCP).
```
          []                     []
        .:[]:.                 .:[]:.
      .: :[]: :.             .: :[]: :.
    .: : :[]: : :.         .: : :[]: : :.
  .: : : :[]: : : :-.___.-: : : :[]: : : :.
_:_:_:_:_:[]:_:_:_:_:_::_:_:_:_:_:[]:_:_:_:_:_
^^^^^^^^^^[]^^^^^^^^^^^^^^^^^^^^^^[]^^^^^^^^^^
          []                     []
```
Every time you start a new conversation with an AI assistant, it begins with no memory of your previous conversations. Dex Bridge solves this by:
- Capturing real-time conversations from ChatGPT and Claude
- Storing them in a vector database with semantic search capabilities
- Enabling any LLM to search and access your entire conversation history
- Bridging the context gap between different AI assistants
```
┌───────────────────────────────────────────────────────────────┐
│                     Browser / LLM Clients                     │
│               (ChatGPT, Claude, GitHub Copilot)               │
└───────────────────────┬───────────────────────────────────────┘
                        │ HTTPS Traffic
                        ▼
┌───────────────────────────────────────────────────────────────┐
│                        MITM Proxy Layer                       │
│                  (mitmproxy @ localhost:8080)                 │
│                                                               │
│   ┌──────────────────┐        ┌─────────────────────┐         │
│   │  capture_req.py  │        │  capture_claude.py  │         │
│   │    (ChatGPT)     │        │     (Claude.ai)     │         │
│   └──────────────────┘        └─────────────────────┘         │
└───────────────────────┬───────────────────────────────────────┘
                        │ Intercept & Parse SSE/NDJSON
                        ▼
┌───────────────────────────────────────────────────────────────┐
│                      Processing Pipeline                      │
│                                                               │
│   parsed_matches/              merge_conversations.py         │
│   ├─ chatgpt.com/        ─────────►  merged_conversations/    │
│   │   └─ conv_id__ts.json            ├─ chatgpt.com/          │
│   └─ claude.ai/                      │   └─ merged.json       │
│       └─ conv_id__ts.json            └─ claude.ai/            │
└───────────────────────┬───────────────────────────────────────┘
                        │ Merge by conversation_id
                        ▼
┌───────────────────────────────────────────────────────────────┐
│                   Vector Database (Qdrant)                    │
│                     store_chat_message.py                     │
│                                                               │
│   Collection: chat_messages                                   │
│   ├─ Embeddings (OpenAI text-embedding-3-small)               │
│   ├─ Metadata (role, timestamp, model, conversation_id)       │
│   └─ Content Hash (deduplication)                             │
└───────────────────────┬───────────────────────────────────────┘
                        │ Semantic Search
                        ▼
┌───────────────────────────────────────────────────────────────┐
│                MCP Server (access_llm_memory.py)              │
│                                                               │
│   Tool: search_memory(query, top_k)                           │
│   ├─ Generate query embedding                                 │
│   ├─ Search Qdrant vector DB                                  │
│   └─ Return relevant conversation snippets                    │
└───────────────────────┬───────────────────────────────────────┘
                        │ MCP Protocol
                        ▼
┌───────────────────────────────────────────────────────────────┐
│                      VS Code / AI Clients                     │
│             (GitHub Copilot, Claude Desktop, etc.)            │
│                                                               │
│        Can now access your entire conversation history!       │
└───────────────────────────────────────────────────────────────┘
```
- ChatGPT (chatgpt.com)
- Claude (claude.ai)
- Extensible architecture for other LLMs
- Handles SSE (Server-Sent Events) streams
- Parses NDJSON (Newline Delimited JSON)
- Extracts text from complex patch events
- Preserves conversation structure and metadata
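The core of capture is turning a streamed response back into plain text. Below is a minimal sketch of SSE reassembly; the `data:` framing is standard SSE, but the `v` delta field is a simplifying assumption about the payload shape (real ChatGPT and Claude events are more varied):

```python
import json

def extract_sse_text(raw_stream: str) -> str:
    """Collect assistant text from an SSE stream.

    Assumes each `data:` line carries JSON with a text delta under `v`;
    actual provider payloads differ and need per-provider handling.
    """
    parts = []
    for line in raw_stream.splitlines():
        if not line.startswith("data:"):
            continue  # SSE streams also carry `event:`/`id:` lines and blanks
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # common end-of-stream sentinel
            break
        try:
            event = json.loads(payload)
        except json.JSONDecodeError:
            continue  # skip keep-alives and non-JSON frames
        delta = event.get("v")
        if isinstance(delta, str):
            parts.append(delta)
    return "".join(parts)
```

The same skeleton applies to NDJSON: drop the `data:` prefix check and parse every non-empty line as JSON.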
- SHA-256 content hashing
- Prevents duplicate embeddings
- Efficient storage management
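The hashing scheme can be sketched as follows; `content_hash` and `dedupe` are illustrative names, not the repository's actual functions:

```python
import hashlib

def content_hash(role: str, content: str) -> str:
    """Stable SHA-256 fingerprint of a message. The role is mixed in so an
    identical string from user and assistant hashes differently."""
    return hashlib.sha256(f"{role}\x00{content}".encode("utf-8")).hexdigest()

def dedupe(messages, seen=None):
    """Yield only messages whose hash has not been seen before."""
    seen = set() if seen is None else seen
    for msg in messages:
        h = content_hash(msg["role"], msg["content"])
        if h not in seen:
            seen.add(h)
            yield msg
```

Persisting `seen` (or checking the hash against Qdrant payloads) is what prevents re-embedding the same message across runs.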
- Vector embeddings via OpenAI's `text-embedding-3-small`
- Cosine similarity search in Qdrant
- Context-aware retrieval
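Qdrant computes the similarity metric internally; for reference, cosine similarity between a query embedding and a stored embedding is just the normalized dot product:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """cos(theta) = dot(a, b) / (|a| * |b|); 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Messages whose embeddings point in nearly the same direction as the query embedding score close to 1.0 and are returned first.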
- Standard Model Context Protocol server
- Easy integration with any MCP-compatible client
- Accessible from VS Code, Claude Desktop, and more
- Python 3.10+
- macOS (for proxy management scripts)
- OpenAI API Key
- Qdrant (local or cloud instance)
- Clone the repository:

```bash
git clone https://github.com/Dextron04/dex_bridge.git
cd dex_bridge
```

- Create and activate a virtual environment:

```bash
python3 -m venv dexenv
source dexenv/bin/activate  # On macOS/Linux
```

- Install dependencies:

```bash
pip install -r dex_bridge/requirements.txt
```

- Configure environment variables:

```bash
cd dex_bridge
cp .env.example .env
# Edit .env and add your OPENAI_API_KEY
```

- Install and start Qdrant:

```bash
# Using Docker
docker pull qdrant/qdrant
docker run -p 6333:6333 qdrant/qdrant

# Or install locally from https://qdrant.tech/documentation/quick-start/
```

To start the capture system:

```bash
cd dex_bridge
sudo bash dex_bridge.sh
```

This interactive menu lets you:
- Enable/disable system proxy
- Start mitmproxy capture
- View current status
- Manage SSL certificates
Browse to ChatGPT or Claude and have conversations as usual. Dex Bridge captures everything automatically in the background.
The system automatically:
- Captures streams to `parsed_matches/{provider}/`
- Merges by conversation ID to `merged_conversations/{provider}/`
- Generates embeddings and stores in Qdrant
You can also manually trigger each stage:

```bash
# Merge conversations
python merge_conversations.py

# Store in vector DB
python store_chat_message.py
```

The MCP server is configured in `.vscode/mcp.json`:

```json
{
  "servers": {
    "memory_mcp": {
      "type": "stdio",
      "command": "/path/to/dexenv/bin/python3",
      "args": ["/path/to/dex_bridge/memory_mcp/access_llm_memory.py"]
    }
  }
}
```

Now ask any MCP-compatible AI:
"Search my memory for conversations about Python async programming"
- Main control script
- Manages macOS network proxy settings
- Handles SSL certificate installation
- Interactive TUI for system control
- `capture_req.py`: ChatGPT conversation capture
- `capture_claude.py`: Claude conversation capture
- Real-time SSE/NDJSON parsing
- Automatic file organization
- Groups conversations by ID
- Extracts user/assistant exchanges
- Creates structured JSON output
- Supports multiple providers
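The merge step can be sketched roughly like this, assuming the `conv_id__ts.json` file naming shown in the architecture diagram and a `messages` list inside each file (the exact schema is an assumption):

```python
import json
from collections import defaultdict
from pathlib import Path

def merge_captures(parsed_dir: str) -> dict:
    """Group parsed capture files into one ordered message list per
    conversation ID. Files are read in sorted (timestamp) order so
    messages stay chronological."""
    merged = defaultdict(list)
    for path in sorted(Path(parsed_dir).glob("*.json")):
        conv_id = path.name.split("__")[0]  # "<conv_id>__<ts>.json"
        data = json.loads(path.read_text())
        merged[conv_id].extend(data.get("messages", []))
    return dict(merged)
```

Writing each merged list to `merged_conversations/{provider}/` then gives the per-conversation JSON files the storage step consumes.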
- Generates embeddings via OpenAI API
- Stores vectors in Qdrant
- Implements content-based deduplication
- Preserves rich metadata
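One way deduplication and metadata fit together: derive each point's ID from the content hash, so re-ingesting the same message overwrites the existing point rather than duplicating it. A hypothetical sketch (the real `store_chat_message.py` may differ); the resulting dict maps directly onto `qdrant-client`'s `PointStruct(id=..., vector=..., payload=...)`:

```python
import hashlib
import uuid

# Arbitrary fixed namespace for deterministic UUIDv5 IDs (an assumption,
# not necessarily the scheme the repo uses).
NAMESPACE = uuid.UUID("00000000-0000-0000-0000-000000000000")

def build_point(message: dict, embedding: list[float]) -> dict:
    """Shape a message into a Qdrant-style point. The ID is derived from
    the content hash, making ingestion idempotent."""
    digest = hashlib.sha256(message["content"].encode("utf-8")).hexdigest()
    return {
        "id": str(uuid.uuid5(NAMESPACE, digest)),
        "vector": embedding,
        "payload": {
            "role": message["role"],
            "timestamp": message.get("timestamp"),
            "model": message.get("model"),
            "conversation_id": message["conversation_id"],
            "content": message["content"],
            "content_hash": digest,
        },
    }
```

The embedding itself would come from a call like `client.embeddings.create(model="text-embedding-3-small", input=content)` against the OpenAI API.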
- MCP server implementation
- `search_memory(query, top_k)` tool
- Returns semantically relevant results
- Includes conversation context
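The tool's core logic can be sketched with the embedding call and the vector-database search injected as callables; the hit shape here (a dict with `score` and `payload`) mirrors what a Qdrant search returns but is an assumption, not the repo's actual code:

```python
def search_memory(query: str, top_k: int, embed_fn, search_fn) -> list[dict]:
    """Embed the query, search the vector DB, and format the hits.

    embed_fn(text) -> embedding vector
    search_fn(vector, limit) -> list of {"score": float, "payload": {...}}
    """
    vector = embed_fn(query)
    hits = search_fn(vector, top_k)
    return [
        {
            "score": hit["score"],
            "role": hit["payload"]["role"],
            "conversation_id": hit["payload"]["conversation_id"],
            "snippet": hit["payload"]["content"][:500],  # trim long messages
        }
        for hit in hits
    ]
```

In the real server, `embed_fn` would wrap the OpenAI embeddings API and `search_fn` a `qdrant-client` search on the `chat_messages` collection.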
```
Raw HTTPS Stream → MITM Capture → Parse Events → Extract Text
                                                      ↓
                                                Parsed JSON
                                                      ↓
                                     Merge by Conversation ID
                                                      ↓
                                     Structured Conversations
                                                      ↓
                                        Generate Embeddings
                                                      ↓
                                    Store in Vector Database
                                                      ↓
                                     MCP Semantic Search API
                                                      ↓
                                      Any LLM Client Access
```
- Local First: All data stored locally by default
- SSL/TLS: mitmproxy CA certificate for HTTPS inspection
- No Cloud: conversation data never leaves your machine, apart from the message text sent to the OpenAI API to generate embeddings
- Content Hash: Prevents accidental duplicate storage
- Proxy Control: Easy on/off toggle for privacy
Once set up, you can ask your AI assistant:
"What did I discuss about system architecture last week?"
"Show me all conversations where I talked about Python optimization"
"Find the conversation where I got help with React hooks"
"What database recommendations did I receive recently?"
- Support for more LLM providers (Gemini, Perplexity, etc.)
- Web UI for conversation browsing
- Export to Markdown/PDF
- Custom embedding models
- Conversation analytics and insights
- Multi-user support
- Cloud sync options
- Advanced filtering and tagging
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is open source and available under the MIT License.
- mitmproxy - Powerful HTTP/HTTPS proxy
- Qdrant - High-performance vector database
- OpenAI - Embedding models
- Model Context Protocol - Standard for AI context sharing
Tushin Kulshreshtha - @Dextron04
Project Link: https://github.com/Dextron04/dex_bridge
Built with ❤️ to bridge the context gap between AI conversations