
🌉 Dex Bridge

Long-term Shared Context Between LLMs

Dex Bridge captures, stores, and enables semantic search across conversations from multiple AI assistants (ChatGPT and Claude) using a local vector database. It creates a persistent memory layer that lets different LLMs access your conversation history through the Model Context Protocol (MCP).

         []                     []
       .:[]:                  .:[]:
     .: :[]: :.             .: :[]: :.
   .: : :[]: : :.         .: : :[]: : :.
 .: : : :[]: : : :-.___.-: : : :[]: : : :.
_:_:_:_:_:[]:_:_:_:_:_::_:_:_:_ :[]:_:_:_:_:_
^^^^^^^^^^[]^^^^^^^^^^^^^^^^^^^^^[]^^^^^^^^^^
          []                     []

🎯 What Problem Does It Solve?

Every time you start a new conversation with an AI assistant, it begins with no knowledge of your previous conversations. Dex Bridge solves this by:

  • ๐Ÿ“ Capturing real-time conversations from ChatGPT and Claude
  • ๐Ÿ’พ Storing them in a vector database with semantic search capabilities
  • ๐Ÿ” Enabling any LLM to search and access your entire conversation history
  • ๐Ÿ”— Bridging the context gap between different AI assistants

๐Ÿ—๏ธ Architecture

┌──────────────────────────────────────────────────────────────┐
│                    Browser / LLM Clients                     │
│              (ChatGPT, Claude, GitHub Copilot)               │
└─────────────────────────┬────────────────────────────────────┘
                          │ HTTPS Traffic
                          ↓
┌──────────────────────────────────────────────────────────────┐
│                       MITM Proxy Layer                       │
│                 (mitmproxy @ localhost:8080)                 │
│                                                              │
│  ┌────────────────────┐        ┌─────────────────────┐       │
│  │ capture_req.py     │        │ capture_claude.py   │       │
│  │ (ChatGPT)          │        │ (Claude.ai)         │       │
│  └────────────────────┘        └─────────────────────┘       │
└─────────────────────────┬────────────────────────────────────┘
                          │ Intercept & Parse SSE/NDJSON
                          ↓
┌──────────────────────────────────────────────────────────────┐
│                     Processing Pipeline                      │
│                                                              │
│  parsed_matches/            merge_conversations.py           │
│    ├─ chatgpt.com/    ─────────────►  merged_conversations/  │
│    │   └─ conv_id__ts.json              ├─ chatgpt.com/      │
│    └─ claude.ai/                        │   └─ merged.json   │
│        └─ conv_id__ts.json              └─ claude.ai/        │
└─────────────────────────┬────────────────────────────────────┘
                          │ Merge by conversation_id
                          ↓
┌──────────────────────────────────────────────────────────────┐
│                   Vector Database (Qdrant)                   │
│                    store_chat_message.py                     │
│                                                              │
│  Collection: chat_messages                                   │
│  ├─ Embeddings (OpenAI text-embedding-3-small)               │
│  ├─ Metadata (role, timestamp, model, conversation_id)       │
│  └─ Content Hash (deduplication)                             │
└─────────────────────────┬────────────────────────────────────┘
                          │ Semantic Search
                          ↓
┌──────────────────────────────────────────────────────────────┐
│              MCP Server (access_llm_memory.py)               │
│                                                              │
│  Tool: search_memory(query, top_k)                           │
│  ├─ Generate query embedding                                 │
│  ├─ Search Qdrant vector DB                                  │
│  └─ Return relevant conversation snippets                    │
└─────────────────────────┬────────────────────────────────────┘
                          │ MCP Protocol
                          ↓
┌──────────────────────────────────────────────────────────────┐
│                     VS Code / AI Clients                     │
│            (GitHub Copilot, Claude Desktop, etc.)            │
│                                                              │
│  Can now access your entire conversation history!            │
└──────────────────────────────────────────────────────────────┘

🚀 Features

1. Multi-Provider Capture

  • ✅ ChatGPT (chatgpt.com)
  • ✅ Claude (claude.ai)
  • 🔄 Extensible architecture for other LLMs

2. Intelligent Parsing

  • Handles SSE (Server-Sent Events) streams
  • Parses NDJSON (Newline Delimited JSON)
  • Extracts text from complex patch events
  • Preserves conversation structure and metadata

3. Smart Deduplication

  • SHA-256 content hashing (see the sketch below)
  • Prevents duplicate embeddings
  • Efficient storage management
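
A minimal sketch of the hashing idea, assuming the hash is also used to derive a stable point ID; the actual logic lives in store_chat_message.py and may differ:

import hashlib
import uuid

def content_hash(text: str) -> str:
    # SHA-256 over the normalized message text
    return hashlib.sha256(text.strip().encode("utf-8")).hexdigest()

def point_id(text: str) -> str:
    # Deterministic UUID derived from the hash; re-storing identical content
    # overwrites the same point instead of creating a duplicate embedding.
    return str(uuid.uuid5(uuid.NAMESPACE_URL, content_hash(text)))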

4. Semantic Search

  • Vector embeddings via OpenAI's text-embedding-3-small
  • Cosine similarity search in Qdrant (sketch below)
  • Context-aware retrieval
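
Conceptually, a query runs like this. This is a sketch assuming a local Qdrant on port 6333 and the chat_messages collection shown in the architecture diagram, not the repository's exact code:

from openai import OpenAI
from qdrant_client import QdrantClient

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
qdrant = QdrantClient(url="http://localhost:6333")

def search(query: str, top_k: int = 5):
    # Embed the query with the same model used at storage time
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small", input=query
    ).data[0].embedding
    # Cosine-similarity search over the stored message vectors
    hits = qdrant.search(collection_name="chat_messages", query_vector=vector, limit=top_k)
    return [(hit.score, hit.payload) for hit in hits]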

5. MCP Integration

  • Standard Model Context Protocol server
  • Easy integration with any MCP-compatible client
  • Accessible from VS Code, Claude Desktop, and more

📦 Installation

Prerequisites

  • Python 3.10+
  • macOS (for proxy management scripts)
  • OpenAI API Key
  • Qdrant (local or cloud instance)

Setup

  1. Clone the repository:
git clone https://github.com/Dextron04/dex_bridge.git
cd dex_bridge
  2. Create and activate a virtual environment:
python3 -m venv dexenv
source dexenv/bin/activate  # On macOS/Linux
  3. Install dependencies:
pip install -r dex_bridge/requirements.txt
  4. Configure environment variables:
cd dex_bridge
cp .env.example .env
# Edit .env and add your OPENAI_API_KEY
  5. Install and start Qdrant (a connection check sketch follows):
# Using Docker
docker pull qdrant/qdrant
docker run -p 6333:6333 qdrant/qdrant

# Or install locally from https://qdrant.tech/documentation/quick-start/
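
To confirm Qdrant is reachable (and optionally create the collection up front), a quick check like the following works. This is a sketch assuming the default port, the chat_messages collection name, and 1536-dimensional vectors, which is the output size of text-embedding-3-small:

from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams

client = QdrantClient(url="http://localhost:6333")

# store_chat_message.py may create the collection itself; this just shows the shape
if not client.collection_exists("chat_messages"):
    client.create_collection(
        collection_name="chat_messages",
        vectors_config=VectorParams(size=1536, distance=Distance.COSINE),
    )

print(client.get_collections())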

🎮 Usage

Step 1: Start Capture System

cd dex_bridge
sudo bash dex_bridge.sh

This interactive menu lets you:

  • ✅ Enable/disable system proxy
  • 📡 Start mitmproxy capture
  • 🔍 View current status
  • 🔐 Manage SSL certificates

Step 2: Use ChatGPT/Claude Normally

Browse to ChatGPT or Claude and have conversations as usual. Dex Bridge captures everything automatically in the background.

Step 3: Process Conversations

The system automatically:

  1. Captures streams to parsed_matches/{provider}/
  2. Merges by conversation ID to merged_conversations/{provider}/ (see the example record below)
  3. Generates embeddings and stores in Qdrant
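
For orientation, a merged record might look roughly like this. The shape is purely illustrative; the authoritative field names are whatever merge_conversations.py actually writes.

merged_conversation = {
    "conversation_id": "abc123",
    "provider": "chatgpt.com",
    "model": "gpt-4o",  # assumed field; present only if the captured stream exposes it
    "messages": [
        {"role": "user", "content": "How do I profile async Python code?", "timestamp": "2025-10-27T23:10:00Z"},
        {"role": "assistant", "content": "Start with ...", "timestamp": "2025-10-27T23:10:05Z"},
    ],
}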

You can also manually trigger:

# Merge conversations
python merge_conversations.py

# Store in vector DB
python store_chat_message.py

Step 4: Search Your Memory

The MCP server is configured in .vscode/mcp.json:

{
  "servers": {
    "memory_mcp": {
      "type": "stdio",
      "command": "/path/to/dexenv/bin/python3",
      "args": ["/path/to/dex_bridge/memory_mcp/access_llm_memory.py"]
    }
  }
}

Now ask any MCP-compatible AI:

"Search my memory for conversations about Python async programming"

🛠️ Components

1. dex_bridge.sh

  • Main control script
  • Manages macOS network proxy settings
  • Handles SSL certificate installation
  • Interactive TUI for system control

2. MITM Scripts

  • capture_req.py: ChatGPT conversation capture
  • capture_claude.py: Claude conversation capture
  • Real-time SSE/NDJSON parsing (see the sketch below)
  • Automatic file organization
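
The capture scripts are mitmproxy addons. A stripped-down sketch of the pattern, not the actual capture_req.py; the output location and filename convention here are illustrative:

# Run with: mitmproxy -s capture_sketch.py
import json
import time
from pathlib import Path

from mitmproxy import http

OUT_DIR = Path("parsed_matches/chatgpt.com")  # illustrative output location

def response(flow: http.HTTPFlow) -> None:
    # Only inspect ChatGPT conversation traffic
    if "chatgpt.com" not in flow.request.pretty_host:
        return
    body = flow.response.get_text() or ""
    # SSE bodies arrive as "data: {...}" lines; keep the JSON payloads
    events = []
    for line in body.splitlines():
        if line.startswith("data: ") and line != "data: [DONE]":
            try:
                events.append(json.loads(line[len("data: "):]))
            except json.JSONDecodeError:
                continue  # partial or non-JSON chunk
    if events:
        OUT_DIR.mkdir(parents=True, exist_ok=True)
        (OUT_DIR / f"capture__{int(time.time())}.json").write_text(json.dumps(events, indent=2))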

3. merge_conversations.py

  • Groups conversations by ID (sketch below)
  • Extracts user/assistant exchanges
  • Creates structured JSON output
  • Supports multiple providers
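
In outline, the merge step groups parsed files by the conversation ID embedded in their filenames (the conv_id__ts.json convention from the architecture diagram). A sketch of the idea, not the script's exact logic; the output layout may differ:

import json
from collections import defaultdict
from pathlib import Path

def merge_provider(provider: str) -> None:
    # Group parsed_matches/<provider>/<conv_id>__<timestamp>.json by conversation ID
    grouped: dict[str, list] = defaultdict(list)
    for path in sorted(Path("parsed_matches", provider).glob("*.json")):
        conv_id = path.stem.split("__")[0]
        grouped[conv_id].extend(json.loads(path.read_text()))

    out_dir = Path("merged_conversations", provider)
    out_dir.mkdir(parents=True, exist_ok=True)
    for conv_id, messages in grouped.items():
        record = {"conversation_id": conv_id, "provider": provider, "messages": messages}
        (out_dir / f"{conv_id}.json").write_text(json.dumps(record, indent=2))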

4. store_chat_message.py

  • Generates embeddings via OpenAI API (sketch below)
  • Stores vectors in Qdrant
  • Implements content-based deduplication
  • Preserves rich metadata
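
Roughly: embed each message, then upsert it into Qdrant with its metadata as the payload. A sketch under the same assumptions as above (chat_messages collection, point IDs derived from the content hash for deduplication):

import hashlib
import uuid

from openai import OpenAI
from qdrant_client import QdrantClient
from qdrant_client.models import PointStruct

openai_client = OpenAI()
qdrant = QdrantClient(url="http://localhost:6333")

def store_message(text: str, role: str, timestamp: str, model: str, conversation_id: str) -> None:
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small", input=text
    ).data[0].embedding
    qdrant.upsert(
        collection_name="chat_messages",
        points=[PointStruct(
            id=str(uuid.uuid5(uuid.NAMESPACE_URL, digest)),  # same content -> same point, no duplicate
            vector=vector,
            payload={
                "role": role,
                "timestamp": timestamp,
                "model": model,
                "conversation_id": conversation_id,
                "content": text,
                "content_hash": digest,
            },
        )],
    )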

5. access_llm_memory.py

  • MCP server implementation
  • search_memory(query, top_k) tool (see the sketch below)
  • Returns semantically relevant results
  • Includes conversation context
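
A minimal sketch of such a server using the MCP Python SDK's FastMCP helper. This is not necessarily how access_llm_memory.py is written, but it matches the stdio entry in .vscode/mcp.json:

from mcp.server.fastmcp import FastMCP
from openai import OpenAI
from qdrant_client import QdrantClient

mcp = FastMCP("memory_mcp")
openai_client = OpenAI()
qdrant = QdrantClient(url="http://localhost:6333")

@mcp.tool()
def search_memory(query: str, top_k: int = 5) -> list[dict]:
    """Semantic search over captured ChatGPT/Claude conversations."""
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small", input=query
    ).data[0].embedding
    hits = qdrant.search(collection_name="chat_messages", query_vector=vector, limit=top_k)
    return [{"score": hit.score, **(hit.payload or {})} for hit in hits]

if __name__ == "__main__":
    mcp.run()  # stdio transport by default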

📊 Data Flow

Raw HTTPS Stream → MITM Capture → Parse Events → Extract Text
                                        ↓
                                   Parsed JSON
                                        ↓
                          Merge by Conversation ID
                                        ↓
                         Structured Conversations
                                        ↓
                            Generate Embeddings
                                        ↓
                          Store in Vector Database
                                        ↓
                          MCP Semantic Search API
                                        ↓
                           Any LLM Client Access

🔒 Security & Privacy

  • Local First: All data stored locally by default
  • SSL/TLS: mitmproxy CA certificate for HTTPS inspection
  • No Cloud Storage: Conversations never leave your machine, except for the message text sent to the OpenAI API to generate embeddings
  • Content Hash: Prevents accidental duplicate storage
  • Proxy Control: Easy on/off toggle for privacy

🧪 Example Queries

Once set up, you can ask your AI assistant:

"What did I discuss about system architecture last week?"

"Show me all conversations where I talked about Python optimization"

"Find the conversation where I got help with React hooks"

"What database recommendations did I receive recently?"

📈 Future Enhancements

  • Support for more LLM providers (Gemini, Perplexity, etc.)
  • Web UI for conversation browsing
  • Export to Markdown/PDF
  • Custom embedding models
  • Conversation analytics and insights
  • Multi-user support
  • Cloud sync options
  • Advanced filtering and tagging

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

๐Ÿ“ License

This project is open source and available under the MIT License.

๐Ÿ™ Acknowledgments

📧 Contact

Tushin Kulshreshtha - @Dextron04

Project Link: https://github.com/Dextron04/dex_bridge


Built with ❤️ to bridge the context gap between AI conversations
