.github/workflows/ci-cd.yml — 107 additions, 0 deletions

@@ -5,9 +5,21 @@ on:
    branches: [ main, develop ]
  pull_request:
    branches: [ main, develop ]
  workflow_dispatch:
    inputs:
      environment:
        description: 'Deployment environment'
        required: true
        default: 'dev'
        type: choice
        options:
          - dev
          - stage
          - prod

permissions:
  contents: read
  packages: write

jobs:
  python-backend:
@@ -113,3 +125,98 @@ jobs:
      - name: Run contract tests
        working-directory: ./contracts
        run: npm test

  deploy-containers:
    name: Build and Push Container Images
    runs-on: ubuntu-latest
    if: github.event_name == 'workflow_dispatch' || (github.event_name == 'push' && github.ref == 'refs/heads/main')
    needs: [python-backend, node-frontend, contracts]
    permissions:
      contents: read
      packages: write
    strategy:
      matrix:
        environment: ${{ github.event_name == 'workflow_dispatch' && fromJSON(format('["{0}"]', github.event.inputs.environment)) || fromJSON('["dev"]') }}
    environment: ${{ matrix.environment }}
> **Copilot AI (Feb 5, 2026):** The workflow references GitHub environments (dev, stage, prod) via the `environment:` key, but there is no documentation about setting these up in the repository settings. GitHub environments must be created and configured there, with appropriate protection rules and environment-specific secrets/variables, before this workflow can run successfully. Consider documenting the environment setup in the README, or handle missing environments gracefully.
>
> Suggested change: remove the `environment: ${{ matrix.environment }}` line.
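One way to act on this comment, sketched as a README addition (the `gh` commands and the `<owner>/<repo>` placeholder are illustrative; the environments can equally be created by hand under Settings → Environments):

```markdown
#### GitHub Environments Setup

The deploy job targets three GitHub environments: `dev`, `stage`, and `prod`.
Create them under **Settings → Environments** (or via the GitHub CLI) before
running the workflow, and attach the required secrets/variables to each:

    gh api -X PUT repos/<owner>/<repo>/environments/dev
    gh api -X PUT repos/<owner>/<repo>/environments/stage
    gh api -X PUT repos/<owner>/<repo>/environments/prod
```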

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Extract metadata for backend image
        id: meta-backend
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/${{ github.repository }}/backend
          tags: |
            type=ref,event=branch
            type=sha,prefix=${{ matrix.environment }}-
            type=raw,value=${{ matrix.environment }}-latest

      - name: Extract metadata for frontend image
        id: meta-frontend
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/${{ github.repository }}/frontend
          tags: |
            type=ref,event=branch
            type=sha,prefix=${{ matrix.environment }}-
            type=raw,value=${{ matrix.environment }}-latest

      - name: Build and push backend image
        uses: docker/build-push-action@v5
        with:
          context: ./backend
          push: true
          tags: ${{ steps.meta-backend.outputs.tags }}
          labels: ${{ steps.meta-backend.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          build-args: |
            ENVIRONMENT=${{ matrix.environment }}
            OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }}
> **Copilot AI (Feb 5, 2026):** Passing sensitive credentials like `OPENAI_API_KEY` as Docker build arguments is a security risk. Build arguments are visible in the image history via `docker history` and can be extracted even after the image is built. Secrets should be passed at runtime through environment variables, or via Docker/BuildKit secrets for build-time needs. Consider removing this build arg and setting the API key at container runtime, or use BuildKit's `--secret` flag if it is needed during the build.
>
> Suggested change: remove the `OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }}` build-arg line.
            AI_PROVIDER=${{ vars.AI_PROVIDER }}
            VECTOR_PROVIDER=${{ vars.VECTOR_PROVIDER }}
            WEB3_CHAIN=${{ vars.WEB3_CHAIN }}
            WEB3_RPC_URL=${{ vars.WEB3_RPC_URL }}
            MESSAGING_PROVIDER=${{ vars.MESSAGING_PROVIDER }}
            STORAGE_PROVIDER=${{ vars.STORAGE_PROVIDER }}
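Following the build-arg security comment above, a hedged sketch of passing the key through docker/build-push-action's `secrets:` input instead of `build-args:` (the `openai_api_key` secret id is an assumption and must match a `RUN --mount=type=secret` instruction in the backend Dockerfile):

```yaml
      - name: Build and push backend image
        uses: docker/build-push-action@v5
        with:
          context: ./backend
          push: true
          # Non-secret values may remain build args; the API key moves to a
          # BuildKit secret so it is never baked into the image history.
          build-args: |
            ENVIRONMENT=${{ matrix.environment }}
          secrets: |
            openai_api_key=${{ secrets.OPENAI_API_KEY }}
```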
Comment on lines +185 to +193

> **Copilot AI (Feb 5, 2026):** The CI/CD workflow passes multiple build arguments (ENVIRONMENT, OPENAI_API_KEY, AI_PROVIDER, etc.) to the backend Docker build, but the backend Dockerfile does not declare any ARG instructions to receive these values. These build args are unused and will have no effect. If the values are needed, add ARG declarations in the Dockerfile and set them as ENV variables where appropriate, or rely on runtime environment variables instead.
>
> Suggested change: remove the entire `build-args:` block (lines 185–193).
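A minimal sketch of the reviewer's fix, assuming the backend Dockerfile is a typical Python image (base image, install step, and the `openai_api_key` secret id are assumptions): declare an ARG for each non-secret build arg, promote it to ENV so it survives into the runtime image, and read the API key through a BuildKit secret mount rather than a build arg:

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.12-slim
WORKDIR /app

# Declare the non-secret build args the workflow passes in...
ARG ENVIRONMENT=dev
ARG AI_PROVIDER
ARG VECTOR_PROVIDER
ARG WEB3_CHAIN
ARG WEB3_RPC_URL
ARG MESSAGING_PROVIDER
ARG STORAGE_PROVIDER

# ...and promote them to ENV so they are visible at container runtime.
ENV ENVIRONMENT=${ENVIRONMENT} \
    AI_PROVIDER=${AI_PROVIDER} \
    VECTOR_PROVIDER=${VECTOR_PROVIDER} \
    WEB3_CHAIN=${WEB3_CHAIN} \
    WEB3_RPC_URL=${WEB3_RPC_URL} \
    MESSAGING_PROVIDER=${MESSAGING_PROVIDER} \
    STORAGE_PROVIDER=${STORAGE_PROVIDER}

COPY requirements.txt .
# The API key is NOT a build arg: if a build step genuinely needs it, mount it
# as a BuildKit secret for that step only, so it never lands in image history.
RUN --mount=type=secret,id=openai_api_key \
    pip install --no-cache-dir -r requirements.txt
```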

      - name: Build and push frontend image
        uses: docker/build-push-action@v5
        with:
          context: ./frontend
          push: true
          tags: ${{ steps.meta-frontend.outputs.tags }}
          labels: ${{ steps.meta-frontend.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          build-args: |
            ENVIRONMENT=${{ matrix.environment }}
            NEXT_PUBLIC_API_URL=${{ vars.NEXT_PUBLIC_API_URL }}
            NEXT_PUBLIC_RPC_URL=${{ vars.NEXT_PUBLIC_RPC_URL }}
            NEXT_PUBLIC_CHAIN_ID=${{ vars.NEXT_PUBLIC_CHAIN_ID }}
Comment on lines +204 to +208

> **Copilot AI (Feb 5, 2026):** The CI/CD workflow passes multiple build arguments (ENVIRONMENT, NEXT_PUBLIC_API_URL, NEXT_PUBLIC_RPC_URL, NEXT_PUBLIC_CHAIN_ID) to the frontend Docker build, but the frontend Dockerfile does not declare any ARG instructions to receive these values. These build args are unused and will have no effect. If these environment variables are needed in the Next.js build or at runtime, add ARG declarations before they are used and set them as ENV variables where appropriate.
>
> Suggested change: remove the entire `build-args:` block (lines 204–208).
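A minimal sketch of the corresponding frontend fix (base image and build commands are assumptions): since `NEXT_PUBLIC_*` variables are inlined by Next.js at build time, they must be visible as ENV before `next build` runs:

```dockerfile
# syntax=docker/dockerfile:1
FROM node:20-alpine
WORKDIR /app

# Receive the workflow's build args...
ARG ENVIRONMENT=dev
ARG NEXT_PUBLIC_API_URL
ARG NEXT_PUBLIC_RPC_URL
ARG NEXT_PUBLIC_CHAIN_ID

# ...and expose them as ENV so `next build` can inline the NEXT_PUBLIC_* values.
ENV ENVIRONMENT=${ENVIRONMENT} \
    NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL} \
    NEXT_PUBLIC_RPC_URL=${NEXT_PUBLIC_RPC_URL} \
    NEXT_PUBLIC_CHAIN_ID=${NEXT_PUBLIC_CHAIN_ID}

COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
CMD ["npm", "start"]
```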

      - name: Display deployment info
        run: |
          echo "### Deployment Summary :rocket:" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Environment:** ${{ matrix.environment }}" >> $GITHUB_STEP_SUMMARY
          echo "**Backend Image:** ghcr.io/${{ github.repository }}/backend:${{ matrix.environment }}-latest" >> $GITHUB_STEP_SUMMARY
          echo "**Frontend Image:** ghcr.io/${{ github.repository }}/frontend:${{ matrix.environment }}-latest" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Required Secrets/Variables:**" >> $GITHUB_STEP_SUMMARY
          echo "- OPENAI_API_KEY (secret)" >> $GITHUB_STEP_SUMMARY
          echo "- AI_PROVIDER, VECTOR_PROVIDER, WEB3_CHAIN, WEB3_RPC_URL (variables)" >> $GITHUB_STEP_SUMMARY
          echo "- MESSAGING_PROVIDER, STORAGE_PROVIDER (variables)" >> $GITHUB_STEP_SUMMARY
          echo "- NEXT_PUBLIC_API_URL, NEXT_PUBLIC_RPC_URL, NEXT_PUBLIC_CHAIN_ID (variables)" >> $GITHUB_STEP_SUMMARY
Cargo.toml — 25 additions, 0 deletions

@@ -0,0 +1,25 @@
[package]
name = "web3ai"
version = "1.0.0"
edition = "2021"

[dependencies]
# AI Stack
async-openai = "0.25.1"
tokenizers = "0.20.3"
candle-core = "0.8.0"

# Web3 Stack
ethers = "2.0.14"
solana-sdk = "2.1.4"
solana-client = "2.1.4"
anchor-lang = "0.30.1"

# Messaging
slack-morphism = { version = "2.5.0", features = ["hyper"] }
serenity = { version = "0.12.2", features = ["client", "gateway", "rustls_backend"] }

# Data
tokio-postgres = "0.7.12"
redis = { version = "0.27.5", features = ["tokio-comp"] }
aws-sdk-s3 = "1.66.0"
Comment on lines +1 to +25

> **Copilot AI (Feb 5, 2026):** The root Cargo.toml defines a package with dependencies but doesn't specify a library path pointing to sdk/rust/lib.rs, and there is no src/lib.rs or src/main.rs in the project root, so the manifest cannot build. Either add a `[lib]` section with `path = "sdk/rust/lib.rs"`, create a proper Cargo workspace with `sdk/rust/Cargo.toml` as a member, or move the Rust SDK code to `src/lib.rs` in the project root.

> **Copilot AI (Feb 5, 2026):** The Cargo.toml declares dependencies for AI, Web3, messaging, and data integrations, but none of them are imported or used in the Rust SDK code at sdk/rust/lib.rs, which contains only stub implementations with TODO comments. Unused dependencies are still compiled and linked, significantly increasing build times and binary size. Either implement the SDK functionality that uses these dependencies, or remove them from Cargo.toml until they are actually needed.
>
> Suggested change: remove the unused dependency lines (e.g. `aws-sdk-s3 = "1.66.0"`).
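One way to apply the first comment's fix, sketched under the assumption that the SDK entry point really does live at `sdk/rust/lib.rs` (the `aisdk` library name matches the crate name the README advertises):

```toml
[package]
name = "web3ai"
version = "1.0.0"
edition = "2021"

# Point the library target at the SDK source instead of the default src/lib.rs.
[lib]
name = "aisdk"
path = "sdk/rust/lib.rs"
```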
README.md — 180 additions, 6 deletions

@@ -89,6 +89,86 @@ npm run node

## 🔧 Environment Variables

### AI/Web3 Integration Adapters

The project includes comprehensive SDK adapters (`sdk/` directory) for integrating AI and Web3 services across multiple languages:

- **TypeScript/Node.js**: `@lippytm/ai-sdk` - see `sdk/typescript/`
- **Python**: `ai_sdk` module - see `sdk/python/`
- **Go**: `aisdk` package - see `sdk/go/`
- **Rust**: `aisdk` crate - see `sdk/rust/`

#### Available Providers

**AI Providers:**
- OpenAI (GPT-4, GPT-3.5, etc.)
- Hugging Face (Transformers, Inference API)
- LangChain (orchestration)
- LlamaIndex (data framework)

**Vector Stores (Optional):**
- Pinecone (managed vector database)
- Weaviate (open-source vector search)
- Chroma (embeddings database)

**Web3 Chains:**
- Ethereum (via ethers.js, web3.py)
- Solana (via @solana/web3.js, solana-py)
- Extensible for additional chains

**Messaging Platforms:**
- Slack (via Slack SDK)
- Discord (via Discord.js/discord.py)

**Data Storage:**
- PostgreSQL (via pg/asyncpg)
- Redis (via ioredis/redis)
- S3 (via AWS SDK)
- IPFS (via ipfs-http-client)

#### Required Environment Variables

**AI Configuration:**
```env
AI_PROVIDER=openai # openai, huggingface, custom
AI_API_KEY=your-api-key-here
AI_MODEL=gpt-4 # model name
```

**Vector Store (Optional):**
```env
VECTOR_PROVIDER=pinecone # pinecone, weaviate, chroma
VECTOR_API_KEY=your-vector-api-key
VECTOR_ENDPOINT=https://... # for weaviate/chroma
VECTOR_INDEX=your-index-name
```

**Web3 Configuration:**
```env
WEB3_CHAIN=ethereum # ethereum, solana, custom
WEB3_RPC_URL=https://eth.llamarpc.com
WEB3_NETWORK=mainnet # mainnet, testnet, devnet
```

**Messaging (Optional):**
```env
MESSAGING_PROVIDER=slack # slack, discord
MESSAGING_TOKEN=your-bot-token
```

**Storage (Optional):**
```env
STORAGE_PROVIDER=postgres # postgres, redis, s3, ipfs
STORAGE_CONNECTION_STRING=postgresql://...
STORAGE_ENDPOINT=https://... # for S3/IPFS
STORAGE_BUCKET=your-bucket-name # for S3
```

**Notes:**
- Vector-store clients are optional: they appear as commented-out entries in `requirements.txt`; uncomment them and re-run `pip install -r requirements.txt` to enable a vector store
- For production, use secret management (GitHub Secrets, AWS Secrets Manager, etc.)
- Linux compatibility: all dependencies support Ubuntu 20.04+

### Backend (.env)

```env
@@ -294,39 +374,133 @@

GitHub Actions automatically runs on push/PR to main:
1. **Python Backend Job**: Runs ruff linter and pytest
2. **Node Frontend Job**: Runs ESLint and builds Next.js app
3. **Contracts Job**: Compiles contracts and runs Hardhat tests
4. **Container Deploy Job**: Builds and pushes Docker images to ghcr.io (on main merge or manual dispatch)

See `.github/workflows/ci-cd.yml` for configuration.

### Container Deployment

The workflow includes automated container builds for backend and frontend:

**Trigger Methods:**
- Automatic on merge to `main` branch (dev environment)
- Manual workflow dispatch with environment selection (dev/stage/prod)

**Container Registry:**
- Backend image: `ghcr.io/lippytm/web3ai/backend`
- Frontend image: `ghcr.io/lippytm/web3ai/frontend`

**Environment Matrix:**
- `dev`: Development environment (auto-deploy on main)
- `stage`: Staging environment (manual dispatch)
- `prod`: Production environment (manual dispatch)

**Required GitHub Secrets/Variables:**
- `GITHUB_TOKEN`: Automatic (for ghcr.io push)
- `OPENAI_API_KEY`: Secret for OpenAI access
- `AI_PROVIDER`, `VECTOR_PROVIDER`, `WEB3_CHAIN`, `WEB3_RPC_URL`: Repository variables
- `MESSAGING_PROVIDER`, `STORAGE_PROVIDER`: Repository variables
- `NEXT_PUBLIC_API_URL`, `NEXT_PUBLIC_RPC_URL`, `NEXT_PUBLIC_CHAIN_ID`: Repository variables

See `.github/workflows/ci-cd.yml` for full configuration.

## 📦 Dependencies

### Backend (Python)

**Core Framework:**
- `fastapi`: Modern web framework
- `uvicorn[standard]`: ASGI server
- `pydantic`: Data validation
- `pydantic-settings`: Settings management
- `httpx`: Async HTTP client
- `python-dotenv`: Environment variable loading

**AI Stack:**
- `openai`: OpenAI API client
- `transformers`: Hugging Face transformers
- `huggingface-hub`: Hugging Face model hub
- `langchain`: LLM orchestration framework
- `langchain-openai`: OpenAI integration for LangChain
- `llama-index`: Data framework for LLMs

**Web3 Stack:**
- `web3`: Ethereum library
- `solana`: Solana blockchain library

**Messaging:**
- `slack-sdk`: Slack API client
- `discord.py`: Discord API client

**Data Storage:**
- `asyncpg`: Async PostgreSQL driver
- `redis`: Redis client
- `boto3`: AWS SDK for S3
- `ipfshttpclient`: IPFS HTTP client

**Development & Testing:**
- `pytest`: Testing framework
- `pytest-asyncio`: Async test support
- `ruff`: Fast Python linter/formatter
- `black`: Code formatter

**Optional (Vector Stores):**
- `pinecone-client`: Pinecone vector database
- `weaviate-client`: Weaviate vector search
- `chromadb`: Chroma embeddings database

### Frontend (Node/TypeScript)

**Core Framework:**
- `next`: React framework
- `react`: UI library
- `react-dom`: React DOM renderer
- `typescript`: Type safety

**AI Stack:**
- `openai`: OpenAI API client
- `@huggingface/inference`: Hugging Face inference
- `langchain`: LLM orchestration
- `llamaindex`: Data framework for LLMs

**Web3 Stack:**
- `ethers`: Ethereum library
- `viem`: Modern Ethereum library
- `wagmi`: React hooks for Ethereum
- `@solana/web3.js`: Solana JavaScript API
- `@coral-xyz/anchor`: Solana framework

**Messaging:**
- `@slack/web-api`: Slack Web API client
- `discord.js`: Discord API client

**Data Storage:**
- `pg`: PostgreSQL client
- `ioredis`: Redis client
- `@aws-sdk/client-s3`: AWS S3 client
- `ipfs-http-client`: IPFS HTTP client

**Development & Styling:**
- `eslint`: Linter
- `prettier`: Code formatter
- `@typescript-eslint/*`: TypeScript ESLint plugins
- `tailwindcss`: Utility-first CSS framework
- `postcss`: CSS processing
- `autoprefixer`: CSS vendor prefixing

### Contracts (Hardhat)

- `hardhat`: Development environment
- `@nomicfoundation/hardhat-toolbox`: Hardhat plugins bundle
- `dotenv`: Environment variables

### SDK Adapters

**Multi-language support:**
- TypeScript/Node.js: `@lippytm/ai-sdk`
- Python: `ai_sdk` module
- Go: `aisdk` package
- Rust: `aisdk` crate

See `sdk/` directory for language-specific adapters with factory/config patterns.
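The factory/config pattern can be illustrated with a hypothetical TypeScript sketch; the `AIConfig` shape, `createAIClient` name, and provider classes below are illustrative assumptions, not the SDK's actual API:

```typescript
// Hypothetical sketch of a factory/config adapter pattern; not the real SDK API.
type Provider = "openai" | "huggingface";

interface AIConfig {
  provider: Provider; // e.g. from the AI_PROVIDER env var
  apiKey: string;     // e.g. from AI_API_KEY
  model?: string;     // e.g. from AI_MODEL
}

interface AIClient {
  readonly provider: Provider;
  readonly model: string;
  complete(prompt: string): Promise<string>;
}

class OpenAIClient implements AIClient {
  readonly provider: Provider = "openai";
  constructor(private apiKey: string, readonly model = "gpt-4") {}
  async complete(prompt: string): Promise<string> {
    // A real implementation would call the OpenAI API with this.apiKey.
    if (!this.apiKey) throw new Error("apiKey required");
    return `[openai:${this.model}] ${prompt}`;
  }
}

class HuggingFaceClient implements AIClient {
  readonly provider: Provider = "huggingface";
  constructor(private apiKey: string, readonly model = "distilgpt2") {}
  async complete(prompt: string): Promise<string> {
    // A real implementation would call the HF Inference API.
    if (!this.apiKey) throw new Error("apiKey required");
    return `[huggingface:${this.model}] ${prompt}`;
  }
}

// The factory reads a config object (e.g. built from the env vars above)
// and returns the matching adapter behind a provider-agnostic interface.
function createAIClient(config: AIConfig): AIClient {
  switch (config.provider) {
    case "openai":
      return new OpenAIClient(config.apiKey, config.model);
    case "huggingface":
      return new HuggingFaceClient(config.apiKey, config.model);
    default:
      throw new Error(`unknown provider: ${config.provider}`);
  }
}
```

Callers then depend only on `AIClient`, so switching providers is a config change rather than a code change.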

## 🛠️ Development Workflow
