99 changes: 93 additions & 6 deletions README.md
@@ -10,7 +10,11 @@ A comprehensive full-stack starter bundle combining AI capabilities with Web3 te
- **Python/FastAPI Backend**: High-performance async API with AI integration
- **Next.js/TypeScript Frontend**: Modern React framework with full TypeScript support
- **Hardhat Smart Contracts**: Professional Solidity development environment
- **AI Integration**: OpenAI GPT-5.1-Codex-Max support via LangChain
- **AI Integration**: OpenAI GPT and Anthropic Claude support via LangChain
  - Multiple AI providers (OpenAI, Claude, or both)
  - AI chat and streaming capabilities
  - AI agents with tools and reasoning
  - Specialized agents for code analysis, blockchain analysis, and development assistance
- **Web3 Libraries**: ethers.js, viem, and wagmi for blockchain interactions
- **Production Ready**: Comprehensive testing, linting, and CI/CD pipelines
- **Config Validation**: Runtime configuration validation with Pydantic and Zod
@@ -103,6 +107,13 @@ npm run node
OPENAI_API_KEY=your-openai-api-key-here
MODEL_NAME=GPT-5.1-Codex-Max

# Anthropic/Claude Configuration
ANTHROPIC_API_KEY=your-anthropic-api-key-here
CLAUDE_MODEL_NAME=claude-3-5-sonnet-20241022

# AI Provider Selection (openai, claude, or both)
AI_PROVIDER=both

# Blockchain Configuration
ETH_RPC_URL=https://eth.llamarpc.com
NETWORK=mainnet
@@ -129,6 +140,8 @@ NEXT_PUBLIC_CHAIN_ID=1

# AI Model Configuration
NEXT_PUBLIC_MODEL_NAME=GPT-5.1-Codex-Max
NEXT_PUBLIC_CLAUDE_MODEL_NAME=claude-3-5-sonnet-20241022
NEXT_PUBLIC_AI_PROVIDER=both

# Optional Telemetry
NEXT_PUBLIC_TELEMETRY_ENABLED=false
@@ -280,13 +293,83 @@ npx hardhat run scripts/deploy.js --network sepolia

## 🤖 AI/LLM Configuration

The backend uses OpenAI's API through LangChain for AI capabilities:
The backend supports both OpenAI and Anthropic Claude AI models through LangChain:

### Getting API Keys

1. **OpenAI**: Sign up at [OpenAI Platform](https://platform.openai.com/)
2. **Anthropic Claude**: Sign up at [Anthropic Console](https://console.anthropic.com/)

### Configuration Options

Set `AI_PROVIDER` in `backend/.env`:
- `openai`: Use only OpenAI models
- `claude`: Use only Anthropic Claude models
- `both`: Enable both providers (recommended)

### AI Tools & Capabilities

The backend includes a comprehensive AI toolkit:

#### **1. Chat API**
- Standard chat completions with both OpenAI and Claude
- Streaming responses for real-time interactions
- System prompts and conversation history support
- Endpoint: `POST /api/ai/chat`

#### **2. Template Generation**
- Generate responses using prompt templates with variables
- Dynamic content generation
- Endpoint: `POST /api/ai/generate`

#### **3. AI Agents with Tools**
Autonomous AI agents that can use tools and reason through problems:

- **General Agent**: Web3-focused assistant with blockchain knowledge
- **Code Analysis Agent**: Analyze Solidity code, security audits, gas optimization
- **Blockchain Analyst Agent**: Transaction analysis, wallet tracking, protocol analysis
- **Developer Assistant Agent**: Code generation, debugging help, documentation

Endpoint: `POST /api/ai/agent`

#### **4. Available Providers**
- Check configured providers: `GET /api/ai/providers`

### Example Usage

```python
# Chat with Claude
import httpx

response = httpx.post("http://localhost:8000/api/ai/chat", json={
    "messages": [
        {"role": "user", "content": "Explain Ethereum smart contracts"}
    ],
    "provider": "claude",
})
print(response.json()["response"])

# Run code analysis agent
response = httpx.post("http://localhost:8000/api/ai/agent", json={
    "input": "Analyze this contract for security issues: contract MyToken { ... }",
    "agent_type": "code_analysis",
    "provider": "claude",
})
print(response.json()["output"])
```
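The provider-listing and template-generation endpoints can be exercised the same way. Below is a minimal sketch; the assumption that `/api/ai/generate` returns its text under a `response` key (mirroring `/api/ai/chat`) is not confirmed by the API docs, so adjust the key if your backend differs:

```python
# Query the provider list and the template-generation endpoint.
# Assumption: /api/ai/generate returns its text under a "response" key,
# mirroring /api/ai/chat above.

def build_generate_payload(template: str, variables: dict, provider: str = "claude") -> dict:
    """Assemble the request body documented for POST /api/ai/generate."""
    return {"template": template, "variables": variables, "provider": provider}

if __name__ == "__main__":
    import httpx

    base = "http://localhost:8000"

    # Check which providers are configured
    print(httpx.get(f"{base}/api/ai/providers").json())

    # Fill a prompt template with variables
    payload = build_generate_payload(
        "Summarize the {protocol} protocol in {sentences} sentences",
        {"protocol": "Uniswap", "sentences": 2},
    )
    print(httpx.post(f"{base}/api/ai/generate", json=payload).json())
```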

### Supported Models

**OpenAI Models:**
- GPT-4, GPT-5.1-Codex-Max, and newer models
- Set via `MODEL_NAME` environment variable

1. **Get API Key**: Sign up at [OpenAI Platform](https://platform.openai.com/)
2. **Set Environment Variable**: Add `OPENAI_API_KEY` to `backend/.env`
3. **Configure Model**: Set `MODEL_NAME=GPT-5.1-Codex-Max` (or your preferred model)
**Claude Models:**
- claude-3-5-sonnet-20241022 (recommended)
- claude-3-opus, claude-3-sonnet, claude-3-haiku
- Set via `CLAUDE_MODEL_NAME` environment variable

The FastAPI backend exposes AI endpoints at `/api/info` and can be extended with custom AI routes.
The FastAPI backend exposes comprehensive AI endpoints and can be extended with custom AI routes and agents.

## 🌐 RPC Configuration

@@ -361,6 +444,10 @@ See `.github/workflows/security-scan.yml` for configuration.
- `httpx`: Async HTTP client
- `web3`: Ethereum library
- `langchain-openai`: OpenAI integration
- `langchain-anthropic`: Anthropic Claude integration
- `langchain-core`: LangChain core functionality
- `langchain-community`: LangChain community integrations
- `anthropic`: Anthropic Python SDK
- `pytest`: Testing framework
- `ruff`: Linter and formatter
- `black`: Code formatter
7 changes: 7 additions & 0 deletions backend/.env.example
@@ -2,6 +2,13 @@
OPENAI_API_KEY=your-openai-api-key-here
MODEL_NAME=GPT-5.1-Codex-Max

# Anthropic/Claude Configuration
ANTHROPIC_API_KEY=your-anthropic-api-key-here
CLAUDE_MODEL_NAME=claude-3-5-sonnet-20241022

# AI Provider Selection (openai, claude, or both)
AI_PROVIDER=both

# Blockchain Configuration
ETH_RPC_URL=https://eth.llamarpc.com
NETWORK=mainnet
95 changes: 94 additions & 1 deletion backend/README.md
@@ -1,6 +1,15 @@
# Web3AI Backend

FastAPI backend for Web3AI application.
FastAPI backend for the Web3AI application, with comprehensive AI capabilities.

## Features

- 🤖 **Dual AI Provider Support**: OpenAI and Anthropic Claude integration
- 🛠️ **AI Tools**: Chat, streaming, template generation
- 🤝 **AI Agents**: Autonomous agents with tools and reasoning
- 🔧 **Web3 Integration**: Blockchain and smart contract support
- ✅ **Comprehensive Testing**: Full test coverage with pytest
- 🔒 **Production Ready**: Config validation, CORS, telemetry

## Setup

@@ -21,13 +30,72 @@
```bash
cp .env.example .env
# Edit .env with your configuration
```

Required environment variables:
- `OPENAI_API_KEY`: Your OpenAI API key (optional if using Claude only)
- `ANTHROPIC_API_KEY`: Your Anthropic API key (optional if using OpenAI only)
- `AI_PROVIDER`: Set to `openai`, `claude`, or `both`

## Development

Run development server:
```bash
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

API will be available at: http://localhost:8000

## AI Endpoints

### Get Available Providers
```bash
GET /api/ai/providers
```

### Chat with AI
```bash
POST /api/ai/chat
{
  "messages": [{"role": "user", "content": "Hello"}],
  "provider": "claude",
  "system_prompt": "You are a helpful assistant"
}
```

### Stream Chat
```bash
POST /api/ai/chat/stream
{
  "messages": [{"role": "user", "content": "Hello"}],
  "provider": "claude"
}
```
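A client can consume the stream incrementally with `httpx`. This is a sketch under the assumption that the endpoint emits server-sent events with `data:` lines; if it streams raw text chunks instead, print each chunk directly:

```python
# Minimal streaming client for POST /api/ai/chat/stream.
# Assumption: the endpoint emits SSE "data:" lines.

def extract_sse_data(line: str):
    """Return the payload of an SSE 'data:' line, or None for any other line."""
    prefix = "data: "
    return line[len(prefix):] if line.startswith(prefix) else None

if __name__ == "__main__":
    import httpx

    payload = {
        "messages": [{"role": "user", "content": "Hello"}],
        "provider": "claude",
    }
    with httpx.stream(
        "POST", "http://localhost:8000/api/ai/chat/stream",
        json=payload, timeout=None,
    ) as resp:
        for line in resp.iter_lines():
            data = extract_sse_data(line)
            if data is not None:
                print(data, end="", flush=True)
```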

### Generate with Template
```bash
POST /api/ai/generate
{
  "template": "Hello {name}, you are {age} years old",
  "variables": {"name": "Alice", "age": 30},
  "provider": "claude"
}
```

### Run AI Agent
```bash
POST /api/ai/agent
{
  "input": "What is Ethereum?",
  "agent_type": "general",
  "provider": "claude"
}
```

Available agent types:
- `general`: Web3-focused assistant
- `code_analysis`: Analyze Solidity code
- `blockchain_analyst`: Transaction and protocol analysis
- `developer_assistant`: Code generation and debugging
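When routing tasks programmatically, a client might pick an agent type with a simple heuristic. The keyword matching below is purely illustrative, not part of the API:

```python
def pick_agent_type(task: str) -> str:
    """Heuristically map a task description to one of the documented agent types."""
    lowered = task.lower()
    if any(k in lowered for k in ("audit", "solidity", "vulnerability", "gas")):
        return "code_analysis"
    if any(k in lowered for k in ("transaction", "wallet", "protocol")):
        return "blockchain_analyst"
    if any(k in lowered for k in ("debug", "implement", "refactor")):
        return "developer_assistant"
    return "general"
```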

## Testing

Run tests:
@@ -40,6 +108,11 @@ Run with coverage:
```bash
pytest --cov=app --cov-report=html
```

Run specific test file:
```bash
pytest tests/test_ai_routes.py -v
```

## Linting & Formatting

Run ruff:
@@ -57,3 +130,23 @@ Check formatting:
```bash
black --check .
```

## Project Structure

```
backend/
├── app/
│   ├── main.py                  # FastAPI application
│   ├── settings.py              # Configuration settings
│   ├── ai_tools.py              # AI tools manager (chat, streaming)
│   ├── ai_agents.py             # AI agents with tools
│   ├── ai_routes.py             # AI API endpoints
│   └── telemetry.py             # OpenTelemetry integration
├── tests/
│   ├── test_main.py
│   ├── test_ai_tools.py
│   ├── test_ai_routes.py
│   └── test_config_validation.py
├── requirements.txt             # Core dependencies
└── requirements-extras.txt      # Optional heavy dependencies
```