School Project: AI Engineer Course
November 2025
This vertical slice demonstrates an AI-powered pipeline that translates natural language commands into Blender 3D operations. The system uses GPT-4o to understand user intent and routes commands to a 3D modeling application.
User Input (Web UI / Terminal)
↓
FastAPI Backend (REST API)
↓
LangGraph Agent (GPT-4o - Single Node)
↓
Tool Router (Python if/else)
↓
Blender MCP Socket Client (port 9876)
↓
Blender 3D Application
- Natural Language Processing: Converts plain English to structured commands
- Simple Agent Architecture: Single-node LangGraph for command interpretation
- Tool Routing: Maps AI decisions to specific Blender operations
- Socket Communication: Direct integration with Blender via TCP socket
- Dual Interface: Web UI and terminal client for testing
- Backend: FastAPI (Python)
- AI Agent: LangGraph + OpenAI GPT-4o
- 3D Software: Blender 3.0+ with MCP addon
- Frontend: Vanilla HTML/CSS/JavaScript
- Communication: REST API + TCP Sockets
vertical_slice/
├── backend/
│ ├── main.py # FastAPI server + endpoints
│ ├── agent.py # LangGraph agent (single node)
│ ├── router.py # Tool routing logic
│ └── mcp_client.py # Blender socket client
├── ui/
│ └── index.html # Web interface
├── test_client.py # Terminal test client
├── requirements.txt # Python dependencies
└── README.md # This file
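For orientation, backend/main.py is the FastAPI server that exposes the POST /run endpoint used by both interfaces. Below is a minimal sketch of that endpoint; it only echoes the command back, standing in for the agent → router → Blender chain described under How It Works.

```python
# Minimal sketch of the /run endpoint; the real backend/main.py wires in the
# agent, router, and Blender client instead of echoing the command back.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class RunRequest(BaseModel):
    text: str  # natural language command, e.g. "create a cube"


@app.post("/run")
def run(request: RunRequest) -> dict:
    # Real flow: agent.py parses the text, router.py picks the Blender tool,
    # mcp_client.py sends it over the socket and returns Blender's status.
    return {"status": "ok", "received": request.text}
```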
- Python 3.10+
- Blender 3.0+
- OpenAI API Key
- Blender Addons:
- Blender MCP (for socket communication)
- QRemeshify (for topology optimization)
cd vertical_slice
pip install -r requirements.txt
Create a .env file in the project root:
OPENAI_API_KEY=your-openai-api-key-here
BLENDER_HOST=localhost
BLENDER_PORT=9876
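The backend reads these values at startup. One possible way to load them (a sketch assuming python-dotenv; the actual backend may read them differently):

```python
# Sketch: read the .env settings; assumes the python-dotenv package.
import os

from dotenv import load_dotenv

load_dotenv()  # looks for .env in the current working directory

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
BLENDER_HOST = os.getenv("BLENDER_HOST", "localhost")
BLENDER_PORT = int(os.getenv("BLENDER_PORT", "9876"))
```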
A) Blender MCP Addon:
- Download: ../repos/blender-mcp/addon.py
- Open Blender → Edit → Preferences → Add-ons
- Click "Install..." → Select addon.py
- Enable "Interface: Blender MCP"
B) QRemeshify Addon:
- Download from: https://github.com/ksami/QRemeshify/releases
- Open Blender → Edit → Preferences → Add-ons
- Click dropdown arrow (top right) → "Install from Disk..."
- Select downloaded zip file
- Enable "QRemeshify"
- In Blender, press N (opens sidebar)
- Find "BlenderMCP" tab
- Click "Connect to Claude" button
- ✅ Server running on port 9876
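To double-check that the addon is actually listening before starting the backend, a quick probe from Python (assumes the default localhost:9876 from .env):

```python
# Quick sanity check: is the Blender MCP socket accepting connections?
import socket

try:
    with socket.create_connection(("localhost", 9876), timeout=3):
        print("Blender MCP socket is listening on port 9876")
except OSError as exc:
    print(f"Could not reach Blender MCP on port 9876: {exc}")
```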
Terminal 1 - Start Backend:
cd vertical_slice
python -m backend.main
Browser:
Open: http://localhost:8000
Type commands like:
- "create a cube"
- "create a sphere"
Terminal 1 - Start Backend:
python -m backend.main
Terminal 2 - Send Commands:
# Interactive mode
python test_client.py
# Single command
python test_client.py "create a cube"
| Command | Description | Example |
|---|---|---|
| create a cube | Creates a 2x2x2 cube | Default size |
| create a sphere | Creates a UV sphere | Default radius 1.0 |
| import mesh from [path] | Imports OBJ/FBX file | Full file path required |
| remesh [object] | Remeshes object to clean quads | "remesh suzanne" |
User types natural language command in UI or terminal
POST /run
Body: {"text": "create a cube"}
Single node processes input with GPT-4o:
- Parses user intent
- Extracts parameters
- Returns structured JSON:
{"tool": "create_cube", "args": {"size": 2.0}}
Simple Python function:
if tool_name == "create_cube":
    return create_cube(size=args.get("size", 2.0))
Sends JSON command to Blender:
{
"type": "execute_code",
"params": {
"code": "import bpy\nbpy.ops.mesh.primitive_cube_add(size=2.0)"
}
}
Blender MCP addon receives command, executes, returns status
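At the socket level, this last hop could look like the sketch below (assumes one JSON request/response per connection on localhost:9876; the real backend/mcp_client.py and the MCP addon may frame or buffer messages differently):

```python
# Sketch of the socket hop to Blender; illustrative, not the actual client.
import json
import socket


def send_to_blender(command: dict, host: str = "localhost", port: int = 9876) -> dict:
    with socket.create_connection((host, port), timeout=15) as sock:
        sock.sendall(json.dumps(command).encode("utf-8"))
        raw = sock.recv(65536)  # single read; fine for small status replies
    return json.loads(raw.decode("utf-8"))


# The "create a cube" payload shown above:
status = send_to_blender({
    "type": "execute_code",
    "params": {"code": "import bpy\nbpy.ops.mesh.primitive_cube_add(size=2.0)"},
})
print(status)
```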
Input: "create a cube"
Expected: Cube appears in Blender
Input: "create a sphere"
Expected: Sphere appears in Blender
Input: "what's the weather?"
Expected: Error message - "Please provide a Blender command"
- ✅ Check Blender is running
- ✅ Check MCP addon is active
- ✅ Click "Connect to Claude" in Blender sidebar
- ✅ Check .env file exists in project root
- ✅ Verify API key starts with sk-
- ✅ Restart server after creating .env file
- ✅ Check FastAPI server is running
- ✅ Access UI via http://localhost:8000, not file://
- ✅ Deactivate venv: deactivate
- ✅ Use system Python for this project
# Find what's using port 8000
netstat -ano | findstr :8000
# Kill specific process (replace PID with number from above)
taskkill /F /PID <PID>
# Kill all Python processes (nuclear option)
taskkill /F /IM python.exe
# 1. Check if port is free
netstat -ano | findstr :8000
# 2. Kill existing process
taskkill /F /PID <PID>
# 3. Make sure you're in correct directory
cd C:\Users\TIGO\Desktop\vertical_slice
# 4. Start server
python -m backend.main
# Always restart server after code changes!
# 1. Stop server: Ctrl+C
# 2. Start again: python -m backend.main
# Check if in venv (prompt shows (.venv))
# If yes, deactivate:
deactivate
# Then run normally:
python -m backend.main
# Kill all Python, restart fresh
taskkill /F /IM python.exe; cd C:\Users\TIGO\Desktop\vertical_slice; python -m backend.main
- Simplicity: Easier to debug and understand
- Sufficient: No complex reasoning required for command parsing
- Performance: Lower latency than multi-node graphs
- Direct: No middleware or protocol overhead
- Real-time: Immediate feedback from Blender
- Existing: Leverages existing Blender MCP addon
- PoC Focus: Demonstrates concept, not production polish
- No Build Tools: Single HTML file, easy to run
- Time Efficient: Built in 30 minutes
- Add more tools (lights, cameras, materials)
- Implement conversation memory
- Add 3D model generation (Trellis integration)
- Support for complex multi-step workflows
- Error recovery and retry logic
- Asset library browser
- LangGraph Architecture: Single-node agents are sufficient for simple routing tasks
- Socket Communication: Direct integration is faster than HTTP for local apps
- Error Handling: Clear error messages are critical for debugging
- Dual Interfaces: Terminal + Web UI provides flexibility during development
Course: AI Engineer
Focus: Practical AI agent implementation
Scope: Vertical slice demonstrating core pipeline
Time: Completed in one evening (~6 hours)
Lines of Code: ~600 (excluding UI components)
Educational project - free to use and modify.
- Blender MCP: Open-source addon for Blender integration
- LangChain/LangGraph: Agent framework
- OpenAI: GPT-4o API
- FastAPI: Modern Python web framework
Built with focus on clarity, simplicity, and demonstrable results.