A beginner-friendly NestJS template for building AI agents with streaming capabilities and conversation management. This template has been generated using the Agent Initializr and is designed to help you quickly set up a backend service for AI agents.

## Features
- 🤖 Ready-to-use AI agent implementation
- 🌊 Real-time streaming responses
- 💾 Conversation history management
- 🔄 Support for multiple LLM providers (OpenAI, Google)
- 📡 Built-in Redis pub/sub for real-time messaging
- 🎯 Clean and maintainable architecture

## Prerequisites
- Node.js (v18 or higher)
- Docker and Docker Compose (for local development)
- OpenAI API key or Google AI API key
## Getting Started

- Start the required services (Redis and PostgreSQL) using Docker Compose:

  ```bash
  docker compose up -d
  ```

  This will start:

  - PostgreSQL at `localhost:5433`
  - Redis at `localhost:6379`

  You can check the status of the services with:

  ```bash
  docker compose ps
  ```

- Clone the repository
- Install dependencies:

  ```bash
  pnpm install

  # Format the codebase
  pnpm format
  ```

- Configure your model:
  ```typescript
  // The default model is OpenAI.
  // Change this in AgentFactory.
  this.agent = AgentFactory.createAgent(ModelProvider.OPENAI, [], postgresCheckpointer);
  ```

- Create the `.env` file in the root directory:
  ```env
  # Choose your model provider
  MODEL_PROVIDER=GOOGLE # or OPENAI

  # For Google AI
  GOOGLE_GENAI_API_KEY=your_api_key
  GOOGLE_GENAI_MODEL=gemini-pro

  # For OpenAI
  OPENAI_API_KEY=your_api_key
  OPENAI_MODEL=gpt-3.5-turbo

  # Redis Configuration
  REDIS_HOST=localhost
  REDIS_PORT=6379
  REDIS_USERNAME=default
  REDIS_PASSWORD=

  # App Configuration
  PORT=3001
  ```

- Start the development server:

  ```bash
  pnpm run start:dev
  ```

When using the Postgres saver as memory, the checkpointer must be initialized before chatting with the agent:
```typescript
// For example, in AgentService:
async stream(message: SseMessageDto): Promise<Observable<SseMessage>> {
  const channel = `agent-stream:${message.threadId}`;
  // !!! This should run only once
  this.agent.initCheckpointer();
  // ...the rest of the code
}
```

## API Endpoints

- `POST /api/agent/chat` - Send a message to the agent
- `GET /api/agent/stream` - Stream agent responses (SSE)
- `GET /api/agent/history/:threadId` - Get conversation history
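The "run only once" requirement on `initCheckpointer()` is easy to violate once several requests arrive concurrently. One way to guard it is to cache the initialization promise so every caller awaits the same one. This is a minimal sketch, not the template's exact code: the `AgentService` shape, the `stream` signature, and the agent type below are simplified assumptions.

```typescript
// Sketch: ensure initCheckpointer() runs exactly once, even under
// concurrent stream() calls. The agent type here is a simplified assumption.
class AgentService {
  private checkpointerReady: Promise<void> | null = null;

  constructor(private agent: { initCheckpointer(): Promise<void> }) {}

  private ensureCheckpointer(): Promise<void> {
    // Every caller reuses the same cached promise, so the underlying
    // initCheckpointer() is invoked at most once per service instance.
    if (!this.checkpointerReady) {
      this.checkpointerReady = this.agent.initCheckpointer();
    }
    return this.checkpointerReady;
  }

  async stream(threadId: string): Promise<string> {
    await this.ensureCheckpointer();
    return `agent-stream:${threadId}`;
  }
}
```

An alternative with the same effect is to initialize the checkpointer in the service's `onModuleInit` lifecycle hook, so `stream()` never has to think about it.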
### Example Usage

```typescript
// Chat endpoint
await fetch('http://localhost:3001/api/agent/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    threadId: 'unique-thread-id',
    content: [{ type: 'text', text: 'Hello, AI!' }],
    type: 'human'
  })
});

// Stream endpoint (using EventSource)
const sse = new EventSource('http://localhost:3001/api/agent/stream?threadId=unique-thread-id&content=Hello');
sse.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log(data.content);
};
```

## Project Structure

```
src/
├── agent/       # AI agent implementation
├── api/         # HTTP endpoints and DTOs
└── messaging/   # Redis messaging service
```
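The history endpoint can be consumed with a plain `fetch` as well. The sketch below is a hedged example: the `HistoryMessage` shape is an assumption for illustration, not the template's actual DTO, so align it with your own types.

```typescript
// Hypothetical client for GET /api/agent/history/:threadId.
// The message shape below is an assumption; adjust it to the real DTO.
interface HistoryMessage {
  type: 'human' | 'ai';
  content: Array<{ type: string; text?: string }>;
}

async function fetchHistory(
  threadId: string,
  baseUrl = 'http://localhost:3001',
): Promise<HistoryMessage[]> {
  const res = await fetch(`${baseUrl}/api/agent/history/${encodeURIComponent(threadId)}`);
  if (!res.ok) {
    throw new Error(`History request failed with status ${res.status}`);
  }
  return (await res.json()) as HistoryMessage[];
}
```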
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.