Zero-friction onboarding CLI for RecallBricks - AI memory infrastructure.
Get your agent running with memory in 30 seconds:
```bash
npx @recallbricks/agent-cli setup rb_live_your_key_here
```

RecallBricks provides AI memory infrastructure for your agents. This CLI connects your existing agents to persistent memory with zero configuration.
To get your API key:

1. Go to dashboard.recallbricks.com/setup
2. Choose your agent type (sales assistant, support agent, etc.)
3. Accept or customize constitutional memories
4. Copy your API key
```bash
npx @recallbricks/agent-cli setup rb_live_xxxxx
```

The CLI will:
- Validate your API key
- Detect your project language (TypeScript/JavaScript/Python)
- Ask for your LLM API key (saved for future projects)
- Generate all necessary files
- Install dependencies
- Offer instant testing
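To picture the language-detection step, here is a minimal sketch of how such a check might work. The function name and heuristics are assumptions for illustration, not the CLI's actual logic:

```typescript
// Illustrative sketch of project-language detection.
// The heuristics and function name are assumptions, not the CLI's actual code.
type Lang = "typescript" | "javascript" | "python" | "unknown";

function detectLanguage(files: string[]): Lang {
  // A tsconfig.json or any .ts file suggests a TypeScript project.
  if (files.includes("tsconfig.json") || files.some(f => f.endsWith(".ts"))) {
    return "typescript";
  }
  // package.json or .js files without TypeScript markers suggest JavaScript.
  if (files.includes("package.json") || files.some(f => f.endsWith(".js"))) {
    return "javascript";
  }
  // Common Python project markers.
  if (
    files.includes("pyproject.toml") ||
    files.includes("requirements.txt") ||
    files.some(f => f.endsWith(".py"))
  ) {
    return "python";
  }
  return "unknown";
}
```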
```bash
# TypeScript
npx ts-node agent.ts

# JavaScript
node agent.js

# Python
python agent.py
```

| Command | Description |
|---|---|
| `setup <api_key>` | Set up a new agent from dashboard config |
| `test` | Test your agent interactively |
| `config` | Manage saved settings and credentials |
| `update` | Update agent to latest SDK version |
```bash
recallbricks setup rb_live_xxxxx [options]
```

Options:

```text
-d, --directory <path>   Target directory (default: current)
-l, --language <lang>    Force language: typescript, javascript, python
--no-install             Skip dependency installation
--no-test                Skip test prompt
```

```bash
recallbricks test [options]
```

Options:

```text
-c, --config <path>   Path to recallbricks-config.json
```

```bash
recallbricks config [options]
```

Options:

```text
--show                 View saved configuration
--clear                Clear all saved credentials
--set-llm <provider>   Set default LLM (anthropic, openai, google)
```

```bash
recallbricks update [options]
```

Options:

```text
-d, --directory <path>   Agent directory
--dry-run                Show changes without applying
```

After setup, you'll have:
```text
my-agent/
├── agent.ts                   # Main agent with memory integration
├── package.json               # Dependencies
├── tsconfig.json              # TypeScript config
├── .env                       # API keys (gitignored)
├── recallbricks-config.json
└── examples/
    ├── 01-simple-chat.ts
    └── 02-memory-usage.ts
```
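The generated `.env` holds the keys the agent needs at runtime. The exact variable names come from the CLI's templates; the sketch below is hypothetical:

```
# Hypothetical .env contents -- actual variable names come from the CLI templates
RECALLBRICKS_API_KEY=rb_live_xxxxx
ANTHROPIC_API_KEY=sk-ant-xxxxx
```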
Supported LLM providers:

- Anthropic (Claude) - Default
- OpenAI (GPT-4)
- Google (Gemini)
The CLI auto-detects which LLM you configured in the dashboard and generates the appropriate code.
Example usage of the generated agent:

```typescript
import { chat, rb } from './agent';

// Chat with memory context
const response = await chat("What did we discuss last time?");

// Direct memory operations
await rb.save({
  text: "User prefers dark mode",
  tags: ['preference', 'ui']
});

const memories = await rb.recall({
  query: "user preferences",
  limit: 5
});
```

To work on the CLI itself:

```bash
# Install dependencies
npm install

# Run in development
npm run dev -- setup rb_test_xxx

# Build
npm run build

# Run built version
npm start -- setup rb_test_xxx
```

Licensed under the MIT License.
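The `save`/`recall` pattern in the usage example can be pictured with a tiny in-memory stand-in. This is purely a sketch: the real SDK persists memories through the RecallBricks service and retrieves them semantically, whereas this stand-in does naive keyword matching, and the class name is invented:

```typescript
// Purely illustrative in-memory stand-in for the save/recall pattern.
// The real SDK persists memories server-side and retrieves them semantically;
// this sketch uses naive keyword matching. All names here are invented.
interface Memory {
  text: string;
  tags: string[];
}

class InMemoryBricks {
  private memories: Memory[] = [];

  // Store a memory.
  async save(memory: Memory): Promise<void> {
    this.memories.push(memory);
  }

  // Return up to `limit` memories whose text or tags share a word with the query.
  async recall(opts: { query: string; limit: number }): Promise<Memory[]> {
    const terms = opts.query.toLowerCase().split(/\s+/);
    return this.memories
      .filter(m =>
        terms.some(t => m.text.toLowerCase().includes(t) || m.tags.includes(t)))
      .slice(0, opts.limit);
  }
}
```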