A Discord-like chat application with an LLM-powered insult battle game. Players compete in real-time arenas where creative insults deal damage, judged by AI with configurable personality and criteria.
Live Demo: https://chat.noino.dev
LLM-Powered Combat Game:
- Real-time multiplayer arena (40x40 tile grid)
- Creative insults scored 0-100 by LLM (Mistral 7B)
- Damage and explosion radius based on insult quality
- Configurable judging criteria per channel (pirate-themed, caveman, Gen Alpha slang, etc.)
- AI-generated commentary in matching personality styles
- Simultaneous text and video chat during gameplay
- ~200ms average latency for LLM evaluation (asynchronous processing)
Full-Featured Chat Application:
- Role-based access control (user/admin/super)
- Group and channel organization
- Real-time text, image, and video chat
- Message persistence with MongoDB
- User avatars and file uploads
- WebRTC many-to-many video chat
Stack: MEAN (MongoDB, Express.js, Angular, Node.js)
Real-time Networking:
- Socket.io for text chat and game state synchronization
- Custom packet protocol with tagged union pattern (10Hz game tick rate)
- PeerJS/WebRTC for peer-to-peer video chat
- Asynchronous LLM processing queue to avoid blocking gameplay
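The tagged-union packet pattern mentioned above can be sketched in TypeScript. The tag names and payload fields below are illustrative assumptions, not the project's actual wire format:

```typescript
// Sketch of a tagged-union packet type; variant names and payloads are assumed.
type Packet =
  | { tag: "move"; user: string; pos: [number, number] }
  | { tag: "insult"; user: string; msg: string }
  | { tag: "damage"; user: string; dmg: number };

// Narrowing on the tag gives type-safe access to each variant's payload.
function handlePacket(p: Packet): string {
  switch (p.tag) {
    case "move":
      return `${p.user} moved to (${p.pos[0]}, ${p.pos[1]})`;
    case "insult":
      return `${p.user} says: ${p.msg}`;
    case "damage":
      return `${p.user} took ${p.dmg} damage`;
  }
}
```

The compiler checks the switch exhaustively, so adding a new packet variant forces every handler to be updated.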
Game State Management:
- In-memory session manager for active games
- O(1) player lookups using username as key
- Server-authoritative game loop at 10Hz
- Client-side prediction for smooth movement
LLM Integration:
- Ollama API integration with Mistral 7B Instruct
- Context-aware prompting with configurable criteria
- Dual LLM calls: scoring + commentary generation
- Asynchronous processing with message queue
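The context-aware prompting for the scoring call could be built along these lines; the exact wording sent to Mistral is not shown in this README, so every string below is an assumption:

```typescript
// Hypothetical prompt builder for the scoring call. The judgeStyle and
// judgeCriteria fields come from the channel's session metadata.
function buildJudgePrompt(style: string, criteria: string, insult: string): string {
  return [
    `You are a ${style} judge in an insult battle arena.`,
    `Channel criteria: ${criteria}`,
    `Rate the following insult from 0 to 100. Reply with only the number.`,
    `Insult: "${insult}"`,
  ].join("\n");
}
```

A second call with a different prompt would generate the commentary in the matching personality style.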
WebGL Rendering:
- Custom 2D sprite batching engine
- Immediate-mode UI rendering
- Particle effects for explosions and damage
- Smooth interpolation for all movement
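The interpolation that smooths movement between 10Hz server snapshots reduces to a linear blend; a minimal sketch, with `t` as the normalized time (0..1) since the last update:

```typescript
// Linear interpolation between two scalar values.
function lerp(a: number, b: number, t: number): number {
  return a + (b - a) * t;
}

// Blend a position between the last two server snapshots.
function interpPos(
  prev: [number, number],
  next: [number, number],
  t: number
): [number, number] {
  return [lerp(prev[0], next[0], t), lerp(prev[1], next[1], t)];
}
```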
Backend:
- Generic API route with rich command palette (17 commands)
- JWT authentication with role-based authorization
- Context-aware action filtering based on user privilege
- MongoDB integration for users, groups, channels, and messages
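The privilege-based filtering behind getAvailableActions could be sketched as a role table over the command palette. The real route exposes 17 commands; the three below and their role requirements are illustrative assumptions:

```typescript
// Sketch of context-aware action filtering. Roles match the README
// (user/admin/super); the command list is assumed.
type Role = "user" | "admin" | "super";

const commandRoles: Record<string, Role[]> = {
  sendMessage: ["user", "admin", "super"],
  createChannel: ["admin", "super"],
  promoteUser: ["super"],
};

// Return only the commands the given role is allowed to invoke.
function getAvailableActions(role: Role): string[] {
  return Object.keys(commandRoles).filter((cmd) =>
    commandRoles[cmd].includes(role)
  );
}
```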
Damage Calculation:
radius = score / 10
damage = score * (1 - distance / radius)
Players within the explosion radius take damage that falls off linearly with distance from the origin, reaching zero at the edge of the blast. Insults are evaluated against channel-specific criteria set by group admins.
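The two formulas translate directly to code:

```typescript
// Damage calculation from the formulas above: radius = score / 10,
// damage = score * (1 - distance / radius), zero outside the blast.
function explosionDamage(score: number, distance: number): number {
  const radius = score / 10; // e.g. score 80 → 8-tile radius
  if (distance >= radius) return 0; // outside the explosion radius
  return score * (1 - distance / radius); // linear falloff, zero at the edge
}
```

So a score of 80 deals 80 damage at the origin, 40 at four tiles out, and nothing beyond eight tiles.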
Example Judging Criteria:
- "Insults must be nautical themed" (pirate channel)
- "Only caveman speak allowed" (primitive channel)
- "Must use Gen Alpha slang" (brainrot channel)
Monolithic Component Design:
- ChatComponent handles DOM and Socket.io integration
- WebGLRenderer class manages graphics primitives
- Game class maintains local state and animations
- Separation of concerns within a single component for related systems
Service Layer:
- tcp.srv.ts - Socket management and network observables
- api.srv.ts - Centralized API dispatch and LRU cache
- auth.srv.ts - JWT session management
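The LRU cache described for api.srv.ts can be built on a Map's insertion-order iteration; this is a hypothetical minimal sketch, and the capacity value is an assumption:

```typescript
// Minimal LRU cache: re-inserting on access keeps the Map ordered from
// least to most recently used, so eviction removes the first key.
class LruCache<K, V> {
  private map = new Map<K, V>();
  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    const v = this.map.get(key);
    if (v === undefined) return undefined;
    this.map.delete(key);
    this.map.set(key, v); // re-insert: marks the entry most recently used
    return v;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) {
      this.map.delete(key);
    } else if (this.map.size >= this.capacity) {
      // evict the least recently used entry (first in iteration order)
      this.map.delete(this.map.keys().next().value as K);
    }
    this.map.set(key, value);
  }
}
```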
Session Manager Structure:
sessions: {
  [groupName/channelName]: {
    timer: number,
    players: {
      [username]: {
        pos: [x, y],
        hp: number,
        active: boolean,
        update: boolean
      }
    },
    messages: [{msg: string, user: string, dmg: number}],
    meta: {
      judgeStyle: string,
      judgeCriteria: string,
      maxUsers: number
    }
  }
}

Full CRUD functionality through the generic route with privilege-based filtering:
- User management (create, edit, delete, promote)
- Group management (create, delete, add/remove users)
- Channel management (create, edit, delete)
- Message retrieval and persistence
- Context-aware action filtering (getAvailableActions)
Frontend:
- Angular
- TypeScript
- Custom WebGL renderer
- Socket.io client
- PeerJS (WebRTC)
Backend:
- Node.js / Express
- MongoDB
- Socket.io
- Ollama (LLM inference)
- JWT authentication
Deployment:
- Docker Compose
- Separate PeerJS signaling server (port 9000)
- Main server on port 3000
# Install dependencies
cd client && npm install
cd ../server && npm install
# Build client
cd client && ng build
# Run with Docker (recommended)
cd server && docker-compose up
# Or run manually
cd server && node server.js
# In separate terminal:
cd server/peerjs && node peer-server.js

LLM Setup (tested on RTX 2080 8GB):
# Install Ollama
# Download from https://ollama.ai
# Pull model
ollama pull mistral:7b-instruct
# Run server (Docker will connect automatically)
ollama serve

Performance:
- Game tick rate: 10Hz server-side updates
- LLM latency: ~200ms average per insult (asynchronous)
- Video chat: Many-to-many WebRTC (scales with participant count)
- Message persistence: All chat messages stored in MongoDB
- WebGL rendering: Smooth 60fps with particle effects
Future Improvements:
- PostgreSQL migration for better relational query performance
- LRU cache for frequently accessed user/group data
- Password hashing (currently plaintext)
- Input validation for channel metadata
- WebRTC bandwidth optimization for large channels
- Client-side state prediction improvements
Repository: https://github.com/noinodev/chat/
