A modern, feature-rich Twitch chatbot built with Python 3.11+ and the latest dependencies. This is a complete rewrite of the original MurphyAI bot with improved architecture, better error handling, and modern Python practices.
- OpenAI GPT Integration: Powered by the latest OpenAI API (v1.97.0)
- Context-Aware Responses: Maintains conversation history for better responses
- Smart Rate Limiting: Prevents API abuse with intelligent rate limiting
- Caching System: Reduces API calls with intelligent response caching
- Team-Based Queues: Manage player queues with configurable team sizes
- Overflow Support: Automatic overflow queue when main queue is full
- Advanced Controls: Move users up/down, force join/kick, shuffle teams
- Away System: Users can mark themselves as away with auto-removal
- Dynamic Commands: Add/remove commands on-the-fly
- Command Aliases: Multiple aliases for the same command
- Cooldown Management: Intelligent cooldown system with mod exemptions
- Permission System: Owner, mod, and user permission levels
- Real-time Statistics: Track messages, commands, errors, and uptime
- Health Monitoring: Comprehensive health checks with system metrics
- Persistent State: Automatic state saving and recovery
- Logging: Structured logging with rotation and multiple levels
- Auto-Restart: Automatic restart on crashes with exponential backoff (see the sketch after this list)
- Connection Recovery: Intelligent reconnection with retry logic
- Error Handling: Comprehensive error handling and recovery
- State Persistence: Save and restore bot state across restarts
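To illustrate the auto-restart behaviour described above, here is a minimal exponential-backoff sketch. It is not the bot's actual implementation; the `run_bot` callable and the delay values are placeholders.

```python
import asyncio
import logging

logger = logging.getLogger("murphyai")

async def run_with_backoff(run_bot, max_delay: float = 300.0) -> None:
    """Re-run the bot after crashes, doubling the wait each time (capped)."""
    delay = 1.0
    while True:
        try:
            await run_bot()           # runs until the bot shuts down
            return                    # clean shutdown: stop retrying
        except Exception:
            logger.exception("Bot crashed; restarting in %.0fs", delay)
            await asyncio.sleep(delay)
            delay = min(delay * 2, max_delay)  # exponential backoff
```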
- Python 3.11 or higher
- Twitch account for the bot
- OpenAI API key (optional, for AI features)
- Clone the repository: `git clone <repository-url>`, then `cd MurphyAI2`
- Install dependencies: `pip install -r requirements.txt`
- Configure the bot: `cp env.example .env`, then edit `.env` with your credentials
- Run the bot: `python main.py`
- `TWITCH_TOKEN`: OAuth token from the Twitch Token Generator
- `TWITCH_CLIENT_ID`: Client ID from the Twitch Developer Console
- `TWITCH_INITIAL_CHANNELS`: Comma-separated list of channels to join
- `OPENAI_API_KEY`: For AI features
- `TWITCH_PREFIX`: Command prefix (default: `?`)
- `MOD_PREFIX`: Moderator command prefix (default: `\\`)
- `LOG_LEVEL`: Logging level (default: `INFO`)
See `env.example` for all available configuration options.
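For illustration, a filled-in `.env` might look like the following (every value here is a placeholder, not a real credential):

```
TWITCH_TOKEN=oauth:your_oauth_token_here
TWITCH_CLIENT_ID=your_client_id_here
TWITCH_INITIAL_CHANNELS=channel_one,channel_two
OPENAI_API_KEY=sk-your-key-here
TWITCH_PREFIX=?
MOD_PREFIX=\\
LOG_LEVEL=INFO
```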
- Separation of Concerns: Clean separation between bot logic, state management, and events
- Dependency Injection: Components are loosely coupled and easily testable
- Type Safety: Comprehensive type hints throughout the codebase
- Error Handling: Structured error handling with proper logging
The main bot class that orchestrates all components and handles TwitchIO integration.
StateManager: Handles all bot state persistence and recovery
- User tracking and statistics
- Command counters and usage metrics
- Restart tracking and recovery
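As a rough illustration of the persistence idea (the actual `StateManager` in `core/state.py` may be organized differently), a pickle-based save/load could look like this:

```python
import pickle
from pathlib import Path
from typing import Any

STATE_FILE = Path("state/bot_state.pkl")

def save_state(state: dict[str, Any]) -> None:
    """Serialize the bot state dict to disk."""
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    with STATE_FILE.open("wb") as fh:
        pickle.dump(state, fh)

def load_state() -> dict[str, Any]:
    """Restore the previous state, or start fresh if none exists."""
    if STATE_FILE.exists():
        with STATE_FILE.open("rb") as fh:
            return pickle.load(fh)
    return {"messages": 0, "commands": 0, "errors": 0}
```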
EventHandler: Processes all bot events in a clean, organized manner
- Message processing and routing
- Connection management
- Error handling and recovery
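In spirit, the event layer wraps each message in error handling so one failure never crashes the bot. The names below are hypothetical rather than the real `core/events.py` interface:

```python
import logging

logger = logging.getLogger("murphyai")

class EventRouter:
    """Routes chat messages to the command layer and contains failures."""

    def __init__(self, commands) -> None:
        self.commands = commands

    async def on_message(self, message) -> None:
        try:
            content = message.content or ""
            # '?' is the user prefix; the mod prefix is assumed to be a backslash
            if content.startswith(("?", "\\")):
                await self.commands.dispatch_message(message)
        except Exception:
            logger.exception("Error while handling a chat message")
```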
Command processing and routing system with support for:
- Static commands with counters
- Dynamic commands with aliases
- Permission-based access control
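Conceptually, routing with aliases and permission checks can be sketched like this (the names here are hypothetical, not the actual `commands.py` API):

```python
from typing import Any, Awaitable, Callable

Handler = Callable[[Any, str], Awaitable[None]]  # (message, args) -> awaitable

commands: dict[str, Handler] = {}
aliases: dict[str, str] = {"q": "queue"}          # alias -> canonical name (illustrative)
mod_only: set[str] = {"teamsize", "clearqueue"}

async def dispatch(name: str, message: Any, args: str, is_mod: bool) -> None:
    """Resolve aliases, enforce permissions, then invoke the handler."""
    canonical = aliases.get(name, name)
    handler = commands.get(canonical)
    if handler is None:
        return                      # unknown command: ignore
    if canonical in mod_only and not is_mod:
        return                      # caller lacks permission
    await handler(message, args)
```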
AI integration with OpenAI API including:
- Conversation context management
- Rate limiting and caching
- Error handling and fallbacks
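A stripped-down version of the idea, with a rolling context list and a naive response cache, might look like the following. The model name and helper function are assumptions, not the bot's actual code:

```python
from openai import OpenAI

client = OpenAI()                            # reads OPENAI_API_KEY from the environment
history: list[dict[str, str]] = []           # rolling conversation context
cache: dict[str, str] = {}                   # naive prompt -> reply cache

def ask_ai(prompt: str, max_history: int = 10) -> str:
    """Send the prompt with recent context, caching identical prompts."""
    if prompt in cache:
        return cache[prompt]
    history.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(
        model="gpt-4o-mini",                 # placeholder model name
        messages=history[-max_history:],
    )
    reply = response.choices[0].message.content or ""
    history.append({"role": "assistant", "content": reply})
    cache[prompt] = reply
    return reply
```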
Queue management system for team-based gameplay:
- Multi-queue support (main + overflow)
- Team shuffling and management
- User availability tracking
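The main/overflow behaviour boils down to something like this simplified sketch; the real `queue_manager.py` has more features (teams, availability, forced moves):

```python
from collections import deque

class SimpleQueue:
    """Main queue with a fixed capacity and an overflow queue behind it."""

    def __init__(self, team_size: int = 5, capacity: int = 10) -> None:
        self.team_size = team_size
        self.capacity = capacity
        self.main: deque[str] = deque()
        self.overflow: deque[str] = deque()

    def join(self, user: str) -> str:
        if user in self.main or user in self.overflow:
            return f"{user} is already queued"
        if len(self.main) < self.capacity:
            self.main.append(user)
            return f"{user} joined the main queue"
        self.overflow.append(user)
        return f"{user} joined the overflow queue"

    def leave(self, user: str) -> None:
        """Remove the user and promote the next overflow player if a slot opens."""
        if user in self.main:
            self.main.remove(user)
            if self.overflow:
                self.main.append(self.overflow.popleft())
        elif user in self.overflow:
            self.overflow.remove(user)
```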
- TwitchIO 3.0.1: Modern Twitch API integration
- OpenAI 1.97.0: Latest OpenAI API client
- Python-dotenv 1.0.1: Environment variable management
- PSUtil 6.1.0: System monitoring
- APScheduler 3.10.4: Task scheduling
- Requests 2.32.3: HTTP requests
- Watchdog 6.0.0: File system monitoring
- Googletrans 4.0.0rc1: Translation support
- `?join` - Join the queue
- `?leave` - Leave the queue
- `?Q` - Show current queue
- `?here` - Mark yourself as available
- `?nothere` - Mark yourself as not available
- `?ai <message>` - Chat with AI
- `?joke` - Get a random joke
- `?t <text>` - Translate text to English
- `?coin` - Flip a coin
- `?cannon` - Increment cannon counter
- `?quadra` - Increment quadra counter
- `?penta` - Increment penta counter
- `?botstat` - Show bot statistics
- `\\teamsize <size>` - Set team size
- `\\fleave <username>` - Force remove user from queue
- `\\fjoin <username>` - Force add user to queue
- `\\moveup <username>` - Move user up in queue
- `\\movedown <username>` - Move user down in queue
- `\\shuffle` - Shuffle teams
- `\\clearqueue` - Clear all queues
- `\\addcmd <name> <response>` - Add dynamic command
- `\\delcmd <name>` - Delete dynamic command
- `\\listcmds` - List all dynamic commands
- `\\restart` - Restart the bot
- `\\healthcheck` - Show detailed health information
MurphyAI2/
├── core/ # Core bot components
│ ├── __init__.py
│ ├── bot.py # Main bot class
│ ├── events.py # Event handling
│ └── state.py # State management
├── config.py # Configuration management
├── constants.py # Application constants
├── commands.py # Command processing
├── ai_command.py # AI integration
├── queue_manager.py # Queue management
├── cooldown_manager.py # Cooldown system
├── dynamic_commands.py # Dynamic command system
├── validation_utils.py # Input validation
├── utils.py # Utility functions
├── main.py # Application entry point
├── requirements.txt # Dependencies
└── .env.example # Configuration template
- Static Command: Add a handler to `commands.py`:

  ```python
  async def handle_mycommand(message, args: str) -> None:
      await message.channel.send("My response")
  ```

- Dynamic Command: Use in chat:

  `\\addcmd mycommand My response here`
- Extend StateManager: Add new state properties
- Add Event Handlers: Extend EventHandler class
- Update Configuration: Add new config options
- Add Tests: Create comprehensive tests
- Location: `logs/murphyai.log`
- Rotation: 10 MB max, 5 backup files
- Levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
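These rotation settings map directly onto Python's standard `RotatingFileHandler`; a minimal equivalent configuration (not necessarily how `main.py` sets it up) would be:

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

Path("logs").mkdir(exist_ok=True)   # the handler needs the directory to exist

handler = RotatingFileHandler(
    "logs/murphyai.log",
    maxBytes=10 * 1024 * 1024,      # rotate at 10 MB
    backupCount=5,                  # keep 5 backup files
)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
logging.basicConfig(level=logging.INFO, handlers=[handler])
```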
- `state/bot_state.pkl`: Bot state and statistics
- `state/restart_counter.pkl`: Restart tracking
- `state/ai_cache/`: AI response caching
- `state/command_backups/`: Dynamic command backups
- Use `\\healthcheck` for system health
- Monitor logs for errors and warnings
- Check `?botstat` for runtime statistics
- Backup Data: Save your old `dynamic_commands.json` and state files
- Update Dependencies: Install new requirements
- Update Configuration: Migrate to the new `.env` format
- Test: Run in development mode first
- Deploy: Switch to production mode
- Fork the repository
- Create a feature branch
- Make your changes with proper type hints
- Add tests for new functionality
- Update documentation
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Original MurphyAI bot inspiration
- TwitchIO library for Twitch integration
- OpenAI for AI capabilities
- Python community for excellent libraries
If you encounter issues:
- Check the logs in `logs/murphyai.log`
- Verify your configuration in `.env`
- Use `\\healthcheck` to diagnose problems
- Check the GitHub issues page
- Create a new issue with detailed information
- Complete rewrite with modern Python practices
- Updated to TwitchIO 3.0.1 and OpenAI 1.97.0
- Improved architecture with proper separation of concerns
- Enhanced error handling and recovery
- Better logging and monitoring
- Comprehensive type hints
- Environment-based configuration
- Automated testing structure
- Security improvements