A lightweight, modern desktop chat interface for Ollama with advanced MCP (Model Context Protocol) integration. Built with Electron and Angular for cross-platform compatibility.
- Local AI Chat: Direct integration with Ollama for private, local LLM conversations
- Multi-Model Support: Switch between different Ollama models seamlessly
- Streaming Responses: Real-time streaming chat for responsive interactions
- Session Management: Persistent chat sessions with automatic saving
- Smart Titles: AI-generated conversation titles with emoji enhancement
- Dynamic Tool Discovery: Automatic detection and integration of MCP servers
- Desktop Commander: Built-in file system operations and command execution
- Custom Tools: Support for team-specific and custom MCP tools
- Real-time Tool Execution: Seamless tool calling during conversations
- Dark/Light Themes: System-aware theme switching with manual override
- Responsive Design: Split-pane interface with resizable sidebar
- Keyboard Shortcuts: Efficient navigation and interaction
- Cross-Platform: Native desktop experience on Windows, macOS, and Linux
- Prompt Composer: Enhanced system prompts with intelligent tool guidance
- Conversation Export: Save and share chat sessions
- Multi-Selection: Bulk operations on chat sessions
- Configurable Settings: Customizable Ollama host, models, and MCP servers
1. Download the latest release from the Releases page.

2. Install Ollama on your system:

   ```bash
   # macOS
   brew install ollama

   # Linux
   curl -fsSL https://ollama.ai/install.sh | sh

   # Windows
   # Download from https://ollama.ai/
   ```

3. Start Ollama and pull a model:

   ```bash
   ollama serve
   ollama pull llama3:latest
   ```

4. Run LIT Desktop:

   ```bash
   # Linux AppImage
   chmod +x lit-desktop-*.AppImage
   ./lit-desktop-*.AppImage

   # Windows
   # Run the installer .exe

   # macOS
   # Open the .dmg and drag to Applications
   ```
For enhanced system prompts and better tool integration (optional):

```bash
npm install system-prompt-composer
```

Note: The app works without `system-prompt-composer`, but installing it provides enhanced AI prompts that adapt based on available tools and task complexity.
- Node.js (v16 or later)
- npm or yarn
- Angular CLI: `npm install -g @angular/cli`
- Ollama running locally
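To confirm the prerequisites above are in place, you can check the installed versions:

```bash
node --version     # should report v16 or later
npm --version
ng version         # Angular CLI
ollama --version
```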
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd lit-desktop
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Development mode:

   ```bash
   # Start Angular dev server + Electron
   npm run electron:dev

   # With DevTools open
   npm run electron:dev:devtools
   ```

4. Build for production:

   ```bash
   # Build for current platform
   npm run electron:build

   # Platform-specific builds
   npm run electron:build:windows
   npm run electron:build:mac
   npm run electron:build:linux
   ```
LIT Desktop connects to Ollama at `http://localhost:11434` by default. To use a different host:

1. Open Settings (gear icon or `Ctrl/Cmd + ,`)
2. Update the Ollama Host field
3. Click Save Settings
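After saving, one way to confirm the configured host is reachable is to query Ollama's version endpoint (replace the host and port with your own values):

```bash
# Should return a small JSON payload such as {"version":"..."}
curl http://localhost:11434/api/version
```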
MCP servers are configured via a JSON file at `~/.config/lit-desktop/mcp.json`:

```json
{
  "mcpServers": {
    "desktop-commander": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-everything"],
      "name": "desktop-commander"
    },
    "weather": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-weather"],
      "name": "weather"
    }
  }
}
```
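If a configured server does not show up in the app, a useful sanity check is to launch its command manually with the same arguments as in the config above; an MCP stdio server should start and wait for input rather than exit immediately:

```bash
# Example using the desktop-commander entry from the config above
npx -y @modelcontextprotocol/server-everything
```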
1. Select a model from the dropdown
2. Type your message in the input field
3. Press `Enter` or click Send
4. Watch as the AI responds in real time
LIT Desktop automatically detects when the AI wants to use tools:
You: "What files are in my current directory?"
AI: Let me check that for you.
[Automatically calls desktop-commander.list_directory]
AI: Here are the files in your directory: ...
- New Chat: Click the "+" button or use `Ctrl/Cmd + N`
- Switch Sessions: Click on any session in the sidebar
- Delete Sessions: Right-click → Delete, or use multi-select mode
- Export Chat: Right-click → Export to save as JSON
- `Ctrl/Cmd + N` - New chat session
- `Ctrl/Cmd + ,` - Open settings
- `Ctrl/Cmd + Enter` - Send message
- `Escape` - Cancel current response
- `Ctrl/Cmd + Shift + D` - Toggle DevTools (development)
- Frontend: Angular 16 with Angular Material
- Desktop: Electron 36 for cross-platform native experience
- Styling: SCSS with theme system
- State Management: RxJS with persistent storage
- Chat Engine: Streaming Ollama integration with tool call detection (see the example request after this list)
- MCP Manager: Dynamic tool discovery and execution
- Session Storage: Persistent chat history with indexing
- Theme System: Responsive dark/light mode switching
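As an illustration of the streaming integration, this is roughly the kind of request the chat engine issues against Ollama's `/api/chat` endpoint; the response arrives as newline-delimited JSON chunks (sketch only, not the app's actual code):

```bash
curl -s http://localhost:11434/api/chat -d '{
  "model": "llama3:latest",
  "stream": true,
  "messages": [{ "role": "user", "content": "Hello" }]
}'
# Each output line is a JSON object carrying a partial "message" until "done": true
```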
- Code Style: Follow Angular and TypeScript best practices
- Testing: Add tests for new functionality
- Documentation: Update README for new features
- Commits: Use conventional commit messages
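For example, a conventional commit message might look like this (scope and description are illustrative):

```bash
git commit -m "feat(chat): add conversation export to JSON"
```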
```bash
# Install dependencies
npm install

# Run in development mode
npm run electron:dev

# Run tests
npm test

# Build for production
npm run electron:build
```

```
lit-desktop/
├── src/app/          # Angular application
│   ├── components/   # UI components
│   ├── services/     # Angular services
│   └── utils/        # Utility functions
├── electron/         # Electron main process
│   ├── services/     # Backend services
│   └── ollama/       # Ollama integration
└── build/            # Build configuration
```
"Cannot connect to Ollama"
- Ensure Ollama is running: `ollama serve`
- Check the host setting in Settings
- Verify firewall settings
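If Ollama appears to be running but the app still cannot connect, a quick check is to query the API directly (adjust the host if you changed it in Settings):

```bash
# Should return a JSON list of installed models; an error here points at Ollama or the network, not LIT Desktop
curl http://localhost:11434/api/tags
```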
"No models available"
- Pull a model: `ollama pull llama3:latest`
- Restart LIT Desktop after pulling models
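To see which models are currently available to Ollama:

```bash
ollama list
```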
"Tools not working"
- Check the MCP configuration in `~/.config/lit-desktop/mcp.json`
- Ensure MCP servers are properly installed
- Check the Settings → Tools tab for server status
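A malformed config file is a common cause. One way to confirm the JSON parses cleanly (path assumed from the configuration section above):

```bash
# Prints a confirmation if the file is valid JSON, otherwise throws a parse error
node -e "JSON.parse(require('fs').readFileSync(require('os').homedir() + '/.config/lit-desktop/mcp.json', 'utf8')); console.log('mcp.json is valid JSON')"
```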
"Chat sessions not saving"
- Check write permissions in user data directory
- Clear application data and restart if corrupted
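On Linux, the user data directory is typically under `~/.config/lit-desktop` (assumed path; it may differ per platform). You can verify it exists and is writable with:

```bash
# Assumed location of the Electron user data directory on Linux
ls -ld ~/.config/lit-desktop
touch ~/.config/lit-desktop/.write-test && rm ~/.config/lit-desktop/.write-test && echo "directory is writable"
```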
For detailed logging, start with:

```bash
# Set debug environment
NODE_ENV=development ./lit-desktop-*.AppImage

# Or enable DevTools in settings
```

- Check existing Issues
- Create a new issue with:
  - Operating system and version
  - LIT Desktop version
  - Ollama version
  - Steps to reproduce
  - Error logs (if any)
MIT License - see LICENSE for details.
- Ollama - Local LLM inference engine
- Model Context Protocol - Tool integration standard
- Angular and Electron - Application framework
- Anthropic - MCP protocol development
Built with ❤️ for the local AI community
