Running AI models locally is amazing, but there's a big problem: AI models are HUGE and will quickly fill up your internal storage!
- Single models can be 4GB-70GB+ each
- Multiple models easily consume 100-500GB of space
- Your MacBook's SSD is expensive and limited
- Managing multiple AI apps (Ollama, OpenWebUI, LM Studio) becomes tedious
This kit provides a single command-line interface to manage all your AI applications while keeping models on external storage. Think of it as a "control center" for your local AI setup.
- 💾 Keep your internal drive free - Models live on external SSD
- 🚀 One launcher for everything - Ollama, OpenWebUI, LM Studio
- 🔄 Easy model management - Update, download, remove models
- 🧳 Portable setup - Take your models anywhere
- 🛑 Clean shutdowns - Stop all AI services properly
- 📊 Monitor everything - See what's running and storage usage
This is a personal tool that I'm sharing as-is. I do not actively maintain or support this project.
- ✅ Feel free to fork and modify for your own use
- ✅ Submit pull requests if you want (no guarantee of response)
- ❌ I don't provide support or answer issues
- ❌ No guarantees of updates or bug fixes
Use at your own risk. The MIT license covers the terms.
A comprehensive command-line launcher and management system for AI models on macOS, designed to work seamlessly with external SSD storage. This script provides an elegant interface to launch and manage Ollama, OpenWebUI, and LM Studio applications while keeping your valuable AI models on external storage to save internal disk space.
Perfect for you if:
- ✅ You run AI models locally (Ollama, LM Studio, etc.)
- ✅ You're tired of AI models filling up your MacBook's storage
- ✅ You want a single interface to manage multiple AI applications
- ✅ You need to use models across multiple Mac computers
- ✅ You like clean, organized terminal interfaces
- ✅ You want to easily share models with others
Use cases:
- Developers building AI-powered applications
- Researchers experimenting with different models
- Students learning about AI and machine learning
- Content creators using AI for writing, coding, or analysis
- Teams sharing curated model collections
- Anyone who wants to keep their expensive MacBook SSD free!
- 🚀 Universal Launcher: Single interface for all your AI applications
- 💾 External Storage: Keep models on external SSD to preserve internal storage
- 🔄 Model Management: Update, pull, and remove models interactively
- 📊 System Monitoring: View running processes and storage usage
- 🛑 Complete Shutdown: Kill all AI applications cleanly
- 🎨 Colorful Interface: Beautiful terminal UI with emojis and colors
- 🔗 Symbolic Links: Automatic model directory linking
- 🐳 Docker Integration: Seamless OpenWebUI container management
- ⚡ Performance Optimized: Smart process detection and management
- macOS (tested on macOS 12+)
- Ollama - AI model runtime
- Docker Desktop - For OpenWebUI
- LM Studio (optional) - GUI model interface
- External SSD/Drive - For AI model storage (recommended: USB 3.0+ or Thunderbolt)
- 8GB+ RAM - For running AI models efficiently
- 100GB+ External Storage - AI models can be large (1GB-70GB+ each)
READ THIS BEFORE RUNNING THE SCRIPT! This tool makes several changes to your system:
```
~/.ollama/models → /Volumes/[YOUR-DRIVE]/AI-Models/ollama-models
~/.cache/lm-studio/models → /Volumes/[YOUR-DRIVE]/AI-Models/lm-studio-models
```

Impact: These replace your existing model directories. Any existing models will be backed up with a timestamp first.
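A minimal sketch of the backup-and-link step for the Ollama directory, with `YOUR-DRIVE` standing in for your configured drive name (the script's exact commands may differ):

```bash
# Back up any existing models, then point the old path at the external drive
mv ~/.ollama/models ~/.ollama/models.backup.$(date +%Y%m%d-%H%M%S)
ln -s "/Volumes/YOUR-DRIVE/AI-Models/ollama-models" ~/.ollama/models
```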
```
# On your external drive:
/Volumes/[YOUR-DRIVE]/AI-Models/
├── ollama-models/
├── lm-studio-models/
└── shared-converted/

# Backup directories (if existing models found):
~/.ollama/models.backup.[TIMESTAMP]
~/.cache/lm-studio/models.backup.[TIMESTAMP]
```

- Creates container: `open-webui` (persists after the script ends)
- Downloads image: `ghcr.io/open-webui/open-webui:main` (~500MB)
- Creates volume: `open-webui` for data persistence
- Port binding: 8080 → container port 8080
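A `docker run` invocation consistent with the settings above, as a reference sketch (the script's exact flags may differ):

```bash
docker run -d \
  --name open-webui \
  -p 8080:8080 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```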
- Administrator access: May be required for Docker operations
- External drive: Must remain connected during AI operations
- Homebrew: Will be installed if missing (with your permission)
- Applications: Ollama, Docker Desktop, LM Studio must be installed
- Symbolic links: Remain until manually removed
- Docker containers: Continue running in background
- External drive: Can be disconnected, but models are unavailable until it is reconnected
- Models: Stay on external drive (safe)
```bash
# Remove symbolic links:
rm ~/.ollama/models
rm ~/.cache/lm-studio/models

# Restore original directories (if backups exist):
mv ~/.ollama/models.backup.[TIMESTAMP] ~/.ollama/models
mv ~/.cache/lm-studio/models.backup.[TIMESTAMP] ~/.cache/lm-studio/models

# Remove Docker container:
docker stop open-webui
docker rm open-webui
docker rmi ghcr.io/open-webui/open-webui:main

# Stop AI processes:
pkill ollama
```

What this script accesses:
- 📁 File system: Creates/modifies directories in your home folder
- 🔗 Symbolic links: Redirects model storage to the external drive
- 🌐 Network: Downloads Docker images and AI models from the internet
- 💻 System processes: Starts/stops Ollama, Docker, and LM Studio
- 🐳 Docker daemon: Creates containers and manages images
What this script does NOT do:
- ❌ No data collection: Doesn't send usage data anywhere
- ❌ No internet monitoring: Only downloads when you request models
- ❌ No credential storage: Doesn't save passwords or tokens
- ❌ No system modifications beyond the documented changes above
- ❌ No background telemetry: All operations are local
Recommended security practices:
- ✅ Review the script before running (it's open source!)
- ✅ Use trusted external drives for model storage
- ✅ Keep Docker updated for security patches
- ✅ Monitor disk usage - AI models are large
- ✅ Back up your model collections regularly
```bash
git clone https://github.com/[your-username]/llm-launch-kit.git
cd llm-launch-kit
```

Edit the script to match your setup:
- Open `llm-launch.sh` in your favorite text editor
- Find the configuration section (around line 13):

  ```bash
  # External drive configuration (change these to match your setup)
  EXTERNAL_DRIVE_NAME="YOUR-EXTERNAL-DRIVE"  # ← CHANGE THIS
  ```

- Change `YOUR-EXTERNAL-DRIVE` to your actual external drive name:

  ```bash
  EXTERNAL_DRIVE_NAME="AI-Models-SSD"  # Example
  EXTERNAL_DRIVE_NAME="Samsung-T7"     # Example
  EXTERNAL_DRIVE_NAME="ROCKET-nano"    # Example
  ```
```bash
chmod +x llm-launch.sh
```

For system-wide access, create a symlink:

```bash
sudo ln -sf "$(pwd)/llm-launch.sh" /usr/local/bin/llm-launch
```

Then you can run `llm-launch` from anywhere in your terminal.

```bash
./llm-launch.sh
```

The script will:
- ✅ Check if your external drive is connected
- 📊 Show current model status and any available updates
- 🎯 Present a menu with launch options
1️⃣ Ollama Only
- Starts just the Ollama server
- Perfect for CLI usage and API access (see the example below)
- Uses models from your external drive
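For instance, with the server running you can chat from the terminal or call Ollama's local HTTP API (it listens on port 11434 by default):

```bash
# Chat with a model interactively:
ollama run llama3.2:latest

# Or query the API directly:
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2:latest", "prompt": "Hello!", "stream": false}'
```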
2️⃣ Ollama + OpenWebUI
- Starts Ollama server
- Launches OpenWebUI web interface
- Auto-opens browser to http://localhost:8080
- Full ChatGPT-like web interface
3️⃣ LM Studio
- Launches the LM Studio application
- GUI-based model management
- Uses models from your external drive
4️⃣ Update Models
- Interactive model update menu (the equivalent Ollama CLI commands are shown below)
- Update existing models
- Pull new models
- Remove old models
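If you prefer to manage models by hand, the same operations are available through Ollama's own CLI:

```bash
ollama list                  # Show installed models and their sizes
ollama pull qwen2.5:latest   # Download or update a model
ollama rm codellama:latest   # Remove a model
```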
5️⃣ System Status
- Show all running AI processes
- Display model statistics and storage usage
- Check system health
6️⃣ Kill All Apps
- Safely stop Ollama, Docker, and LM Studio
- Clean shutdown of all AI services
- Frees system resources
The script creates this structure on your external drive:
```
/Volumes/[YOUR-DRIVE]/AI-Models/
├── ollama-models/       # Ollama model storage
├── lm-studio-models/    # LM Studio model storage
└── shared-converted/    # For shared/converted models
```
And creates these symbolic links on your system:
```
~/.ollama/models → /Volumes/[YOUR-DRIVE]/AI-Models/ollama-models
~/.cache/lm-studio/models → /Volumes/[YOUR-DRIVE]/AI-Models/lm-studio-models
```
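You can confirm each link resolves to the external drive with `readlink`:

```bash
readlink ~/.ollama/models            # Should print the ollama-models path above
readlink ~/.cache/lm-studio/models   # Should print the lm-studio-models path above
```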
```bash
EXTERNAL_DRIVE_NAME="YOUR-DRIVE"                # Drive name as shown in /Volumes/
EXTERNAL_DRIVE="/Volumes/$EXTERNAL_DRIVE_NAME"  # Full path to drive
MODELS_DIR="$EXTERNAL_DRIVE/AI-Models"          # Models directory
OPENWEBUI_PORT="8080"                           # Web interface port
OPENWEBUI_CONTAINER_NAME="open-webui"           # Docker container name
```
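A sketch of the drive-presence guard these variables feed into (assumed logic, written to match the error message shown under Troubleshooting):

```bash
# Hypothetical check: abort early if the drive isn't mounted
if [ ! -d "$EXTERNAL_DRIVE" ]; then
  echo "❌ External drive ($EXTERNAL_DRIVE_NAME) not found!"
  exit 1
fi
```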
Yes! Your models are now completely portable. This is one of the best features of this setup.
- All models live on your external drive (e.g., Samsung T7, ROCKET nano)
- The script creates symbolic links that redirect apps to the external drive
- Take your drive to any Mac and run the setup scripts
- Instantly access all your models on the new machine
- Work & Home: Use the same models on your work MacBook and home iMac
- Travel: Take your AI setup anywhere with just your laptop + external drive
- Collaboration: Share your curated model collection with teammates
- Backup: Your models are separate from your system - safe from OS reinstalls
```bash
# 1. Connect your external drive
# 2. Clone this repo on the new Mac
# 3. Edit the drive name in llm-launch.sh
# 4. Run setup:
./setup-links.sh
./llm-launch.sh
# 5. All your models are instantly available!
```

Note: You'll need to install Ollama/Docker/LM Studio on each new Mac, but your models and configurations travel with your drive.
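One way to install those prerequisites on a fresh Mac is via Homebrew (assuming Homebrew is already installed; verify the formula and cask names with `brew search` if they have changed):

```bash
brew install ollama              # Ollama runtime and CLI
brew install --cask docker       # Docker Desktop
brew install --cask lm-studio    # LM Studio (optional)
```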
- Use simple, consistent drive names
- Avoid spaces and special characters
- Examples: `AI-Models-SSD`, `Samsung-T7`, `ROCKET-nano`
- Start with smaller models (7B parameters) to test your setup
- Popular models to try:
  - `llama3.2:latest` - Latest Meta Llama model
  - `qwen2.5:latest` - Excellent general-purpose model
  - `deepseek-r1:latest` - Great for reasoning tasks
  - `codellama:latest` - Specialized for coding
- 7B models: ~4-8GB each
- 13B models: ~8-16GB each
- 34B models: ~20-40GB each
- 70B models: ~40-80GB each
- Use USB 3.0+ or Thunderbolt external drives
- SSD performs much better than HDD for AI models
- Keep external drive connected when using AI apps
- Monitor available space before downloading large models
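A quick capacity check before pulling a large model:

```bash
df -h /Volumes/YOUR-DRIVE              # Free space on the external drive
du -sh /Volumes/YOUR-DRIVE/AI-Models   # Current size of your model collection
```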
❌ External drive (YOUR-DRIVE) not found!
Solutions:
- Ensure the external drive is connected and mounted
- Check that the drive name matches your configuration
- Try `ls /Volumes/` to see available drives
❌ Docker failed to start within 60 seconds
Solutions:
- Start Docker Desktop manually first
- Check Docker Desktop settings
- Ensure sufficient system resources
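To start the daemon by hand and wait until it is ready, something like this works on macOS:

```bash
open -a Docker                                       # Launch Docker Desktop
until docker info >/dev/null 2>&1; do sleep 2; done  # Wait for the daemon
echo "Docker is ready"
```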
⚠️ Ollama not in PATH - skipping update check
Solutions:
- Install Ollama from https://ollama.com
- Restart terminal after installation
- Check that `which ollama` finds the binary
❌ Failed to create symbolic link
Solutions:
- Run with appropriate permissions
- Check external drive is not read-only
- Ensure destination directories exist
- If an existing models directory blocks the link, move it aside first: `mv ~/.ollama/models ~/.ollama/models.backup`
❌ Models not found after setup
Check symbolic links:
```bash
# Verify links are correct:
ls -la ~/.ollama/models
ls -la ~/.cache/lm-studio/models

# Should show: ~/.ollama/models -> /Volumes/[YOUR-DRIVE]/AI-Models/ollama-models
```

Solutions:
- Ensure external drive is connected
- Verify drive name matches configuration
- Check external drive has proper permissions
❌ OpenWebUI not accessible at localhost:8080
Diagnostic steps:
```bash
# Check if the container is running:
docker ps | grep open-webui

# Check container logs:
docker logs open-webui

# Restart the container if needed:
docker restart open-webui
```

❌ Models suddenly unavailable (external drive disconnected)
What happens:
- Symbolic links become broken (normal)
- Ollama will show "no models found"
- OpenWebUI may show connection errors
Solution:
- Reconnect external drive
- Models become available immediately
- No need to restart applications
- The script only manages local AI applications
- No internet connections except for model downloads
- All data stays on your local machine and external drive
- Docker containers use local networking only
This project builds upon the incredible work of the open-source AI community:
- Ollama - For making AI models accessible and easy to run locally
- Open WebUI - For the beautiful web interface
- LM Studio - For the excellent GUI model management
- Docker Community - For containerization that makes deployment seamless
Special thanks to the AI/ML community for sharing knowledge about local model management and the challenges of storage optimization.
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
- Support for Linux distributions
- Integration with more AI applications
- Model conversion utilities
- Performance optimization features
- Better error handling and recovery
MIT License - feel free to use and modify as needed.
Having issues? Check:
- Requirements: Ensure all prerequisite applications are installed
- Configuration: Verify drive name and paths are correct
- Permissions: Make sure script is executable
- Resources: Ensure sufficient disk space and memory
- ✅ Multi-application launcher
- ✅ External drive support
- ✅ Model management
- ✅ System monitoring
- ✅ Complete shutdown functionality
- ✅ Colorful terminal interface
Made with ❤️ for the AI community