Description:
Right now, the project is designed for OpenAI endpoints. I propose adding support for Ollama, which runs LLMs locally via http://localhost:11434. This will allow users to run models offline without requiring API keys.
Proposed Changes:
- Detect when the Base URL points to Ollama (`http://localhost:11434`).
- Update the `/fetch-models` route to query Ollama’s `/api/tags` endpoint (see the sketch after this list).
- Adjust `/save-settings` so it doesn’t require an API key when using Ollama.
- Ensure the model dropdown populates with installed Ollama models (e.g., `tinyllama:latest`, `llama3.2:3b`, ...).
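For the first two bullets, here is a minimal sketch of what the detection logic and the `/fetch-models` route could look like in app.py. The Flask framing, the `is_ollama` helper, and the `base_url` query parameter are assumptions on my part; only the route name and the `/api/tags` endpoint come from the proposal itself:

```python
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

OLLAMA_DEFAULT_URL = "http://localhost:11434"

def is_ollama(base_url: str) -> bool:
    # Assumption: treat any base URL on Ollama's default port as local Ollama.
    return base_url.rstrip("/").endswith(":11434")

@app.route("/fetch-models")
def fetch_models():
    base_url = request.args.get("base_url", OLLAMA_DEFAULT_URL)
    if is_ollama(base_url):
        # Ollama lists installed models at GET /api/tags; each entry has a
        # "name" field such as "tinyllama:latest" or "llama3.2:3b".
        resp = requests.get(f"{base_url.rstrip('/')}/api/tags", timeout=5)
        resp.raise_for_status()
        names = [m["name"] for m in resp.json().get("models", [])]
        return jsonify({"models": names})
    # Otherwise fall back to the existing OpenAI-style model listing.
    ...
```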
Next Steps:
Patch app.py to support Ollama endpoints, update the README with instructions, and test with locally installed models. A rough sketch of the `/save-settings` change follows.
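This reuses the hypothetical `is_ollama` helper from the sketch above; the request field names and the error response are assumptions about how the app currently handles settings, not its actual API:

```python
from flask import jsonify, request

@app.route("/save-settings", methods=["POST"])
def save_settings():
    data = request.get_json(force=True)
    base_url = (data.get("base_url") or "").strip()
    api_key = (data.get("api_key") or "").strip()
    # Local Ollama needs no credentials; only insist on an API key
    # for remote OpenAI-style endpoints.
    if not is_ollama(base_url) and not api_key:
        return jsonify({"error": "API key required for this endpoint"}), 400
    # ... persist the settings exactly as the app already does ...
    return jsonify({"status": "ok"})
```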
Note: I’m open to implementing this feature myself.