WebGUI-Ollama is a Flask-based web application that serves as an interface to the Ollama API. The project provides a server-side rendered (SSR) frontend, allowing users to easily access and interact with their Ollama models.

## Getting Started
Pull the Ollama model of your choice from the official Ollama model library using the Ollama CLI. A single command downloads the model if it is not already present and starts it, for example:

```shell
ollama run llama3.2
```
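Before starting the web frontend, it can be useful to confirm that the local Ollama server is reachable and that the model was pulled. A minimal sketch, assuming Ollama is listening on its default port 11434; `list_local_models` is a hypothetical helper, not part of this repository:

```python
import json
import urllib.request

# Ollama's default API address on a local install.
OLLAMA_URL = "http://127.0.0.1:11434"

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the names of locally pulled models via Ollama's /api/tags endpoint."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

# Example (requires a running Ollama server):
# print(list_local_models())  # e.g. ["llama3.2:latest"]
```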
Clone this WebGUI-Ollama repository to a directory on your local machine using Git.
Change into the project directory:

```shell
cd WebGUI-Ollama
```
Install the required Python packages:

```shell
pip install -r requirements.txt
```
Start the Flask development server:

```shell
python app.py
```

This will start the web application on http://127.0.0.1:5000.

## Accessing the WebGUI-Ollama Interface
Once the application is running, open a web browser and navigate to: http://127.0.0.1:5000
You should now see the WebGUI-Ollama interface, which allows you to interact with your Ollama models using the Ollama API.
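For orientation, the interaction between a Flask SSR frontend and the Ollama API can be sketched roughly as follows. This is a hedged illustration, not the repository's actual app.py: the route, inline template, and `ask_ollama` helper are hypothetical, and it assumes Ollama's standard /api/generate endpoint on port 11434:

```python
import json
import urllib.request

from flask import Flask, request, render_template_string

app = Flask(__name__)

# Minimal inline page: a prompt form plus the model's last answer.
PAGE = """<form method="post">
<textarea name="prompt"></textarea><button>Ask</button>
</form><pre>{{ answer }}</pre>"""

def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    """Send a non-streaming generate request to the local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

@app.route("/", methods=["GET", "POST"])
def index():
    # Only call the model on form submission; render an empty answer otherwise.
    answer = ask_ollama(request.form["prompt"]) if request.method == "POST" else ""
    return render_template_string(PAGE, answer=answer)

# app.run(host="127.0.0.1", port=5000)  # uncomment to serve locally
```

Submitting the form posts the prompt to Ollama and renders the response server-side, which is the essence of the SSR approach this project takes.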