
WebGUI-Ollama

Overview

WebGUI-Ollama is a Flask-based web application that provides an interface to the Ollama API. The project serves a server-side rendered (SSR) frontend, allowing users to easily access and interact with their local Ollama models.

Getting Started
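As a rough illustration of the SSR approach described above — a hedged sketch only, since the actual routes, template, and model name in this repository's app.py may differ — a minimal Flask frontend forwarding a prompt to Ollama's /api/generate endpoint could look like:

```python
# Minimal sketch of a Flask SSR frontend for the Ollama API.
# Assumptions (not taken from this repository): the route layout,
# the hard-coded model name, and the stdlib urllib HTTP client.
import json
import urllib.request

from flask import Flask, request, render_template_string

app = Flask(__name__)

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

PAGE = """
<form method="post">
  <input name="prompt" placeholder="Ask something...">
  <button type="submit">Send</button>
</form>
<pre>{{ answer }}</pre>
"""


def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    """Send a non-streaming generate request to the local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


@app.route("/", methods=["GET", "POST"])
def index():
    # Render the form, and on POST, show the model's answer inline (SSR).
    answer = ""
    if request.method == "POST":
        answer = ask_ollama(request.form["prompt"])
    return render_template_string(PAGE, answer=answer)


if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```

The page is rendered entirely on the server, so no JavaScript is required on the client side.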

Step 1: Install Ollama

Download and install Ollama for your platform from the official Ollama website.

Step 2: Pull a Model

Once Ollama is installed, pull a model onto your local machine. The specific steps will vary depending on the source and format of the model; for a model from the official library, for example:

ollama run llama3.2
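To check which models have been pulled, the local Ollama server exposes a /api/tags endpoint that lists installed models. A small sketch — the endpoint and response shape follow the public Ollama API, but the helper names below are illustrative and not part of this repository:

```python
# Query the local Ollama server for installed models via /api/tags.
# The endpoint and its response shape are the standard Ollama API;
# the helper names here are illustrative, not from this repository.
import json
import urllib.request


def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in payload.get("models", [])]


def list_local_models(base_url: str = "http://127.0.0.1:11434") -> list[str]:
    """Return the names of all models currently pulled on this machine."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

If the list is empty, the WebGUI will have no models to talk to, so it is worth running this check before starting the app.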

Step 3: Clone this Repository

Clone this WebGUI-Ollama repository to a directory on your local machine using Git or another version control system.

Step 4: Navigate into the Project Directory

Change into the project directory using the command line:

cd WebGUI-Ollama

Step 5: Install Requirements

Install the required Python packages for this project by running:

pip install -r requirements.txt

Step 6: Run the Application

Start the Flask development server by running:

python app.py

This will start the web application on http://127.0.0.1:5000.

Accessing the WebGUI-Ollama Interface

Once the application is running, open a web browser and navigate to: http://127.0.0.1:5000

You should now see the WebGUI-Ollama interface, which allows you to interact with your Ollama models using the Ollama API.
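Behind the interface, interacting with a model boils down to HTTP requests against the Ollama API. As a hedged illustration — the payload shape follows the public Ollama /api/chat endpoint, while the helper names are not taken from this repository — a multi-turn chat request can be assembled and sent like this:

```python
# Build and send a request body for Ollama's /api/chat endpoint.
# The payload shape follows the public Ollama API; the helper names
# here are illustrative and not taken from this repository.
import json
import urllib.request


def build_chat_payload(model: str, messages: list[dict],
                       stream: bool = False) -> dict:
    """Assemble the JSON body expected by POST /api/chat."""
    return {"model": model, "messages": messages, "stream": stream}


def chat(model: str, messages: list[dict],
         base_url: str = "http://127.0.0.1:11434") -> str:
    """Send a non-streaming chat request and return the reply text."""
    body = json.dumps(build_chat_payload(model, messages)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Because the messages list carries the full conversation history, appending each user turn and model reply to it is enough to give the model context across turns.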

About

Server-side rendered frontend using Python Flask for interfacing with the Ollama API.
