This service acts as an integration layer for the Volpe Framework. It is designed to bridge the gap between external clients (such as an LLM controller or a user interface) and the core Volpe execution environment.
The service accepts code submissions (structured as Jupyter notebooks) and optional data files, packages them into self-contained Docker environments, and dispatches them to the Volpe Framework for optimization and execution. It also provides a streaming endpoint to relay real-time results back to the client.
To run this service locally, ensure you have the following installed:
- Volpe Framework master and worker up and running on port 8000 (WSL or Linux); refer to the Volpe Framework dev docs (yet to be written).
- Python 3.10 or higher.
- Docker Desktop or Docker Engine: The service requires access to the Docker daemon to build and export images. Ensure Docker is running before starting the application.
- uv (Recommended): This project uses `uv` for fast package management, though standard `pip` can also be used.
- Clone the repository:

  ```bash
  git clone https://github.com/Evolutionary-Algorithms-On-Click/volpe-integration
  ```

- Install dependencies. If using `uv`:

  ```bash
  uv sync
  ```

  Or using standard `pip`:

  ```bash
  pip install -r requirements.txt
  ```
Start the FastAPI server using the fastapi CLI or uvicorn:

```bash
# Using the fastapi CLI
fastapi run app/main.py --port 9000 --reload
```
The service will be available at http://localhost:9000.
You can also run the service using Docker. The service is configured to run on port 6000 inside the container.
```bash
docker-compose up --build
```

The service will be available at http://localhost:6000.
- Build the image:

  ```bash
  docker build -t volpe-integration .
  ```

- Run the container:

  ```bash
  docker run -p 6000:6000 -v /var/run/docker.sock:/var/run/docker.sock volpe-integration
  ```
NOTE: The container needs access to /var/run/docker.sock to be able to build and save the optimization job images.
Endpoint: POST /submit
This endpoint accepts a multipart/form-data request to submit a new optimization job. It builds a `.tar` image archive and sends it to the Volpe Framework.
Parameters:
- `request_data` (Form Field, Required): A JSON string representing the optimization request. It must validate against the `OptimizationRequest` schema and is typically the same notebook structure returned by the EvOC v2 LLM Microservice. Structure:

  ```json
  {
    "user_id": "string",
    "notebook_id": "string",
    "notebook": {
      "cells": [ ... ],
      "requirements": "numpy\npandas"
    },
    "preferences": { ... }
  }
  ```

- `file` (File Field, Optional): A binary file (e.g., `data.csv`) that will be uploaded and placed into the execution container's working directory. If a file named `data.csv` is uploaded, it overrides any default data file.
Response:
Returns a JSON object containing the status, message, and the problem_id.
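A minimal client sketch for this endpoint, using only the standard library and assuming the service runs on `localhost:9000`. The notebook payload is a placeholder and the optional `file` field is omitted; the multipart body is encoded by hand to keep the example dependency-free:

```python
# Hypothetical /submit client sketch. Field names follow the docs above;
# the notebook content and user/notebook IDs are placeholders.
import json
import uuid
import urllib.request

def build_multipart(request_data: str, boundary: str) -> bytes:
    """Encode the request_data form field as a multipart/form-data body."""
    lines = [
        f"--{boundary}",
        'Content-Disposition: form-data; name="request_data"',
        "",
        request_data,
        f"--{boundary}--",
        "",
    ]
    return "\r\n".join(lines).encode()

request_data = json.dumps({
    "user_id": "demo-user",
    "notebook_id": "demo-notebook",
    "notebook": {"cells": [], "requirements": "numpy\npandas"},
    "preferences": {},
})

boundary = uuid.uuid4().hex
body = build_multipart(request_data, boundary)

if __name__ == "__main__":
    req = urllib.request.Request(
        "http://localhost:9000/submit",
        data=body,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    )
    with urllib.request.urlopen(req) as resp:
        # Expected shape per the docs: status, message, problem_id.
        print(json.load(resp))
```

In practice any multipart-capable client (curl, `requests`, httpx) can be used instead of the hand-rolled encoding.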
Endpoint: GET /results/{problem_id}
This endpoint opens a Server-Sent Events (SSE) stream to relay updates from the Volpe Framework for a specific problem.
Response: A text stream of events (standard SSE format).
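Consuming this stream only requires standard SSE framing (`data:` lines terminated by a blank line). A minimal client sketch, assuming the service runs on `localhost:9000`; the `problem_id` here is a placeholder:

```python
# Hypothetical SSE client sketch for GET /results/{problem_id}.
import urllib.request

def iter_sse_events(lines):
    """Group raw SSE lines into events: each event yields the concatenated
    'data:' payload, terminated by a blank line (standard SSE framing)."""
    data = []
    for line in lines:
        line = line.rstrip("\n")
        if line == "":            # a blank line ends the current event
            if data:
                yield "\n".join(data)
                data = []
        elif line.startswith("data:"):
            data.append(line[len("data:"):].lstrip())
    if data:                      # flush a trailing event with no final blank line
        yield "\n".join(data)

if __name__ == "__main__":
    problem_id = "example-problem-id"  # placeholder
    url = f"http://localhost:9000/results/{problem_id}"
    with urllib.request.urlopen(url) as resp:
        for event in iter_sse_events(line.decode() for line in resp):
            print(event)
```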
Endpoint: GET /
Returns a simple status message to verify the service is operational.
When a request is received at /submit, the service performs the following steps:
- Validation: It parses the JSON `request_data` and validates it against the internal Pydantic models.
- Context Creation:
  - It extracts the code cells from the provided notebook.
  - It wraps the user code with a standard template (`wrapper_template.py`) to ensure compatibility with the Volpe runtime.
  - It prepares a temporary directory containing the generated `main.py`, a `Dockerfile`, `requirements.txt`, and the necessary protobuf definitions.
  - If a file was uploaded, it is written to this directory.
- Image Build: It uses the local Docker daemon to build a Docker image tagged with the notebook ID.
- Export: The built image is immediately exported as a tarball stream.
- Dispatch: The tarball and metadata are uploaded to the configured `VOLPE_FRAMEWORK_URL`.
- Cleanup: The Docker image is removed from the local daemon to save space, though the tarball is temporarily cached in `build_artifacts/` and the generated `main.py` in `debug_artifacts/` for debugging purposes.