MLR and vLLM conflict, code not runnable #110

@azaccor

Description

Hello, I know this is a bit old: llm-models/mistral/mistral-7b/01_b_load_inference_vllm.py, but I'm looking for working example code snippets for vLLM on DBR/MLR. Your markdown refers to MLR 14.0 GPU, which is no longer available, and even 14.3 LTS ML GPU on a g5.4xlarge throws an error when trying to import vllm.

I've tried my hand at this, and I can confirm that the following:

%pip install mlflow==2.20.3 vllm==0.4.2 transformers==4.49.0 huggingface_hub==0.27.1 openai==1.65.4 cloudpickle==2.2.1
dbutils.library.restartPython()

allows you to run `from vllm import LLM` and use it in the 15.4 LTS ML GPU environment.
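To be concrete, this is the kind of minimal snippet I'd hope to see run end-to-end after the pinned install above (the model name and sampling settings here are just illustrative placeholders, and it assumes a GPU node like the g5.4xlarge):

```python
# Minimal vLLM sketch, assuming the pinned package versions above
# and a GPU-backed cluster node. Model choice is illustrative.
from vllm import LLM, SamplingParams

# Load the model once per cluster; vLLM manages GPU KV-cache memory itself.
llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.2")

params = SamplingParams(temperature=0.7, max_tokens=128)

# generate() takes a list of prompts and returns one RequestOutput per prompt.
outputs = llm.generate(["What is Databricks?"], params)
for out in outputs:
    print(out.outputs[0].text)
```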

Do you have any currently working examples of using vLLM on Databricks that I can reference?
