about llm endpoint #456

@kyk666123

Description

I want to use VERL for training. In my agent pipeline, some nodes use a local model while others call commercial APIs, and only the open-source local model is being trained.

1. Do I need to pass an `agl.LLM` object (wrapping the commercial LLM's actual endpoint and model name) via the `resources` parameter when initializing the agent class?
2. For the local model, are its endpoint and model name specified directly in VERL's configuration file, and will a `main_llm` field automatically appear in `resources` for me to access?
3. Do both the local model and the commercial API model require calling `llm.get_base_url()` to obtain their final endpoints?

Thank you very much for your time and guidance, I really appreciate any clarification you can provide!
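For concreteness, here is a minimal, self-contained sketch of the setup the question describes. Note the assumptions: the `LLM` dataclass below only mimics the shape of `agl.LLM` (an endpoint plus a model name with a `get_base_url()` accessor) and is not the real class; the `"main_llm"` key follows the convention mentioned in the question, while `"commercial_llm"` and all endpoint/model strings are purely illustrative placeholders.

```python
from dataclasses import dataclass


@dataclass
class LLM:
    """Stand-in for agl.LLM: wraps an endpoint and a model name (sketch only)."""
    endpoint: str
    model: str

    def get_base_url(self) -> str:
        # In this sketch the base URL is simply the stored endpoint.
        return self.endpoint


# The trainable local model (whose endpoint would come from VERL's config
# in a real run) sits under the "main_llm" key; the frozen commercial
# model is passed alongside it in the same resources dict.
resources = {
    "main_llm": LLM(endpoint="http://localhost:8000/v1", model="local-model"),
    "commercial_llm": LLM(endpoint="https://api.example.com/v1", model="vendor-model"),
}

# Each agent node resolves its endpoint the same way, whether the model
# is being trained or only called:
endpoints = {name: llm.get_base_url() for name, llm in resources.items()}
```

The point of the sketch is that both models are addressed uniformly through the resources dict, so agent nodes do not need to know which one is being trained.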


Labels: proxy, question, verl
