Description
I want to use VERL for training an agent pipeline in which some nodes use a local model while others call commercial APIs, and only the open-source local model is being trained. I have three questions:

1. Do I need to pass an `agl.LLM` object (wrapping the commercial LLM's actual endpoint and model name) via the `resources` parameter when initializing the agent class?
2. For the local model, are its endpoint and model name specified directly in VERL's configuration file, and will a `main_llm` field automatically appear in `resources` for me to access?
3. Do both the local model and the commercial API model require calling `llm.get_base_url()` to obtain their final endpoints?
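To make the question concrete, here is a sketch of the `resources` layout I have in mind. The `LLM` class below is a stand-in dataclass I wrote to mimic what I understand `agl.LLM` to hold (an endpoint, a model name, and a `get_base_url()` accessor); the key names `main_llm` and `judge_llm`, the endpoints, and the model names are my own examples, not something from the docs:

```python
from dataclasses import dataclass, field

# Stand-in for agl.LLM: I assume it wraps an endpoint plus a model name.
@dataclass
class LLM:
    endpoint: str
    model: str
    sampling_parameters: dict = field(default_factory=dict)

    def get_base_url(self) -> str:
        # In my understanding, this returns the endpoint the client should call.
        return self.endpoint

# The resources dict as I imagine it:
# - "main_llm":  the trainable local model; I expect VERL to inject its
#                endpoint/model from the config file automatically.
# - "judge_llm": a frozen commercial API model that I would pass in myself.
resources = {
    "main_llm": LLM(endpoint="http://localhost:8000/v1", model="qwen2.5-7b"),
    "judge_llm": LLM(endpoint="https://api.openai.com/v1", model="gpt-4o"),
}

for name, llm in resources.items():
    print(name, llm.get_base_url(), llm.model)
```

Is this roughly the intended usage, i.e. I construct the commercial-API `LLM` entry myself while the `main_llm` entry is filled in for me?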
Thank you very much for your time and guidance—I really appreciate any clarification you can provide!