I'm open to adding more providers, and adding a custom base URL option for the existing ones is definitely a good idea! In the meantime, you might want to have a look at the source code of the …
Pythonista Lab currently supports Llama, ChatGPT, Claude, and Gemini. However, many users, myself included, use APIs from other LLM providers such as OpenRouter, Poe, and Qwen. Since Lab doesn't allow setting a custom base URL, these providers are inaccessible.
I hope Lab can support custom LLM providers, ideally letting users customize request bodies and response processing in Python, so they can adapt to new APIs on their own going forward. For now, I've written a simple magic command (via custom magics) to call z.ai's GLM, but since I couldn't find a way to place the AI-generated output into new cells the way the built-in LLMs do, its usefulness is limited. A rough sketch of that kind of magic is below.
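For reference, here is a minimal sketch of what such a custom magic could look like, assuming the provider exposes an OpenAI-compatible `/chat/completions` endpoint and that Lab's custom magics follow IPython's `register_cell_magic` convention. The base URL, API key, and model id are placeholders, not anything Lab ships with, and the registration hook may need to be swapped for whatever mechanism Lab actually provides.

```python
# Sketch: a cell magic that sends the cell body to an OpenAI-compatible
# endpoint with a custom base URL. BASE_URL, API_KEY and MODEL are
# placeholders; adapt them to your provider (OpenRouter, z.ai, etc.).
import requests

BASE_URL = "https://openrouter.ai/api/v1"  # hypothetical provider base URL
API_KEY = "YOUR_API_KEY"                   # placeholder, not a real key
MODEL = "openai/gpt-4o-mini"               # hypothetical model id


def ask_llm(prompt: str) -> str:
    """Send a single-turn chat completion request and return the reply text."""
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


# If Lab's custom magics follow IPython's convention, this exposes the helper
# as %%llm; otherwise, wire ask_llm() into whatever hook Lab provides.
try:
    from IPython.core.magic import register_cell_magic

    @register_cell_magic
    def llm(line, cell):
        print(ask_llm(cell))
except ImportError:
    pass
```

This only prints the reply into the current cell's output; as noted above, there doesn't seem to be a way for a custom magic to insert the result into a new cell the way the built-in providers do.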