Description
During the same course of experimentation that led me to uncover #67, I also realized that we need more fidelity in terms of which variables get exposed to models as configurable parts of the Provider call.
For context, what I wanted to do was create a Provider that would exclusively serve the Anthropic Haiku model with a maximum output of 1024 tokens (necessary since fixed optimistic pricing + JIT payments mean I can't debit an Operator based on costs I can only discover after the fact). The curl for this looks like:
curl https://api.anthropic.com/v1/messages \
  --header "x-api-key: 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" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data \
'{
    "model": "claude-3-5-haiku-20241022",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Please write a poem about songbirds"}
    ]
}'
When configured in the Provider client, it looks like this:
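The original screenshot of that configuration isn't reproduced here, but the upshot is that every field of the request body ends up as an agent-fillable variable. A rough sketch of the resulting request template (the {{...}} placeholders and the layout are purely illustrative, not the Provider client's actual config format):

# Illustrative stand-in for the screenshot, not the Provider client's real format.
# Everything in {{...}} is a variable the calling model/agent gets to fill in.
curl https://api.anthropic.com/v1/messages \
  --header "x-api-key: <key held by me, injected by the Provider>" \
  --header "anthropic-version: {{anthropic_version}}" \
  --header "content-type: application/json" \
  --data \
'{
    "model": "{{model}}",
    "max_tokens": {{max_tokens}},
    "messages": {{messages}}
}'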
Aside from the #67 issue, this Provider is problematic because what I have actually created is a generic Anthropic API provider: I have no choice but to specify model and max_tokens as mutable variables that the LLM can change. This means anybody with access to the Provider (which wasn't published, so don't try it) can cause me to be debited an arbitrary amount of money by Anthropic while paying a fixed cost. Not good!
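For example, a caller could route something like the following through the Provider (a hypothetical abusive request; the specific model and token count are just there to illustrate the cost blowup, since Claude 3 Opus is far more expensive per token than 3.5 Haiku):

# Hypothetical abuse of the current Provider config: the agent swaps in a much more
# expensive model and a larger max_tokens, while still paying my fixed Haiku-sized price.
curl https://api.anthropic.com/v1/messages \
  --header "x-api-key: <my key, injected by the Provider>" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data \
'{
    "model": "claude-3-opus-20240229",
    "max_tokens": 4096,
    "messages": [
        {"role": "user", "content": "Write the longest story you possibly can"}
    ]
}'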
The solution is to allow me to declare certain variables in advance, so that the only ones that are user (agent) submitted are those I have not defined. In this case I would prefill/lock max_tokens at 1024, model as "claude-3-5-haiku-20241022", and the "anthropic-version" header for good measure (that last one isn't really a vulnerability in the same way the others are, but there's no need to pollute context since the API format is locked in anyway). Then the Operator just has to submit the "messages" array and be done with it.
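Concretely, the locked-down version of the same call would look something like this, with only the messages array left as a blank for the Operator to fill (again, the {{...}} placeholder is illustrative, not a proposed syntax):

# Sketch of the desired behaviour: model, max_tokens, and anthropic-version are
# declared by me in advance and locked; the Operator only ever supplies {{messages}}.
curl https://api.anthropic.com/v1/messages \
  --header "x-api-key: <key held by me, injected by the Provider>" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data \
'{
    "model": "claude-3-5-haiku-20241022",
    "max_tokens": 1024,
    "messages": {{messages}}
}'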