
Conversation


@briteroses briteroses commented Feb 13, 2024

Two fixes:

  • Error handling and retries when the OpenAI API returns a bad response during a top-k call
  • Implemented a logprobs top-k dict for chat models, so the extract_logprobs exact method can now run on any OpenAI model
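As a rough sketch of the second fix: with `logprobs=True, top_logprobs=k`, the chat completions endpoint returns, for each output position, a list of entries carrying `.token` and `.logprob` attributes, which can be collapsed into the same top-k dict shape the completion models already provide. The helper name below is hypothetical and may not match the PR's actual code:

```python
from types import SimpleNamespace


def topk_logprobs_dict(top_logprobs):
    """Build a token -> logprob dict from one position's top_logprobs list.

    Each entry mimics the chat completions response shape: an object with
    .token and .logprob attributes. Hypothetical helper sketching the
    chat-model path so extract_logprobs can treat both model types alike.
    """
    return {entry.token: entry.logprob for entry in top_logprobs}


# Stand-in for one position of choice.logprobs.content[0].top_logprobs:
example = [
    SimpleNamespace(token="yes", logprob=-0.1),
    SimpleNamespace(token="no", logprob=-2.3),
]
topk = topk_logprobs_dict(example)
```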

@briteroses briteroses changed the title Quick fix in OpenAIModel Fleshing out OpenAIModel Feb 20, 2024
max_tokens=1,
logprobs=5,
)
for i in range(self.max_api_retries):

@briteroses briteroses (Author) commented Feb 23, 2024

I didn't see a backoff strategy in the openai client, and it looks like they recommend using other libraries (or implementing from scratch) to do retries with backoff here:

https://cookbook.openai.com/examples/how_to_handle_rate_limits#retrying-with-exponential-backoff

Would you be down for using the tenacity or backoff libraries they mention in that cookbook link, or even keeping the current implementation?
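For reference, the from-scratch variant the cookbook describes is only a few lines; a minimal stdlib-only sketch (hypothetical helper name -- the tenacity and backoff libraries wrap the same idea in a decorator):

```python
import random
import time


def call_with_retries(fn, max_api_retries=5, base_delay=1.0):
    """Call fn(), retrying on any exception with exponential backoff + jitter.

    Mirrors a loop like `for i in range(self.max_api_retries)` around the
    API call: the delay doubles on each attempt and gets random jitter so
    many clients hitting the same rate limit don't retry in lockstep.
    """
    for attempt in range(max_api_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_api_retries - 1:
                raise  # out of retries: propagate the last error
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```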

Owner

I've used and like the tenacity library -- but I thought that after the refactor of the openai Python client library, they recommended against using external libraries for retrying. Do you know whether the cookbook is outdated, or how outdated it is?

Author

I looked around a bit more, and it looks like the same rate-limit guide is in the API documentation itself:

https://platform.openai.com/docs/guides/rate-limits?context=tier-free

The final call is yours on whether to keep a manual backoff strategy or just increase max_retries in the openai client; it should be fine either way!
