hggzm commented Jan 23, 2026

Summary

Adds debug logging for token usage metrics from OpenAI API responses.

Changes

  • Added token usage logging in OpenAIChatModel.send() (see the sketch after this list)
  • Logs prompt_tokens, completion_tokens, and total_tokens
  • Uses existing logger infrastructure at debug level
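
A minimal sketch of what the added logging could look like. The helper name `log_token_usage` is hypothetical; in the actual change the logging lives inside `OpenAIChatModel.send()` and uses the project's existing logger rather than a module-level one.

```python
import logging

logger = logging.getLogger(__name__)

def log_token_usage(response) -> None:
    """Log token usage from an OpenAI chat completion response at debug level."""
    # The usage field may be missing or None (e.g. for some streaming
    # responses), so guard before reading its attributes.
    usage = getattr(response, "usage", None)
    if usage is None:
        return
    logger.debug(
        "OpenAI token usage: prompt_tokens=%d completion_tokens=%d total_tokens=%d",
        usage.prompt_tokens,
        usage.completion_tokens,
        usage.total_tokens,
    )
```

Because the message is emitted at debug level and only when usage data is present, it adds no noise to normal runs; enabling debug logging surfaces the counts for cost tracking.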

Benefits

  • Enables cost tracking and optimization
  • Helps debug context window issues
  • Non-intrusive (debug level only)

Fixes #2119, #2002

Log prompt_tokens, completion_tokens, and total_tokens from OpenAI
API responses when usage data is available. This provides visibility
into token consumption for observability and cost tracking.

Addresses Issues #2119, #2002

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
