Adds support for logging LM token usage for streamed responses. Closes #1345 #1346

Merged
merged 2 commits into dotnet:main on Jul 25, 2025

Conversation

waldekmastykarz (Collaborator)

Adds support for logging LM token usage for streamed responses. Closes #1345

@Copilot Copilot AI review requested due to automatic review settings July 25, 2025 11:38
@waldekmastykarz waldekmastykarz added the pr-bugfix Fixes a bug label Jul 25, 2025
@waldekmastykarz waldekmastykarz requested a review from a team as a code owner July 25, 2025 11:38
@Copilot Copilot AI (Contributor) left a comment


Pull Request Overview

This PR adds support for logging language model token usage for streamed responses in the OpenAI telemetry plugin. The enhancement addresses the need to properly extract and log token usage information from Server-Sent Events (SSE) streaming responses.

  • Adds logic to detect streaming responses by checking for text/event-stream content type
  • Implements parsing of streaming response chunks to extract the final token usage data
  • Modifies the response processing flow to handle both regular and streaming responses

@waldekmastykarz waldekmastykarz merged commit dd64272 into dotnet:main Jul 25, 2025
4 checks passed
@waldekmastykarz waldekmastykarz deleted the fix-llm-streaming branch July 25, 2025 12:40
Labels
pr-bugfix Fixes a bug
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[BUG]: Emitting OpenAI usage information fails for streaming requests
2 participants