
fix(llmobs): fix token extraction for chat completion streams #12070

Open — wants to merge 5 commits into base branch main
Conversation

lievan
Contributor

@lievan lievan commented Jan 24, 2025

Fixes token chunk extraction to account for the case where a chunk's choices field is an empty list.

Before

Error generating LLMObs span event for span <Span(id=16151817411339450163,trace_id=137677390470467884790869841527646927357,parent_id=None,name=openai.request)>, likely due to malformed span
Traceback (most recent call last):
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 118, in __aiter__
    await self._extract_token_chunk(chunk)
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 157, in _extract_token_chunk
    choice = getattr(chunk, "choices", [None])[0]
IndexError: list index out of range

After

Traced successfully.
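The fix can be sketched as a defensive guard before indexing into the chunk's choices list. This is a minimal illustration of the pattern described above, not the actual ddtrace implementation; the Chunk class and extract_first_choice function are hypothetical stand-ins for the real OpenAI stream chunk types.

```python
class Chunk:
    """Hypothetical stand-in for an OpenAI streamed chat completion chunk."""

    def __init__(self, choices):
        self.choices = choices


def extract_first_choice(chunk):
    # Some streams (e.g. Azure OpenAI) can emit chunks whose ``choices``
    # list is empty, so check for that before indexing. The original code
    # used ``getattr(chunk, "choices", [None])[0]``, which raises
    # IndexError when ``choices`` exists but is ``[]``.
    choices = getattr(chunk, "choices", None)
    if not choices:
        return None
    return choices[0]


print(extract_first_choice(Chunk([])))      # None, instead of IndexError
print(extract_first_choice(Chunk(["c0"])))  # c0
```

Returning None for an empty choices list lets the caller skip token extraction for that chunk rather than failing span generation.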

Checklist

  • PR author has checked that all the criteria below are met
  • The PR description includes an overview of the change
  • The PR description articulates the motivation for the change
  • The change includes tests OR the PR description describes a testing strategy
  • The PR description notes risks associated with the change, if any
  • Newly-added code is easy to change
  • The change follows the library release note guidelines
  • The change includes or references documentation updates if necessary
  • Backport labels are set (if applicable)

Reviewer Checklist

  • Reviewer has checked that all the criteria below are met
  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Newly-added code is easy to change
  • Release note makes sense to a user of the library
  • If necessary, author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy


github-actions bot commented Jan 24, 2025

CODEOWNERS have been resolved as:

releasenotes/notes/fix-token-extraction-0133808742374ef4.yaml           @DataDog/apm-python
ddtrace/contrib/internal/openai/utils.py                                @DataDog/ml-observability

@datadog-dd-trace-py-rkomorn

datadog-dd-trace-py-rkomorn bot commented Jan 24, 2025

Datadog Report

Branch report: evan.li/fix-azure-openai
Commit report: 426d52a
Test service: dd-trace-py

✅ 0 Failed, 130 Passed, 1378 Skipped, 4m 30.37s Total duration (35m 51.08s time saved)

@pr-commenter

pr-commenter bot commented Jan 24, 2025

Benchmarks

Benchmark execution time: 2025-01-24 22:24:42

Comparing candidate commit 426d52a in PR branch evan.li/fix-azure-openai with baseline commit 1811db1 in branch main.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 384 metrics, 0 unstable metrics.

@lievan lievan changed the title fix(llmobs): be more defensive on extracting streamed tokens fix(llmobs): fix token extraction for chat completion streams Jan 24, 2025
@lievan lievan marked this pull request as ready for review January 24, 2025 17:32
@lievan lievan requested a review from a team as a code owner January 24, 2025 17:32
@lievan lievan requested a review from a team as a code owner January 24, 2025 17:56

@Yun-Kim Yun-Kim left a comment


Approving, as this fix is low-risk and high-priority, and we're currently blocked by our testing framework (vcrpy). Let's follow up on this PR with tests next week.

@lievan lievan enabled auto-merge (squash) January 24, 2025 21:44