Conversation

@RulaKhaled (Member)

When Sentry instruments the OpenAI client, we return an instrumented promise. However, the OpenAI SDK actually returns a custom promise-like object with extra methods such as .withResponse(), so returning a plain Promise breaks those methods.

We added wrapPromiseWithMethods() to proxy SDK methods back onto the instrumented promise. However, for streaming calls, .withResponse() returned the original uninstrumented stream, so we fixed it by intercepting .withResponse() in the proxy:

  • call the original .withResponse() to get the metadata (response, request_id)
  • await the instrumented promise
  • return the same wrapper object, but swap the data field to the instrumented stream
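The steps above can be sketched as a Proxy get trap. This is an illustrative mock of the approach, not the actual SDK integration code; the function name mirrors the helper mentioned above, but the body is an assumption:

```javascript
// Illustrative sketch: intercept .withResponse() on the proxied promise.
// `originalApiPromise` stands in for the OpenAI SDK's custom promise-like object.
function wrapPromiseWithMethods(originalApiPromise, instrumentedPromise) {
  return new Proxy(originalApiPromise, {
    get(target, prop) {
      if (prop === 'withResponse' && typeof target.withResponse === 'function') {
        return async () => {
          // 1. Call the original .withResponse() to get the metadata.
          const { response, request_id } = await target.withResponse();
          // 2. Await the instrumented promise.
          const data = await instrumentedPromise;
          // 3. Return the same wrapper shape, with `data` swapped to the instrumented result.
          return { data, response, request_id };
        };
      }
      // Promise built-ins come from the instrumented promise so that awaiting
      // the proxy runs through Sentry tracing; everything else stays on the
      // original SDK object.
      const source = prop in Promise.prototype ? instrumentedPromise : target;
      const value = source[prop];
      return typeof value === 'function' ? value.bind(source) : value;
    },
  });
}
```

Awaiting the wrapped object resolves the instrumented promise, while `.withResponse()` still returns the SDK's wrapper shape.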

Closes #19073

linear bot commented Feb 2, 2026

github-actions bot commented Feb 2, 2026

Codecov Results 📊


Generated by Codecov Action

github-actions bot commented Feb 2, 2026

size-limit report 📦

⚠️ Warning: The base artifact is not the latest one because the latest workflow run has not finished yet. This may lead to incorrect results; try re-running all tests to get up-to-date results.

| Path | Size | % Change | Change |
| --- | --- | --- | --- |
| @sentry/browser | 25.43 kB | - | - |
| @sentry/browser - with treeshaking flags | 23.9 kB | - | - |
| @sentry/browser (incl. Tracing) | 42.27 kB | - | - |
| @sentry/browser (incl. Tracing, Profiling) | 46.92 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) | 80.91 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 70.52 kB | - | - |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 85.61 kB | - | - |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 97.79 kB | - | - |
| @sentry/browser (incl. Feedback) | 42.15 kB | - | - |
| @sentry/browser (incl. sendFeedback) | 30.11 kB | - | - |
| @sentry/browser (incl. FeedbackAsync) | 35.13 kB | - | - |
| @sentry/browser (incl. Metrics) | 26.54 kB | - | - |
| @sentry/browser (incl. Logs) | 26.69 kB | - | - |
| @sentry/browser (incl. Metrics & Logs) | 27.36 kB | - | - |
| @sentry/react | 27.14 kB | - | - |
| @sentry/react (incl. Tracing) | 44.52 kB | - | - |
| @sentry/vue | 29.87 kB | - | - |
| @sentry/vue (incl. Tracing) | 44.09 kB | - | - |
| @sentry/svelte | 25.44 kB | - | - |
| CDN Bundle | 27.97 kB | - | - |
| CDN Bundle (incl. Tracing) | 43.04 kB | - | - |
| CDN Bundle (incl. Logs, Metrics) | 28.82 kB | - | - |
| CDN Bundle (incl. Tracing, Logs, Metrics) | 43.87 kB | - | - |
| CDN Bundle (incl. Replay, Logs, Metrics) | 67.75 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) | 79.8 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Logs, Metrics) | 80.67 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 85.23 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) | 86.13 kB | - | - |
| CDN Bundle - uncompressed | 81.83 kB | - | - |
| CDN Bundle (incl. Tracing) - uncompressed | 127.54 kB | - | - |
| CDN Bundle (incl. Logs, Metrics) - uncompressed | 84.66 kB | - | - |
| CDN Bundle (incl. Tracing, Logs, Metrics) - uncompressed | 130.37 kB | - | - |
| CDN Bundle (incl. Replay, Logs, Metrics) - uncompressed | 208.04 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 244.14 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Logs, Metrics) - uncompressed | 246.96 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 256.94 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback, Logs, Metrics) - uncompressed | 259.75 kB | - | - |
| @sentry/nextjs (client) | 46.87 kB | - | - |
| @sentry/sveltekit (client) | 42.66 kB | - | - |
| @sentry/node-core | 52.18 kB | - | - |
| @sentry/node | 166.52 kB | +0.14% | +220 B 🔺 |
| @sentry/node - without tracing | 93.97 kB | - | - |
| @sentry/aws-serverless | 109.48 kB | - | - |

View base workflow run

github-actions bot commented Feb 2, 2026

node-overhead report 🧳

Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.
⚠️ Warning: The base artifact is not the latest one because the latest workflow run has not finished yet. This may lead to incorrect results; try re-running all tests to get up-to-date results.

| Scenario | Requests/s | % of Baseline | Prev. Requests/s | Change % |
| --- | --- | --- | --- | --- |
| GET Baseline | 8,891 | - | 8,917 | -0% |
| GET With Sentry | 1,605 | 18% | 1,621 | -1% |
| GET With Sentry (error only) | 5,856 | 66% | 6,016 | -3% |
| POST Baseline | 1,158 | - | 1,207 | -4% |
| POST With Sentry | 542 | 47% | 582 | -7% |
| POST With Sentry (error only) | 1,011 | 87% | 1,052 | -4% |
| MYSQL Baseline | 3,165 | - | 3,330 | -5% |
| MYSQL With Sentry | 419 | 13% | 421 | -0% |
| MYSQL With Sentry (error only) | 2,512 | 79% | 2,679 | -6% |

View base workflow run

```js
const result = client.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Test withResponse' }],
});
```

Test doesn't cover streaming .withResponse() regression

Low Severity

The PR description states the fix is specifically for streaming calls where .withResponse() returned the original uninstrumented stream. However, the test scenario only tests non-streaming calls (no stream: true parameter). According to the review rules for fix PRs, tests should verify the specific regression being fixed. A streaming test case with .withResponse() would provide confidence that the core bug is actually fixed. Flagging this because the review rules file specifies that fix PRs should include tests for the specific regression.
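A streaming test along the lines the comment suggests might look like this. This is an illustrative sketch only: it assumes the Sentry-instrumented OpenAI `client` from the existing test setup, and the function name is hypothetical, not the actual test code.

```javascript
// Hypothetical streaming regression test for .withResponse();
// `client` is assumed to be the instrumented OpenAI client.
async function testStreamingWithResponse(client) {
  const { data: stream, response, request_id } = await client.chat.completions
    .create({
      model: 'gpt-4',
      messages: [{ role: 'user', content: 'Test streaming withResponse' }],
      stream: true,
    })
    .withResponse();

  // The `data` field should be the instrumented stream and still be iterable.
  const chunks = [];
  for await (const chunk of stream) {
    chunks.push(chunk);
  }
  return { chunks, status: response.status, request_id };
}
```

Asserting that the returned stream iterates to completion (and that a Sentry span was recorded) would exercise the specific regression the fix targets.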



```js
// Special handling for .withResponse() to preserve instrumentation
// .withResponse() returns { data: T, response: Response, request_id: string }
if (prop === 'withResponse' && typeof value === 'function') {
```
Member

Q: we only apply special handling here for withResponse, do we not have this issue with other custom OpenAI methods (e.g. asResponse)?

Member Author

The proxy should correctly route the other exposed functions. .withResponse() is different because it wraps the parsed data in an object, and we need to replace that data field with our instrumented version while preserving the response. asResponse specifically returns a Promise, which should already be handled.

Comment on lines +214 to +227

```js
const useInstrumentedPromise = prop in Promise.prototype || prop === Symbol.toStringTag;
const source = useInstrumentedPromise ? instrumentedPromise : target;
```
Member

m: can we add a comment here to explain what's happening here? not super obvious I think at first glance

@RulaKhaled (Member Author) commented Feb 6, 2026

Sure thing. We check whether to use the instrumented promise that carries Sentry tracing, or the original object for custom methods like .withResponse().

Will add a comment
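The clarifying comment could look something like this. This is an illustrative reconstruction of the quoted check, not the exact SDK source; the surrounding function is an assumption added to make it self-contained:

```javascript
// Illustrative get trap around the quoted check, with the explanatory
// comment the reviewer asked for (not the exact SDK source).
function makeGetTrap(instrumentedPromise) {
  return function get(target, prop) {
    // Route Promise built-ins (.then/.catch/.finally) and Symbol.toStringTag
    // to the instrumented promise, so awaiting the proxy runs Sentry tracing;
    // route everything else (custom SDK methods such as .withResponse())
    // to the original object, where those methods actually exist.
    const useInstrumentedPromise = prop in Promise.prototype || prop === Symbol.toStringTag;
    const source = useInstrumentedPromise ? instrumentedPromise : target;
    const value = source[prop];
    // Bind methods to their real owner so native Promise methods are not
    // invoked with the proxy as `this`.
    return typeof value === 'function' ? value.bind(source) : value;
  };
}
```

The `bind(source)` step matters because calling `Promise.prototype.then` with the proxy as the receiver would throw an "incompatible receiver" TypeError.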

@cursor bot left a comment

Cursor Bugbot has reviewed your changes and found 2 potential issues.

Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.

@RulaKhaled force-pushed the rolaabuhasna/js-1594-sentry-openai-instrumentation-breaks-withresponse branch from c4734fd to 36cccca on February 6, 2026 15:01
@RulaKhaled force-pushed the rolaabuhasna/js-1594-sentry-openai-instrumentation-breaks-withresponse branch from 36cccca to 129d794 on February 6, 2026 15:33


Development

Successfully merging this pull request may close these issues.

Sentry OpenAI Instrumentation breaks withResponse()
