🤖 fix: show provider error details in AI debug logs #2791
Conversation
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 4ef8586ffa
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Force-pushed 4ef8586 to 87da28c
@codex review Addressed your feedback by ensuring
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 87da28c814
Handle provider object errors via getErrorMessage, add regression tests, and remove duplicate middleware helper.

---

_Generated with `mux` • Model: `openai:gpt-5.3-codex` • Thinking: `xhigh` • Cost: `$1.95`_
Force-pushed 87da28c to eb75d55
@codex review Applied your follow-up suggestion:
Codex Review: Didn't find any major issues. Keep it up!
Summary
Fix AI Debug Logs error rendering so provider object payloads no longer appear as `Error: [object Object]`.

Background
Some providers return structured error objects rather than `Error` instances. The middleware's local helper and the shared utility previously fell back to `String(error)` for non-`Error` values, collapsing useful payloads into `[object Object]`.

Implementation
Updated `getErrorMessage` to:
- extract `message` from plain objects
- fall back to `JSON.stringify` for object/array payloads
- fall back to `String(error)` when serialization fails (e.g. circular refs)

Removed `extractErrorMessage` from `devToolsMiddleware` and reused shared `getErrorMessage` at all middleware call sites.

Validation
- `bun test src/common/utils/errors.test.ts`
- `bun test src/node/services/__tests__/devToolsMiddleware.test.ts`
- `make typecheck`
- `make static-check`

Risks
Low risk. The change is scoped to error-message normalization and covered by targeted tests plus existing middleware tests.
📋 Implementation Plan
Fix "Error: [object Object]" in AI Debug Logs
Context
The DevTools debug logs panel shows `Error: [object Object]` instead of the actual error message when an AI provider returns an error. LLM providers return plain objects (e.g. `{ message: "rate limit exceeded", code: 429 }`), not `Error` instances, and two utility functions naively call `String(error)` on them, producing `"[object Object]"`.

Root Cause
Two functions with the same bug:
- `extractErrorMessage` in `src/node/services/devToolsMiddleware.ts:30–36` — private helper, 4 call sites within the file (lines 433, 565, 640, 683). Direct cause of the screenshot.
- `getErrorMessage` in `src/common/utils/errors.ts:8–9` — shared utility, ~387 call sites app-wide. Same `String(error)` fallback for non-`Error` values.

`extractErrorMessage` is strictly a worse duplicate of `getErrorMessage` (no cause-chain walking). After fixing `getErrorMessage`, we can eliminate `extractErrorMessage` entirely.

Phase 1 — Red: Write Failing Tests
File: `src/common/utils/errors.test.ts`

Add test cases that capture the exact behavior we want. These will fail against the current `String(error)` fallback.

Verify:
`bun test src/common/utils/errors.test.ts` — expect 3+ failures (plain object, JSON, array tests fail with `"[object Object]"` instead of expected values).

Phase 2 — Green: Fix `getErrorMessage`

File: `src/common/utils/errors.ts` — replace the early return for non-`Error` values (~8 LoC net).

Key behaviors:
- `{ message: "rate limit" }` → `"rate limit"` (extract `.message`)
- `{ code: 429 }` → `'{"code":429}'` (JSON stringify for objects without `.message`)
- `{ message: "", code: 500 }` → `'{"message":"","code":500}'` (empty `.message` treated as absent)
- circular references → `"[object Object]"` (graceful fallback via try/catch around `JSON.stringify`)
- `"boom"`, `42`, `null` → unchanged (`String()`)

Verify:
`bun test src/common/utils/errors.test.ts` — all tests pass, including the existing ones (no regressions on `Error` instances, cause chains, etc.).

Phase 3 — Refactor: Eliminate `extractErrorMessage`

`extractErrorMessage` in devToolsMiddleware.ts is now a strictly worse duplicate. Replace all 4 call sites with the shared utility and delete the private function.

File:
`src/node/services/devToolsMiddleware.ts`
- add `import { getErrorMessage } from "@/common/utils/errors";`
- delete the `extractErrorMessage` function (lines 30–36, ~7 LoC removed)
- `error: extractErrorMessage(error)` → `error: getErrorMessage(error)`
- `streamError = extractErrorMessage(chunk.error)` → `streamError = getErrorMessage(chunk.error)`

Verify:
- `bun test src/common/utils/errors.test.ts` — still green
- `bun test src/node/services/__tests__/devToolsMiddleware.test.ts` — no regressions
- `make typecheck` — clean

Net LoC: ~+15 (product: ~+5, tests: ~+20)
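Taken together, the Phase 2 behaviors above might be sketched like this (a simplified, hypothetical version; the repo's real `getErrorMessage` also walks `Error` cause chains, omitted here):

```typescript
// Hypothetical sketch of the normalization described in Phase 2;
// not the repo's exact implementation.
function getErrorMessage(error: unknown): string {
  if (error instanceof Error) return error.message;
  if (typeof error === "object" && error !== null) {
    // Prefer a non-empty string `message` on plain objects.
    const message = (error as { message?: unknown }).message;
    if (typeof message === "string" && message !== "") return message;
    // Otherwise serialize so the payload survives in the logs.
    try {
      return JSON.stringify(error);
    } catch {
      // JSON.stringify throws on circular references; degrade gracefully.
      return String(error);
    }
  }
  return String(error); // primitives: "boom", 42, null → unchanged
}

console.log(getErrorMessage({ message: "rate limit" })); // "rate limit"
console.log(getErrorMessage({ code: 429 }));             // '{"code":429}'
```

Phase 3 then has the middleware call sites use this shared helper directly instead of the private `extractErrorMessage`.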
Adds ~12 lines to `getErrorMessage` + tests, removes ~7 lines (`extractErrorMessage` + its call-site differences).

Generated with `mux` • Model: `openai:gpt-5.3-codex` • Thinking: `xhigh` • Cost: `$1.95`