
feat(session): expose LLM response headers on assistant messages#26090

Open
jtbnz wants to merge 1 commit into anomalyco:dev from jtbnz:feat/expose-response-headers

Conversation


@jtbnz jtbnz commented May 6, 2026

Issue for this PR

Closes #26091

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

When routing through a LiteLLM proxy with an auto router, the model that was actually selected is reported only in the HTTP response headers (x-litellm-model-api-base, llm_provider-x-ms-deployment-name). The AI SDK captures these headers on finish-step events via value.response.headers, but processor.ts currently discards them.

This adds an optional responseHeaders field to the Assistant message schema and captures value.response.headers in the finish-step handler. The pattern already exists for APIError in the same file.

Two files changed:

  • packages/opencode/src/session/message-v2.ts — added responseHeaders: Schema.optional(Schema.Record(Schema.String, Schema.String))
  • packages/opencode/src/session/processor.ts — read value.response.headers when non-empty and store on the assistant message

The field is optional and messages are stored as JSON, so no migration is needed and existing messages are unaffected. Providers that don't return headers simply won't have the field set.
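A minimal sketch of the two changes, using plain TypeScript interfaces instead of the project's actual Schema library (the real schema change is the `Schema.optional(Schema.Record(...))` field quoted above; the types and function names here are illustrative, not the PR's actual code):

```typescript
// Simplified stand-in for the Assistant message schema in message-v2.ts.
interface AssistantMessage {
  id: string
  role: "assistant"
  // Optional, so existing stored JSON messages deserialize unchanged.
  responseHeaders?: Record<string, string>
}

// Mirrors the finish-step handler logic in processor.ts:
// store headers only when they are present and non-empty.
function captureResponseHeaders(
  message: AssistantMessage,
  headers: Record<string, string> | undefined,
): AssistantMessage {
  if (headers && Object.keys(headers).length > 0) {
    return { ...message, responseHeaders: { ...headers } }
  }
  return message
}

const msg: AssistantMessage = { id: "msg_1", role: "assistant" }
const updated = captureResponseHeaders(msg, {
  "x-litellm-model-api-base": "https://example.azure.com",
})
console.log(updated.responseHeaders?.["x-litellm-model-api-base"])
```

Because the guard skips empty header maps, providers that return no headers produce messages identical to today's, which is why no migration is required.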

Example plugin using this: opencode-routed-model

How did you verify your code works?

  • bun turbo typecheck — all 12 packages pass
  • bun test test/session/ from packages/opencode — 296 pass, 0 fail
  • Manual test against a live LiteLLM proxy with complexity router: confirmed x-litellm-model-api-base and llm_provider-x-ms-deployment-name headers appear on the assistant message and are visible to plugins via message.updated bus events
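A plugin consuming the new field might look like the following sketch. The event payload shape here is an assumption for illustration, not the actual opencode plugin API; only the `responseHeaders` field and the header names come from this PR:

```typescript
// Assumed shape of a message.updated bus event payload (hypothetical).
interface MessageUpdatedEvent {
  message: {
    role: string
    responseHeaders?: Record<string, string>
  }
}

// Returns the deployment the proxy actually routed to, if the
// provider exposed it via the LiteLLM header.
function routedModel(event: MessageUpdatedEvent): string | undefined {
  if (event.message.role !== "assistant") return undefined
  return event.message.responseHeaders?.["llm_provider-x-ms-deployment-name"]
}
```

Since `responseHeaders` is optional, a plugin like this degrades gracefully: against providers that return no headers it simply gets `undefined`.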

Screenshots / recordings

Not a UI change.

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

@github-actions github-actions Bot added needs:compliance This means the issue will auto-close after 2 hours. and removed needs:compliance This means the issue will auto-close after 2 hours. labels May 6, 2026

github-actions Bot commented May 6, 2026

Thanks for updating your PR! It now meets our contributing guidelines. 👍



Development

Successfully merging this pull request may close these issues.

LLM response headers are discarded, preventing plugins from accessing proxy routing metadata
