Documentation Index
Fetch the complete documentation index at: https://docs.getbifrost.ai/llms.txt
Use this file to discover all available pages before exploring further.
Changelog
This release pulls in the OSS v1.4.23 wave: Claude Opus 4.7 support, Anthropic structured outputs, MCP tool annotations, and a large batch of provider-streaming and reliability fixes. The build toolchain reverts to Go 1.26.1.

✨ Features
- Claude Opus 4.7 Support - Full compatibility with Anthropic’s Claude Opus 4.7, including adaptive thinking, the task-budgets beta header, `display` parameter handling, and `xhigh` effort mapping
- Anthropic Structured Outputs - `response_format` and structured output support across chat completions and the Responses API, covering JSON-schema and JSON-object formats with order-preserving merge of additional model request fields
- MCP Tool Annotations - Preserve MCP tool annotations (`title`, `readOnly`, `destructive`, `idempotent`, `openWorld`) in bidirectional MCP ↔ Bifrost chat tool conversion so agents can reason about tool behavior
- Anthropic Server Tools - Anthropic chat schema and Responses converters now surface server-side tools (web search, code execution, computer use containers) end-to-end
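To illustrate the new structured-output support, a chat completion request can carry an OpenAI-style `response_format` with a JSON schema, which this release translates to Anthropic structured outputs. The sketch below only builds such a payload; the model identifier and schema are placeholders, and the exact fields Bifrost accepts should be checked against its API reference:

```python
import json

# Hypothetical payload for an OpenAI-compatible /v1/chat/completions request.
# The "json_schema" response_format shape follows the OpenAI convention.
payload = {
    "model": "anthropic/claude-opus-4-7",  # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Extract the city and country from: 'Paris, France'."}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "location",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
                "additionalProperties": False,
            },
        },
    },
}

print(json.dumps(payload, indent=2))
```

The `type: "json_object"` variant works the same way but constrains the model to valid JSON without a schema.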
🐞 Fixed
- Provider Queue Shutdown Panic - Eliminated `send on closed channel` panics during provider queue shutdown; stale producers transparently re-route to new queues during `UpdateProvider`, with rollback on failed updates
- OpenAI Responses Tool Output - Flattened array-form `tool_result` output into a newline-joined string for the Responses API so strict upstreams (Ollama Cloud, typed openai-go models) no longer reject it with HTTP 400; non-text blocks preserved
- vLLM Token Usage - Treats `delta.content=""` the same as `nil` in streaming so the synthesis chunk retains its `finish_reason`, restoring token usage attribution
- Bedrock Streaming & Tools - Emits `message_stop` for the Anthropic invoke stream, merges `anthropic-beta` headers case-insensitively, and preserves image blocks in tool results when converting Anthropic Messages to Bedrock Converse
- Gemini Tool Outputs & Thinking Level - Handles content-block tool outputs for `function_call_output` and preserves `thinkingLevel` parameters across round-trip conversions with corrected finish-reason mapping
- Responses Streaming Errors - Mid-stream errors in the Responses API are now captured so transport clients see failures instead of silent termination
- Anthropic WebSearch & Fallbacks - Removed the Claude Code user-agent restriction so WebSearch tool arguments flow for all clients; fallback fields are dropped from outgoing Anthropic requests to avoid schema validation errors
- Async Context Propagation - Preserve context values in async requests so downstream handlers retain request-scoped data
- Custom Providers - Allow custom providers that lack a list-models endpoint to accept any model, rather than being restricted at virtual key registration
- OTEL Plugin - `insecure` defaults to `true` in config.json, and emitted OTEL metrics now include fallbacks
- Config Schema Validator - Corrected JSON-path lookups for concurrency and SCIM blocks, and reformatted `transports/config.schema.json` for readability
- Helm Chart - Validation fixes, prerelease tag removed, and `mcpClientConfig` templating corrected
- CI Egress Hardening - `step-security/harden-runner` switched from `audit` to `block` across all GitHub Actions workflows with explicit `allowed-endpoints` per job
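The vLLM token-usage fix comes down to treating an empty-string streaming delta the same as a missing one, so the final synthesis chunk keeps its `finish_reason` and usage. A minimal sketch of that idea in Python (illustrative only, not Bifrost's actual Go implementation; the chunk shapes below are simplified):

```python
def merge_stream_chunks(chunks):
    """Accumulate streaming deltas. An empty-string delta.content is
    treated like a missing one, so the vLLM-style final chunk still
    contributes its finish_reason and token usage."""
    text_parts = []
    finish_reason = None
    usage = None
    for chunk in chunks:
        content = chunk.get("delta", {}).get("content")
        if content:  # both "" and None are skipped here
            text_parts.append(content)
        if chunk.get("finish_reason") is not None:
            finish_reason = chunk["finish_reason"]
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
    return {
        "content": "".join(text_parts),
        "finish_reason": finish_reason,
        "usage": usage,
    }

# vLLM emits a final chunk with empty content that carries the
# finish_reason and usage; it must not be dropped.
chunks = [
    {"delta": {"content": "Hel"}},
    {"delta": {"content": "lo"}},
    {"delta": {"content": ""}, "finish_reason": "stop", "usage": {"total_tokens": 7}},
]
result = merge_stream_chunks(chunks)
```

With the old behavior, the empty-content chunk was handled on a different path and its `finish_reason` and usage were lost; the fix routes it through the same merge.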
🛠️ Build
- Go 1.26.1 - Build toolchain reverted from Go 1.26.2 back to Go 1.26.1
📀 Base OSS version
transports/v1.4.23
