Overview
Passthrough integrations let you call provider-native API paths and payloads through Bifrost without route-level request/response conversion. Requests to passthrough endpoints still flow through Bifrost core logic, so you keep Bifrost features such as logging and observability while sending provider-native paths and bodies.
Endpoints
- /openai_passthrough (default provider: openai)
- /genai_passthrough (default provider: gemini, with automatic Vertex detection for clients configured to use Vertex)
How It Works
- Send your request to a passthrough endpoint (OpenAI or GenAI passthrough).
- The integration strips the passthrough prefix and forwards the remaining provider-native path/body.
- Bifrost handles provider execution through core inference and plugin pipelines.
- Response status, headers, and body are returned as passthrough output (for both stream and non-stream requests).
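The prefix-stripping step above can be sketched as follows. This is an illustrative model of the behavior, not Bifrost's actual internals; the function and constant names are hypothetical.

```python
# Illustrative sketch: the passthrough integration removes the passthrough
# prefix and forwards the remaining provider-native path unchanged.
PASSTHROUGH_PREFIXES = ("/openai_passthrough", "/genai_passthrough")

def to_provider_path(request_path: str) -> str:
    """Strip the passthrough prefix; the remainder is sent to the provider as-is."""
    for prefix in PASSTHROUGH_PREFIXES:
        if request_path.startswith(prefix):
            return request_path[len(prefix):] or "/"
    return request_path

# /openai_passthrough/v1/chat/completions -> /v1/chat/completions
print(to_provider_path("/openai_passthrough/v1/chat/completions"))
```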
Provider Selection Rules
OpenAI Passthrough
- Uses openai as the default provider.
GenAI Passthrough
- Uses gemini by default.
- Automatically switches to vertex when Vertex patterns are detected, such as:
  - URL path containing /projects/{PROJECT_ID}/locations/{LOCATION}/
  - Request body model containing a Vertex resource path
  - OAuth token pattern typically used for Vertex (Bearer ya29...)
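The detection rules above can be expressed as a small heuristic. This sketch mirrors the three listed signals; the function name and exact matching logic are assumptions, not Bifrost's implementation.

```python
import re

def looks_like_vertex(path: str, body: dict, auth_header: str) -> bool:
    """Heuristic mirroring the documented Vertex detection signals."""
    # Signal 1: URL path contains /projects/{PROJECT_ID}/locations/{LOCATION}/
    if re.search(r"/projects/[^/]+/locations/[^/]+/", path):
        return True
    # Signal 2: request body "model" is a Vertex resource path
    if body.get("model", "").startswith("projects/"):
        return True
    # Signal 3: OAuth token pattern typically used for Vertex
    if auth_header.startswith("Bearer ya29."):
        return True
    return False
```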
Usage Examples
OpenAI Passthrough
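A minimal sketch of an OpenAI passthrough call using only Python's standard library, assuming a local Bifrost instance on localhost:8080 (adjust to your deployment); the model name and API key are placeholders. Note that the path after the prefix and the JSON body are exactly what you would send to OpenAI directly.

```python
import json
import urllib.request

BIFROST_URL = "http://localhost:8080"  # assumption: default local deployment

# Provider-native OpenAI payload, passed through unchanged.
payload = {
    "model": "gpt-4o-mini",  # placeholder model
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    BIFROST_URL + "/openai_passthrough/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer $OPENAI_API_KEY",  # placeholder key
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)  # uncomment with a running Bifrost instance
```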
GenAI Passthrough (Gemini)
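A hedged sketch of a provider-native Gemini request through the GenAI passthrough, again assuming Bifrost on localhost:8080; the model name and key are placeholders. The body uses the Gemini REST shape (contents with parts), unchanged from what the Generative Language API expects.

```python
import json
import urllib.request

BIFROST_URL = "http://localhost:8080"  # assumption: default local deployment

# Provider-native Gemini payload, passed through unchanged.
payload = {"contents": [{"parts": [{"text": "Hello"}]}]}

req = urllib.request.Request(
    BIFROST_URL + "/genai_passthrough/v1beta/models/gemini-2.0-flash:generateContent",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "x-goog-api-key": "$GEMINI_API_KEY",  # placeholder key
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)  # uncomment with a running Bifrost instance
```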
GenAI Passthrough (Vertex-style request)
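A hedged Vertex-style sketch: because the URL contains /projects/{PROJECT_ID}/locations/{LOCATION}/ and the token follows the Vertex OAuth pattern, the GenAI passthrough should route this to the vertex provider per the detection rules above. Host, project, location, and token are placeholders; adjust to your deployment.

```python
import json
import urllib.request

BIFROST_URL = "http://localhost:8080"  # assumption: default local deployment
PROJECT_ID, LOCATION = "my-project", "us-central1"  # placeholders

payload = {"contents": [{"parts": [{"text": "Hello"}]}]}

# Vertex resource path in the URL triggers automatic provider switching.
path = (
    f"/genai_passthrough/v1/projects/{PROJECT_ID}/locations/{LOCATION}"
    "/publishers/google/models/gemini-2.0-flash:generateContent"
)
req = urllib.request.Request(
    BIFROST_URL + path,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer ya29.placeholder-oauth-token",  # Vertex-style token
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)  # uncomment with a running Bifrost instance
```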
Notes
- Use passthrough when you need a provider endpoint that is not yet directly supported by Bifrost's integration routes.

