## Setup
### 1. Install Roo Code
Install the Roo Code extension from the VS Code Marketplace.

### 2. Create an API Configuration Profile
- Open Settings (click the gear icon) → Providers
- Click the + button to create a new profile
- Select OpenAI Compatible as the provider
- Configure the following:
| Field | Value |
|---|---|
| Base URL | `http://localhost:8080/openai` (or your Bifrost host, e.g. `https://bifrost.yourcompany.com/openai`) |
| API Key | Your Bifrost virtual key if authentication is enabled; otherwise use `dummy` or leave empty |
| Model | Bifrost model ID in `provider/model` format (e.g. `anthropic/claude-sonnet-4-5-20250929`, `openai/gpt-5`) |

### 3. Verify the Connection
Ask Roo which model it's using; it should respond with the Bifrost model ID you configured (e.g. `anthropic/claude-sonnet-4-5`).
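You can also sanity-check the endpoint before (or instead of) asking Roo. The standard-library sketch below builds the same kind of OpenAI-compatible request Roo Code will send, using the example values from the table above; it assumes Bifrost serves the standard `/chat/completions` route under the configured base URL.

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat completion request for Bifrost.

    Assumes the standard /chat/completions route exists under base_url
    (e.g. http://localhost:8080/openai).
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:  # your Bifrost virtual key, if authentication is enabled
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": model,  # Bifrost provider/model format
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    # data= makes this a POST, which is what the endpoint expects
    return urllib.request.Request(url, data=body, headers=headers)


req = build_chat_request(
    "http://localhost:8080/openai",
    "dummy",
    "anthropic/claude-sonnet-4-5-20250929",
    "Which model are you?",
)
print(req.full_url)
# Send it with urllib.request.urlopen(req) once Bifrost is running.
```

If the request succeeds, Roo Code's profile is pointed at a working gateway; any error here (connection refused, 401) will affect Roo Code the same way.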

## Virtual Keys
When Bifrost has virtual key authentication enabled, set API Key in your Roo Code profile to your virtual key. This lets you enforce usage limits, budgets, and access control per user or team. For team deployments, create a separate configuration profile for each team; each can use a different virtual key with its own rate limits, budgets, and provider access rules configured in the Bifrost dashboard.

## Model Selection
Roo Code lets you assign models per mode (Code, Ask, Architect, Debug, Orchestrator). Use Bifrost model IDs in `provider/model` format to access any configured provider:
- Use powerful models like `openai/gpt-5` or `anthropic/claude-sonnet-4-5-20250929` for complex coding tasks
- Use fast models like `groq/llama-3.3-70b-versatile` for quick completions
- Link different profiles to different modes in the Prompts tab for optimal cost and performance
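The per-mode assignment above amounts to a mapping from mode to model ID. A small illustrative sketch (the specific model choices are examples, not recommendations; in Roo Code itself this mapping lives in the Prompts tab, not in code):

```python
# Illustrative per-mode model assignments for Roo Code's five modes.
# Each value is a Bifrost model ID in provider/model format.
MODE_MODELS = {
    "Code": "anthropic/claude-sonnet-4-5-20250929",   # complex coding
    "Architect": "openai/gpt-5",                      # planning / design
    "Ask": "groq/llama-3.3-70b-versatile",            # quick answers
    "Debug": "anthropic/claude-sonnet-4-5-20250929",  # careful analysis
    "Orchestrator": "openai/gpt-5",                   # task coordination
}


def model_for(mode: str) -> str:
    """Return the Bifrost model ID assigned to a Roo Code mode."""
    return MODE_MODELS[mode]


print(model_for("Ask"))
```

Pairing cheap, fast models with high-frequency modes like Ask and reserving larger models for Code and Architect is the usual cost/performance trade-off.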
## Using Multiple Providers
Bifrost routes requests to the correct provider based on the model name. Use the `provider/model-name` format to access any configured provider through the single OpenAI-compatible endpoint.
### Supported Providers
Bifrost supports the following providers with the `provider/model-name` format:

`openai`, `azure`, `gemini`, `vertex`, `bedrock`, `mistral`, `groq`, `cerebras`, `cohere`, `perplexity`, `xai`, `ollama`, `openrouter`, `huggingface`, `nebius`, `parasail`, `replicate`, `vllm`, `sgl`
Roo Code connects to Bifrost via a single OpenAI-compatible endpoint. Bifrost handles routing to the correct provider based on the model name — no per-provider configuration needed.
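Conceptually, the routing rule is simple: the segment before the first `/` selects the provider, and the remainder is the provider-native model name. The sketch below illustrates that rule using the provider list from this section; it is not Bifrost's actual source code.

```python
# Conceptual sketch of provider/model routing, not Bifrost's real code.
SUPPORTED_PROVIDERS = {
    "openai", "azure", "gemini", "vertex", "bedrock", "mistral", "groq",
    "cerebras", "cohere", "perplexity", "xai", "ollama", "openrouter",
    "huggingface", "nebius", "parasail", "replicate", "vllm", "sgl",
}


def route(model_id: str) -> tuple[str, str]:
    """Split a provider/model-name ID into (provider, model name)."""
    provider, sep, model = model_id.partition("/")
    if not sep or provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"unrecognized provider prefix: {provider!r}")
    return provider, model


print(route("openai/gpt-5"))                   # ('openai', 'gpt-5')
print(route("groq/llama-3.3-70b-versatile"))
```

Because the provider is encoded in the model ID itself, switching providers in Roo Code is just a matter of changing the Model field in the profile; no endpoint or credential changes are needed.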
## MCP Server Integration
Roo Code supports MCP (Model Context Protocol). You can connect it to Bifrost's MCP server to access all tools configured in Bifrost. See MCP Gateway URL for setup instructions.

## Observability
All Roo Code traffic through Bifrost is logged. Monitor it at `http://localhost:8080/logs`, where you can filter by provider or model, or search through conversation content to track usage.
## Next Steps
- Provider Configuration — Configure AI providers in Bifrost
- Virtual Keys — Set up usage limits and access control

