
Bifrost Gateway Installation

30-Second Setup

Get Bifrost running as a blazing-fast HTTP API gateway with zero configuration. Connect to any AI provider (OpenAI, Anthropic, Bedrock, and more) through a unified API that follows OpenAI request/response format.

1. Choose Your Setup Method

Both options work perfectly - choose what fits your workflow:

NPX Binary

# Install and run locally
npx -y @maximhq/bifrost

# Install a specific version
npx -y @maximhq/bifrost --transport-version v1.3.9

Docker

# Pull and run Bifrost HTTP API
docker pull maximhq/bifrost
docker run -p 8080:8080 maximhq/bifrost

# Pull a specific version
docker pull maximhq/bifrost:v1.3.9
docker pull maximhq/bifrost:v1.3.9-amd64
docker pull maximhq/bifrost:v1.3.9-arm64
For Data Persistence
# For configuration persistence across restarts
docker run -p 8080:8080 -v $(pwd)/data:/app/data maximhq/bifrost
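If you prefer Docker Compose, the same persistence setup can be sketched as follows (the ./data path and service name are illustrative; adjust them to your environment):

```yaml
services:
  bifrost:
    image: maximhq/bifrost
    ports:
      - "8080:8080"
    volumes:
      # The mounted volume becomes Bifrost's app directory
      - ./data:/app/data
```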

2. Configuration Flags

| Flag | Default | NPX | Docker | Description |
| --- | --- | --- | --- | --- |
| port | 8080 | -port 8080 | -e APP_PORT=8080 -p 8080:8080 | HTTP server port |
| host | localhost | -host 0.0.0.0 | -e APP_HOST=0.0.0.0 | Host to bind the server to |
| log-level | info | -log-level info | -e LOG_LEVEL=info | Log level (debug, info, warn, error) |
| log-style | json | -log-style json | -e LOG_STYLE=json | Log style (pretty, json) |
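For example, to bind on all interfaces with a custom port and debug logging (flag and variable names as in the table above; the port value 3000 is illustrative):

```
# NPX: custom host, port, and log level
npx -y @maximhq/bifrost -host 0.0.0.0 -port 3000 -log-level debug

# Docker equivalent using environment variables
docker run -e APP_HOST=0.0.0.0 -e APP_PORT=3000 -e LOG_LEVEL=debug \
  -p 3000:3000 maximhq/bifrost
```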
Understanding App Directory

The -app-dir flag determines where Bifrost stores all its data:
# Specify custom directory
npx -y @maximhq/bifrost -app-dir ./my-bifrost-data

# If not specified, creates in your OS config directory:
# • Linux/macOS: ~/.config/bifrost
# • Windows: %APPDATA%\bifrost
What’s stored in app-dir:
  • config.json - Configuration file (optional)
  • config.db - SQLite database for UI configuration
  • logs.db - Request logs database
Note: When using Bifrost via Docker, the volume you mount will be used as the app-dir.

3. Open the Web Interface

Navigate to http://localhost:8080 in your browser:
# macOS
open http://localhost:8080

# Linux
xdg-open http://localhost:8080

# Windows
start http://localhost:8080
🖥️ The Web UI provides:
  • Visual provider setup - Add API keys with clicks, not code
  • Real-time configuration - Changes apply immediately
  • Live monitoring - Request logs, metrics, and analytics
  • Governance management - Virtual keys, usage budgets, and more

4. Test Your First API Call

curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello, Bifrost!"}]
  }'
🎉 That’s it! Bifrost is running and ready to route AI requests.
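The same request can be made from Python using only the standard library; a minimal sketch, assuming Bifrost is listening on localhost:8080:

```python
import json
import urllib.request

# Build the same chat completion request as the curl example above.
payload = {
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello, Bifrost!"}],
}
req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request (requires a running Bifrost instance):
# with urllib.request.urlopen(req) as resp:
#     body = json.load(resp)
#     print(body["choices"][0]["message"]["content"])
```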

What Just Happened?

  1. Zero Configuration Start: Bifrost launched without any config files - everything can be configured through the Web UI or API
  2. OpenAI-Compatible API: All Bifrost APIs follow OpenAI request/response format for seamless integration
  3. Unified API Endpoint: /v1/chat/completions works with any provider (OpenAI, Anthropic, Bedrock, etc.)
  4. Provider Resolution: openai/gpt-4o-mini tells Bifrost to use OpenAI’s GPT-4o Mini model. You can also use bare model names like gpt-4o-mini; Bifrost will automatically resolve the provider via the Model Catalog
  5. Automatic Routing: Bifrost handles authentication, rate limiting, and request routing automatically
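The provider-prefix convention in point 4 can be illustrated with a small sketch (this is not Bifrost's actual resolution code; the Model Catalog lookup is simplified to a hypothetical default provider):

```python
def split_model(model: str, fallback_provider: str = "openai"):
    """Split a 'provider/model' identifier into (provider, model).

    Bare names (no slash) would be resolved via the Model Catalog;
    here we simply fall back to a default provider for illustration.
    """
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    return fallback_provider, model

print(split_model("openai/gpt-4o-mini"))   # explicit provider prefix
print(split_model("gpt-4o-mini"))          # bare name, catalog-style fallback
```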

Two Configuration Modes

Bifrost supports two configuration approaches - you cannot use both simultaneously:

Mode 1: Web UI Configuration

Configuration via UI
The UI is available when either:
  • No config.json file exists (Bifrost auto-creates a SQLite database), or
  • config.json exists with config_store configured

Mode 2: File-based Configuration

When to use: advanced setups, GitOps workflows, or when the UI is not needed.
The full configuration schema is available at https://www.getbifrost.ai/schema.
Create config.json in your app directory:
{
  "$schema": "https://www.getbifrost.ai/schema",
  "client": {
    "drop_excess_requests": false
  },
  "providers": {
    "openai": {
      "keys": [
        {
          "name": "openai-key-1",
          "value": "env.OPENAI_API_KEY",
          "models": ["gpt-4o-mini", "gpt-4o"],
          "weight": 1.0
        }
      ]
    }
  },
  "config_store": {
    "enabled": true,
    "type": "sqlite",
    "config": {
      "path": "./config.db"
    }
  }
}
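Note the env. prefix on the key value above: a value written as env.NAME tells Bifrost to read the key from an environment variable at startup rather than storing it in plain text. A launch might then look like this (the key value and app directory are placeholders):

```
# Export the variable referenced by env.OPENAI_API_KEY in config.json
export OPENAI_API_KEY="sk-..."
npx -y @maximhq/bifrost -app-dir ./my-bifrost-data
```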
Without config_store in config.json:
  • UI is disabled - no real-time configuration possible
  • Read-only mode - config.json is never modified
  • Memory-only - all configurations loaded into memory at startup
  • Restart required - changes to config.json only apply after restart
With config_store in config.json:
  • UI is enabled - full real-time configuration via web interface
  • Database check - Bifrost checks if config store database exists and has data
    • Empty DB: Bootstraps database with config.json settings, then uses DB exclusively
    • Existing DB: Uses database directly, ignores config.json configurations
  • Persistent storage - all changes saved to database immediately
Important for Advanced Users: If you want database persistence but prefer not to use the UI, note that modifying config.json after the initial bootstrap has no effect while config_store is enabled. Use the public HTTP APIs to make configuration changes instead.
The Three Stores Explained:
  • Config Store: Stores provider configs, API keys, MCP settings - Required for UI functionality
  • Logs Store: Stores request logs shown in UI - Optional, can be disabled
  • Vector Store: Used for semantic caching - Optional, can be disabled

PostgreSQL UTF8 Requirement

PostgreSQL 16 or later is required.
For the log store, Bifrost creates materialized views to improve analytics performance. Ensure that the PostgreSQL user has the necessary permissions to perform these operations on the target schema.
If you use PostgreSQL for config_store or logs_store, the target database must use UTF8 encoding. Use template0 when creating the database so PostgreSQL applies UTF8 and locale settings explicitly:
CREATE DATABASE bifrost
  WITH TEMPLATE template0
       ENCODING 'UTF8'
       LC_COLLATE '<your-locale>'
       LC_CTYPE '<your-locale>';
Use locale names that exist in your Postgres image/host (for example, en_US.UTF-8, C.UTF-8, or another installed UTF-8 locale). Verify the database encoding:
SELECT datname, pg_encoding_to_char(encoding) AS encoding
FROM pg_database
WHERE datname = 'bifrost';
If the database is not UTF8, Bifrost startup/migrations can fail with:
simple protocol queries must be run with client_encoding=UTF8
If you already created a SQL_ASCII database, create a new UTF8 database and update your Bifrost DB config to point to it.
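Equivalently, the database can be created from the shell with createdb, which accepts the same template, encoding, and locale options (substitute a UTF-8 locale installed on your host):

```
createdb --template=template0 --encoding=UTF8 \
  --lc-collate="en_US.UTF-8" --lc-ctype="en_US.UTF-8" bifrost
```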

Next Steps

Now that you have Bifrost running, explore these focused guides:

Advanced Topics

  • Tracing - Logging requests for monitoring and debugging
  • MCP Tools - Enable AI models to use external tools (filesystem, web search, databases)
  • Governance - Usage tracking, rate limiting, and cost control
  • Deployment - Production setup and scaling

Happy building with Bifrost! 🚀