Codex CLI provides powerful code generation and completion capabilities directly in your terminal.
To install Codex CLI:

```shell
npm install -g @openai/codex
```
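After installing, you can confirm the `codex` binary is resolvable from your shell. This is a minimal sketch; the `--version` flag is an assumption (use `codex --help` if it differs):

```shell
# Confirm the CLI was installed and is on PATH
if command -v codex >/dev/null 2>&1; then
  codex --version
else
  echo "codex not found - check that npm's global bin directory is on your PATH"
fi
```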
## Configuring Codex CLI to work with Bifrost
Codex CLI supports multiple authentication methods. Choose the one that matches your account type.
### ChatGPT account (OAuth)

If you have a ChatGPT Plus, Pro, Team, Enterprise, or Edu subscription, Codex CLI authenticates via browser-based OAuth, so no API key is needed.

1. Set the Bifrost base URL:

   ```shell
   export OPENAI_BASE_URL=http://localhost:8080/openai
   ```

2. Run Codex and sign in:

   ```shell
   codex
   ```

   Select **Sign in with ChatGPT** and authenticate via your browser. All traffic automatically routes through Bifrost.
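The `export` above only lasts for the current shell session. If you want every new session to route through Bifrost, you can persist the variable in your shell profile. The `~/.zshrc` path below is an assumption; use `~/.bashrc` or your shell's equivalent:

```shell
# Persist the Bifrost base URL so new shell sessions keep routing Codex through Bifrost
echo 'export OPENAI_BASE_URL=http://localhost:8080/openai' >> ~/.zshrc
```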
### API key-based usage

For users with OpenAI API keys or Bifrost virtual keys:

1. Configure the environment variables:

   ```shell
   export OPENAI_API_KEY=your-api-key   # OpenAI API key or Bifrost virtual key
   export OPENAI_BASE_URL=http://localhost:8080/openai
   ```

2. Run Codex:

   ```shell
   codex
   ```

All Codex CLI traffic now flows through Bifrost, giving you access to any provider or model configured in your Bifrost setup, along with observability and governance.
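Before launching Codex, you can sanity-check that the gateway is reachable. This sketch assumes Bifrost exposes the standard OpenAI-compatible `/v1/models` route under the `/openai` prefix configured above:

```shell
# Probe the Bifrost gateway; prints a diagnostic either way
if curl -s -o /dev/null --max-time 3 "$OPENAI_BASE_URL/v1/models" \
     -H "Authorization: Bearer $OPENAI_API_KEY"; then
  echo "Bifrost is reachable"
else
  echo "Bifrost is not reachable - is the gateway running on port 8080?"
fi
```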
## Model Configuration

Use the `--model` flag to start Codex with a specific model:

```shell
codex --model gpt-5-codex
codex --model gpt-5.4-pro
```

Switch models mid-session with the `/model` command:

```
/model gpt-5.4-pro
/model gpt-5-codex
```
## Using Non-OpenAI Models with Codex CLI

Bifrost automatically translates OpenAI API requests to other providers, so you can use Codex CLI with models from Anthropic, Google, Mistral, and more. Use the `provider/model-name` format to specify any Bifrost-configured model.

```shell
# Start with an Anthropic model
codex --model anthropic/claude-sonnet-4-5-20250929

# Start with a Google model
codex --model gemini/gemini-2.5-pro
```

Switch mid-session with `/model`:

```
/model anthropic/claude-sonnet-4-5-20250929
/model mistral/mistral-large-latest
```
## Supported Providers

Bifrost supports the following providers with the `provider/model-name` format:

`openai`, `azure`, `gemini`, `vertex`, `bedrock`, `mistral`, `groq`, `cerebras`, `cohere`, `perplexity`, `xai`, `ollama`, `openrouter`, `huggingface`, `nebius`, `parasail`, `replicate`, `vllm`, `sgl`
Non-OpenAI models must support tool use for Codex CLI to work properly. Codex CLI relies on tool calling for file operations, terminal commands, and code editing. Models without tool use support will fail on most operations.
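One way to check tool support before pointing Codex at a model is to send a hand-written tool-calling request through Bifrost. This is a hedged sketch, not part of Codex CLI itself: the endpoint path, the `run_shell` tool definition, and the model name are illustrative assumptions. A tool-capable model should reply with a `tool_calls` entry; others will error or answer in plain text:

```shell
# Send a minimal tool-calling request through Bifrost's OpenAI-compatible endpoint
curl -s http://localhost:8080/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "anthropic/claude-sonnet-4-5-20250929",
    "messages": [{"role": "user", "content": "List the files in the current directory."}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "run_shell",
        "description": "Run a shell command and return its output",
        "parameters": {
          "type": "object",
          "properties": {"command": {"type": "string"}},
          "required": ["command"]
        }
      }
    }]
  }' || echo "request failed - is Bifrost running?"
```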