- chore: update core version to 1.2.23 and framework version to 1.1.28
- feat: add unified streaming lifecycle events across all providers to fully align with OpenAI’s streaming response types (see the streaming sketch after this list)
- chore: shift from alpha/responses to v1/responses in openrouter provider for responses API
- feat: send back pricing data for models in list models response
- fix: add support for keyless providers in list models request
- feat: add support for custom fine-tuned models in vertex provider
- feat: send deployment aliases in list models response for supported providers
- feat: support for API Key auth in vertex provider
- feat: support for system account in environment for vertex provider
- fix: vertex provider list models now correctly returns custom fine-tuned model IDs in the response
- chore: update core version to 1.2.23
- feat: expose method to get pricing data for a model in model catalog
- feat: add project number and deployments to vertex key config
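The unified streaming lifecycle entry above refers to OpenAI’s published streaming event types. The sketch below is a minimal, hypothetical illustration of consuming such a stream over SSE from a gateway-style endpoint; the URL, port, request payload, and model name are assumptions, not part of this release. Only the event names (`response.created`, `response.output_text.delta`, `response.completed`) come from OpenAI’s streaming response types.

```go
// Hedged sketch: consuming OpenAI-style streaming lifecycle events over SSE.
// The endpoint and payload below are placeholders for illustration only.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"strings"
)

func main() {
	// Hypothetical endpoint; substitute your own deployment.
	body := []byte(`{"model":"openai/gpt-4o-mini","input":"Hello","stream":true}`)
	resp, err := http.Post("http://localhost:8080/v1/responses", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		line := scanner.Text()
		if !strings.HasPrefix(line, "data: ") {
			continue // skip SSE "event:" lines, comments, and blank separators
		}
		payload := strings.TrimPrefix(line, "data: ")
		if payload == "[DONE]" {
			break
		}

		// Each chunk carries a "type" field naming the lifecycle event.
		var event struct {
			Type  string `json:"type"`
			Delta string `json:"delta"`
		}
		if err := json.Unmarshal([]byte(payload), &event); err != nil {
			continue
		}

		switch event.Type {
		case "response.created":
			fmt.Println("-- stream started --")
		case "response.output_text.delta":
			fmt.Print(event.Delta) // incremental text
		case "response.completed":
			fmt.Println("\n-- stream finished --")
		}
	}
}
```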