Example request (cURL):

curl --request POST \
  --url http://localhost:8080/openai/v1/batches/{batch_id}/cancel
{ "id": "<string>", "object": "<string>", "status": "validating", "request_counts": { "total": 123, "completed": 123, "failed": 123, "succeeded": 123, "expired": 123, "canceled": 123, "pending": 123 }, "cancelling_at": 123, "cancelled_at": 123, "extra_fields": { "request_type": "<string>", "provider": "openai", "model_requested": "<string>", "model_deployment": "<string>", "latency": 123, "chunk_index": 123, "raw_request": {}, "raw_response": {}, "cache_debug": { "cache_hit": true, "cache_id": "<string>", "hit_type": "<string>", "provider_used": "<string>", "model_used": "<string>", "input_tokens": 123, "threshold": 123, "similarity": 123 } } }
Cancels a batch processing job.
Note: This endpoint also works without the /v1 prefix (e.g., /openai/batches/{batch_id}/cancel).
POST /openai/v1/batches/{batch_id}/cancel
Path parameters

batch_id: Batch job ID to cancel
provider: Provider for the batch (openai in the example above)
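With those two path parameters filled in, the request needs no body. Below is a minimal Python sketch of the same call as the cURL example; the requests library, the placeholder batch ID, and the absence of auth headers are assumptions about your deployment.

import requests

BASE_URL = "http://localhost:8080"   # local Bifrost address used in the cURL example
provider = "openai"                  # {provider} path segment: provider for the batch
batch_id = "batch_abc123"            # hypothetical batch job ID; substitute a real one

# POST /{provider}/v1/batches/{batch_id}/cancel
resp = requests.post(f"{BASE_URL}/{provider}/v1/batches/{batch_id}/cancel")
resp.raise_for_status()

batch = resp.json()
print(batch["id"], batch["status"])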
Successful response

status: one of validating, failed, in_progress, finalizing, completed, expired, cancelling, canceled, ended
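The presence of both cancelling and canceled states (and the cancelling_at / cancelled_at timestamps) suggests cancellation can complete asynchronously, so the response may report cancelling rather than canceled. A small sketch of branching on the returned status; which states count as terminal is an assumption here, not stated by this endpoint.

# Stand-in for the "status" value returned by the cancel call above.
status = "cancelling"

# Treating these states as terminal is an assumption, not part of this endpoint's docs.
TERMINAL = {"completed", "failed", "expired", "canceled", "ended"}

if status == "cancelling":
    print("Cancellation requested; the batch is still winding down.")
elif status in TERMINAL:
    print(f"Batch already finished with status {status!r}; nothing to cancel.")
else:
    print(f"Batch is still active: {status!r}")  # validating, in_progress, or finalizing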
extra_fields: Additional fields included in responses (see the example response above for its shape)
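extra_fields carries the metadata Bifrost attaches to responses (request type, provider, latency, cache_debug, and so on, as shown in the example response). A small sketch of reading a few of them; the sample values below are stand-ins, not real output.

# Stand-in response body, shaped like the example response above.
batch = {
    "status": "cancelling",
    "extra_fields": {
        "provider": "openai",
        "latency": 123,
        "cache_debug": {"cache_hit": True, "cache_id": "<string>"},
    },
}

extra = batch.get("extra_fields", {})
print("provider:", extra.get("provider"))
print("latency:", extra.get("latency"))

cache = extra.get("cache_debug") or {}
if cache.get("cache_hit"):
    print("cache hit:", cache.get("cache_id"))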