cURL
curl --request GET \
  --url http://localhost:8080/openai/v1/files
{
  "object": "<string>",
  "data": [
    {
      "id": "<string>",
      "object": "<string>",
      "bytes": 123,
      "created_at": 123,
      "filename": "<string>",
      "purpose": "batch",
      "status": "uploaded",
      "status_details": "<string>",
      "expires_at": 123
    }
  ],
  "has_more": true,
  "after": "<string>",
  "extra_fields": {
    "request_type": "<string>",
    "provider": "openai",
    "model_requested": "<string>",
    "model_deployment": "<string>",
    "latency": 123,
    "chunk_index": 123,
    "raw_request": {},
    "raw_response": {},
    "cache_debug": {
      "cache_hit": true,
      "cache_id": "<string>",
      "hit_type": "<string>",
      "provider_used": "<string>",
      "model_used": "<string>",
      "input_tokens": 123,
      "threshold": 123,
      "similarity": 123
    }
  }
}
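Because the response carries a `has_more` flag and an `after` cursor, a client can page through large file lists by repeating the request with the cursor from the previous page. A minimal sketch in Python, assuming a hypothetical `fetch_page` callable that performs the actual GET request (the HTTP call itself is omitted):

```python
from typing import Callable, Dict, List, Optional


def list_all_files(fetch_page: Callable[[Optional[str]], Dict]) -> List[Dict]:
    """Collect every file by following the `after` cursor until `has_more` is false.

    `fetch_page` is a hypothetical helper (not part of Bifrost) that issues
    GET /openai/v1/files?after=<cursor> and returns the parsed JSON body.
    """
    files: List[Dict] = []
    cursor: Optional[str] = None
    while True:
        page = fetch_page(cursor)
        files.extend(page.get("data", []))
        if not page.get("has_more"):
            return files
        # Continue from the cursor the server handed back.
        cursor = page.get("after")
```

The loop terminates as soon as a page reports `has_more: false`, so a single-page listing costs exactly one request.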
Lists uploaded files.
Note: This endpoint also works without the /v1 prefix (e.g., /openai/files).
Query parameters:
- purpose — Filter files by purpose (e.g., batch)
- limit — Maximum number of files to return
- after — Cursor for pagination (the file ID to start listing after)
- order — Sort order: asc or desc
- provider — Filter by provider
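The filters above combine as ordinary query-string parameters. A sketch using Python's standard library to assemble the request URL (the parameter names mirror the descriptions above and should be checked against your deployment):

```python
from urllib.parse import urlencode

BASE_URL = "http://localhost:8080/openai/v1/files"

# Filter to batch files, cap the page size, and sort descending.
params = {"purpose": "batch", "limit": 20, "order": "desc"}
url = f"{BASE_URL}?{urlencode(params)}"
```

The resulting `url` can be passed to any HTTP client, or used directly with `curl --request GET --url "$URL"`.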
Successful response
extra_fields — Additional fields included in responses by the gateway