Count tokens (Anthropic format)

POST /anthropic/v1/messages/count_tokens
Example request:

curl --request POST \
  --url http://localhost:8080/anthropic/v1/messages/count_tokens \
  --header 'Content-Type: application/json' \
  --data '
{
  "model": "claude-3-opus-20240229",
  "max_tokens": 123,
  "messages": [
    {
      "role": "user",
      "content": "<string>"
    }
  ],
  "system": "<string>",
  "metadata": {
    "user_id": "<string>"
  },
  "stream": true,
  "temperature": 0.5,
  "top_p": 123,
  "top_k": 123,
  "stop_sequences": [
    "<string>"
  ],
  "tools": [
    {
      "type": "custom",
      "name": "<string>",
      "description": "<string>",
      "input_schema": {},
      "cache_control": {
        "type": "ephemeral",
        "ttl": "<string>"
      },
      "display_width_px": 123,
      "display_height_px": 123,
      "display_number": 123,
      "enable_zoom": true,
      "max_uses": 123,
      "allowed_domains": [
        "<string>"
      ],
      "blocked_domains": [
        "<string>"
      ],
      "user_location": {
        "type": "approximate",
        "city": "<string>",
        "country": "<string>",
        "timezone": "<string>"
      }
    }
  ],
  "tool_choice": {
    "type": "auto",
    "name": "<string>",
    "disable_parallel_tool_use": true
  },
  "mcp_servers": [
    {
      "type": "<string>",
      "name": "<string>",
      "url": "<string>",
      "authorization_token": "<string>",
      "tool_configuration": {
        "enabled": true,
        "allowed_tools": [
          "<string>"
        ]
      }
    }
  ],
  "thinking": {
    "type": "enabled",
    "budget_tokens": 123
  },
  "output_format": {},
  "fallbacks": [
    "<string>"
  ]
}
'
Example response:

{
  "input_tokens": 123
}
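The same request can be made from Python. The sketch below uses only the standard library and assumes the proxy is listening at `http://localhost:8080`, as in the curl example; `build_count_tokens_payload` and `count_tokens` are illustrative helper names, not part of the API.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed local proxy address, as in the curl example


def build_count_tokens_payload(model, messages, **extra):
    """Assemble a count_tokens request body.

    Optional fields from the schema (system, tools, thinking, ...) can be
    passed as keyword arguments and are forwarded unchanged.
    """
    payload = {"model": model, "messages": messages}
    payload.update(extra)
    return payload


def count_tokens(payload, base_url=BASE_URL):
    """POST the payload to /anthropic/v1/messages/count_tokens and return
    the parsed JSON response, e.g. {"input_tokens": 42}."""
    req = urllib.request.Request(
        f"{base_url}/anthropic/v1/messages/count_tokens",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


payload = build_count_tokens_payload(
    "claude-3-opus-20240229",
    [{"role": "user", "content": "Hello, world"}],
    system="You are a helpful assistant.",
)
# count_tokens(payload) would return a body like {"input_tokens": 123}
```

Because the endpoint only counts tokens, the response body is small and the call is cheap to make before sending the full request to the messages endpoint.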

Body

application/json

model (string, required)
Model identifier (e.g., claude-3-opus-20240229)
Example: "claude-3-opus-20240229"

max_tokens (integer, required)
Maximum tokens to generate

messages (object[], required)
List of messages in the conversation

system
System prompt

metadata (object)

stream (boolean)
Whether to stream the response

temperature (number)
Required range: 0 <= x <= 1

top_p (number)

top_k (integer)

stop_sequences (string[])

tools (object[])

tool_choice (object)

mcp_servers (object[])
MCP servers configuration (requires beta header)

thinking (object)

output_format (object)
Structured output format (requires beta header)

fallbacks (string[])

Response

Successful response

input_tokens (integer)
Number of input tokens
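Tool definitions are part of the request body, so including them should be reflected in the returned input token count. The sketch below builds a payload with a hypothetical `get_weather` tool following the `tools` schema shown above (a "custom" tool with a JSON Schema `input_schema`); the tool name and fields are illustrative, not part of the API.

```python
import json

# Hypothetical tool definition following the tools[] schema above:
# a "custom" tool whose input is described by a JSON Schema.
get_weather_tool = {
    "type": "custom",
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

payload = {
    "model": "claude-3-opus-20240229",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [get_weather_tool],
    "tool_choice": {"type": "auto"},
}

# The payload must serialize cleanly to the application/json body shape
# shown in the curl example.
body = json.dumps(payload)
```

Sending this body to the endpoint, as in the curl example, returns a single `input_tokens` integer covering the messages and the tool definitions together.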