Translation Proxy

The translation proxy is the core of Claudex. It sits between Claude Code and your AI providers, transparently converting between the Anthropic Messages API and OpenAI-style APIs (Chat Completions or Responses, depending on the provider).

```
Claude Code → Anthropic Messages API request
└── Claudex Proxy (127.0.0.1:13456)
    ├── DirectAnthropic provider → forward with headers
    ├── OpenAICompatible provider
    │   ├── Translate request: Anthropic → OpenAI Chat Completions
    │   ├── Forward to provider
    │   └── Translate response: OpenAI → Anthropic
    └── OpenAIResponses provider
        ├── Translate request: Anthropic → OpenAI Responses API
        ├── Forward to provider
        └── Translate response: Responses → Anthropic
```

Request Translation (Anthropic → OpenAI)

| Anthropic | OpenAI |
| --- | --- |
| `system` field | System message in `messages` array |
| `messages[].content` blocks (`text`, `image`, `tool_use`) | `messages[].content` + `tool_calls` |
| `tools` array (JSON Schema) | `tools` array (function format) |
| `tool_choice` | `tool_choice` |
| `max_tokens` | `max_tokens` |
| `temperature`, `top_p` | Direct mapping |
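Claudex's internal implementation isn't shown here, but the request mapping in the table above can be sketched in Python. Field names follow the public Anthropic and OpenAI request schemas; the function name and structure are illustrative, not Claudex's actual code:

```python
import json

def anthropic_to_openai(req: dict) -> dict:
    """Sketch: map an Anthropic Messages request to OpenAI Chat Completions."""
    messages = []
    # The Anthropic `system` field becomes a system message in the array.
    if "system" in req:
        messages.append({"role": "system", "content": req["system"]})
    for msg in req["messages"]:
        content = msg["content"]
        if isinstance(content, str):  # plain-string content passes through
            messages.append({"role": msg["role"], "content": content})
            continue
        text_parts, tool_calls = [], []
        for block in content:  # content blocks split into text + tool_calls
            if block["type"] == "text":
                text_parts.append(block["text"])
            elif block["type"] == "tool_use":
                tool_calls.append({
                    "id": block["id"],
                    "type": "function",
                    "function": {"name": block["name"],
                                 "arguments": json.dumps(block["input"])},
                })
        out = {"role": msg["role"], "content": "\n".join(text_parts) or None}
        if tool_calls:
            out["tool_calls"] = tool_calls
        messages.append(out)
    out_req = {"model": req["model"], "messages": messages,
               "max_tokens": req["max_tokens"]}  # max_tokens maps directly
    for key in ("temperature", "top_p"):  # sampling params map directly
        if key in req:
            out_req[key] = req[key]
    if "tools" in req:  # JSON Schema tools become OpenAI function tools
        out_req["tools"] = [
            {"type": "function",
             "function": {"name": t["name"],
                          "description": t.get("description", ""),
                          "parameters": t["input_schema"]}}
            for t in req["tools"]
        ]
    return out_req
```

Note how a single Anthropic message with mixed `text` and `tool_use` blocks fans out into one OpenAI message carrying both `content` and `tool_calls`.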

Response Translation (OpenAI → Anthropic)

| OpenAI | Anthropic |
| --- | --- |
| `choices[0].message.content` | `content` blocks |
| `choices[0].message.tool_calls` | `tool_use` content blocks |
| `finish_reason: stop` | `stop_reason: end_turn` |
| `finish_reason: tool_calls` | `stop_reason: tool_use` |
| `usage.prompt_tokens` / `completion_tokens` | `usage.input_tokens` / `output_tokens` |
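The reverse direction can be sketched the same way. This is an illustrative rendering of the table above, not Claudex's code; unlisted `finish_reason` values fall back to `end_turn` here as an assumption:

```python
import json

def openai_to_anthropic(resp: dict) -> dict:
    """Sketch: map an OpenAI Chat Completions response to the Anthropic shape."""
    choice = resp["choices"][0]
    msg = choice["message"]
    blocks = []
    if msg.get("content"):  # text content becomes a text block
        blocks.append({"type": "text", "text": msg["content"]})
    for call in msg.get("tool_calls") or []:  # tool_calls become tool_use blocks
        blocks.append({"type": "tool_use",
                       "id": call["id"],
                       "name": call["function"]["name"],
                       "input": json.loads(call["function"]["arguments"])})
    stop_reason = {"stop": "end_turn",
                   "tool_calls": "tool_use"}.get(choice["finish_reason"],
                                                 "end_turn")  # assumed fallback
    return {"type": "message", "role": "assistant",
            "content": blocks, "stop_reason": stop_reason,
            "usage": {"input_tokens": resp["usage"]["prompt_tokens"],
                      "output_tokens": resp["usage"]["completion_tokens"]}}
```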

Claudex fully supports SSE (Server-Sent Events) streaming, translating OpenAI stream chunks into Anthropic stream events in real time:

| OpenAI SSE | Anthropic SSE |
| --- | --- |
| First chunk | `message_start` + `content_block_start` |
| `choices[0].delta.content` | `content_block_delta` (`text_delta`) |
| `choices[0].delta.tool_calls` | `content_block_delta` (`input_json_delta`) |
| `finish_reason` present | `content_block_stop` + `message_delta` + `message_stop` |

The streaming translator maintains a state machine to properly handle tool call accumulation and content block boundaries.
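A minimal, text-only version of that state machine can be sketched as a generator. This sketch tracks only whether the message has started and omits the tool-call accumulation Claudex also handles; all names are illustrative:

```python
def translate_stream(chunks):
    """Sketch: translate OpenAI SSE chunks into Anthropic stream events.

    Text-only; omits tool-call (input_json_delta) accumulation.
    """
    started = False  # state: have we emitted message_start yet?
    for chunk in chunks:
        choice = chunk["choices"][0]
        delta = choice.get("delta", {})
        if not started:
            # First chunk opens the message and the first content block.
            yield {"type": "message_start"}
            yield {"type": "content_block_start", "index": 0}
            started = True
        if delta.get("content"):
            # Each content delta becomes a text_delta event.
            yield {"type": "content_block_delta", "index": 0,
                   "delta": {"type": "text_delta", "text": delta["content"]}}
        if choice.get("finish_reason"):
            # finish_reason closes the block and the message.
            yield {"type": "content_block_stop", "index": 0}
            yield {"type": "message_delta",
                   "delta": {"stop_reason": "end_turn"}}
            yield {"type": "message_stop"}
```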

| Provider | Type | Base URL |
| --- | --- | --- |
| Anthropic | DirectAnthropic | https://api.anthropic.com |
| MiniMax | DirectAnthropic | https://api.minimax.io/anthropic |
| OpenRouter | OpenAICompatible | https://openrouter.ai/api/v1 |
| Grok (xAI) | OpenAICompatible | https://api.x.ai/v1 |
| OpenAI | OpenAICompatible | https://api.openai.com/v1 |
| DeepSeek | OpenAICompatible | https://api.deepseek.com |
| Kimi/Moonshot | OpenAICompatible | https://api.moonshot.cn/v1 |
| GLM (Zhipu) | OpenAICompatible | https://open.bigmodel.cn/api/paas/v4 |
| Ollama | OpenAICompatible | http://localhost:11434/v1 |
| vLLM | OpenAICompatible | http://localhost:8000/v1 |
| ChatGPT/Codex sub | OpenAIResponses | https://chatgpt.com/backend-api/codex |
```sh
# Start proxy as a daemon
claudex proxy start -d

# Check proxy status
claudex proxy status

# Stop proxy daemon
claudex proxy stop

# Start on a custom port
claudex proxy start -p 8080
```

When you run `claudex run <profile>`, the proxy starts automatically in the background if it is not already running.