# Translation Proxy
The translation proxy is the core of Claudex. It sits between Claude Code and your AI providers, transparently converting between the Anthropic Messages API and the OpenAI Chat Completions API (or Responses API).
## How It Works

```
Claude Code → Anthropic Messages API request
│
└── Claudex Proxy (127.0.0.1:13456)
    │
    ├── DirectAnthropic provider → forward with headers
    │
    ├── OpenAICompatible provider
    │   ├── Translate request: Anthropic → OpenAI Chat Completions
    │   ├── Apply query_params, strip_params, custom_headers
    │   ├── Forward to provider
    │   └── Translate response: OpenAI → Anthropic
    │
    └── OpenAIResponses provider
        ├── Translate request: Anthropic → OpenAI Responses API
        ├── Forward to provider
        └── Translate response: Responses → Anthropic
```

## Provider Adapters

Claudex uses a `ProviderAdapter` trait to handle differences between provider APIs. Three adapters are implemented:
| Adapter | Translation | Used By |
|---|---|---|
| DirectAnthropic | None (passthrough) | Anthropic, MiniMax, Vertex AI |
| ChatCompletions | Full Anthropic ↔ OpenAI translation | Grok, OpenAI, DeepSeek, Kimi, GLM, OpenRouter, Groq, Mistral, Together AI, Perplexity, Cerebras, Azure OpenAI, GitHub Copilot, GitLab Duo, Ollama, vLLM, LM Studio |
| Responses | Anthropic ↔ OpenAI Responses API | ChatGPT/Codex subscriptions |
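The doc doesn't show the trait itself, so here is a minimal sketch of the dispatch pattern the table implies. The trait shape, method names, and adapter names below are illustrative assumptions, not Claudex's actual code; only the provider-type strings (`anthropic`, `openai-compatible`, `openai-responses`) come from the doc.

```rust
// Hypothetical shape of a provider-adapter trait: each adapter reports
// whether request/response bodies need translation before forwarding.
trait ProviderAdapter {
    fn translates_requests(&self) -> bool;
    fn name(&self) -> &'static str;
}

struct DirectAnthropic;
struct ChatCompletions;
struct Responses;

impl ProviderAdapter for DirectAnthropic {
    fn translates_requests(&self) -> bool { false } // passthrough
    fn name(&self) -> &'static str { "direct-anthropic" }
}

impl ProviderAdapter for ChatCompletions {
    fn translates_requests(&self) -> bool { true } // full Anthropic ↔ OpenAI
    fn name(&self) -> &'static str { "chat-completions" }
}

impl ProviderAdapter for Responses {
    fn translates_requests(&self) -> bool { true } // Anthropic ↔ Responses API
    fn name(&self) -> &'static str { "responses" }
}

/// Pick an adapter from a profile's provider type string.
fn adapter_for(provider_type: &str) -> Box<dyn ProviderAdapter> {
    match provider_type {
        "anthropic" => Box::new(DirectAnthropic),
        "openai-responses" => Box::new(Responses),
        _ => Box::new(ChatCompletions), // "openai-compatible" and similar
    }
}

fn main() {
    let a = adapter_for("anthropic");
    println!("{} translates: {}", a.name(), a.translates_requests());
}
```

Trait-object dispatch keeps the proxy's request path identical for every provider; only the adapter decides whether the body is rewritten.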
## What Gets Translated

### Request Translation (Anthropic → OpenAI)

| Anthropic | OpenAI |
|---|---|
| `system` field | System message in `messages` array |
| `messages[].content` blocks (text, image, tool_use, tool_result) | `messages[].content` + `tool_calls` |
| `tools` array (JSON Schema with `input_schema`) | `tools` array (function format with `parameters`) |
| `tool_choice` (`auto`, `any`, `{name}`) | `tool_choice` (`auto`, `required`, `{function: {name}}`) |
| `max_tokens` | `max_tokens` (capped by the profile's `max_tokens` setting, if set) |
| `temperature`, `top_p` | Direct mapping (stripped if `strip_params` matches) |
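Two rows of the table above can be sketched concretely: the `system` rewrite and the `tool_choice` rename. The `Message` struct and function names here are simplified stand-ins for illustration, not Claudex's actual types.

```rust
// Simplified stand-in for a chat message (real bodies are JSON).
#[derive(Debug, Clone)]
struct Message {
    role: String,
    content: String,
}

/// Anthropic's top-level `system` string becomes the first entry
/// of the OpenAI `messages` array.
fn to_openai_messages(system: Option<&str>, messages: &[Message]) -> Vec<Message> {
    let mut out = Vec::new();
    if let Some(s) = system {
        out.push(Message { role: "system".into(), content: s.into() });
    }
    out.extend_from_slice(messages);
    out
}

/// Anthropic tool_choice type → OpenAI tool_choice type.
fn map_tool_choice(anthropic: &str) -> &'static str {
    match anthropic {
        "any" => "required",  // "call any tool" → OpenAI's "required"
        "auto" => "auto",
        _ => "function",      // named tool → {function: {name}} form
    }
}

fn main() {
    let msgs = vec![Message { role: "user".into(), content: "hi".into() }];
    let openai = to_openai_messages(Some("Be terse."), &msgs);
    assert_eq!(openai[0].role, "system");
    assert_eq!(map_tool_choice("any"), "required");
}
```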
### Response Translation (OpenAI → Anthropic)

| OpenAI | Anthropic |
|---|---|
| `choices[0].message.content` | `content` blocks (`type: text`) |
| `choices[0].message.tool_calls` | `content` blocks (`type: tool_use`) |
| `finish_reason: stop` | `stop_reason: end_turn` |
| `finish_reason: tool_calls` | `stop_reason: tool_use` |
| `usage.prompt_tokens` / `completion_tokens` | `usage.input_tokens` / `output_tokens` |
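The `finish_reason` and `usage` rows above are pure renames, which can be sketched directly. The struct and function names are illustrative; only the two documented `finish_reason` values are mapped, and anything else is left to the caller.

```rust
/// Map the two finish_reason values from the table above; other values
/// (e.g. "length") are not covered by the doc, so return None for them.
fn map_stop_reason(finish_reason: &str) -> Option<&'static str> {
    match finish_reason {
        "stop" => Some("end_turn"),
        "tool_calls" => Some("tool_use"),
        _ => None,
    }
}

struct OpenAiUsage { prompt_tokens: u64, completion_tokens: u64 }
struct AnthropicUsage { input_tokens: u64, output_tokens: u64 }

/// Rename the token-count fields; the numbers themselves pass through.
fn map_usage(u: &OpenAiUsage) -> AnthropicUsage {
    AnthropicUsage {
        input_tokens: u.prompt_tokens,
        output_tokens: u.completion_tokens,
    }
}

fn main() {
    assert_eq!(map_stop_reason("tool_calls"), Some("tool_use"));
    let a = map_usage(&OpenAiUsage { prompt_tokens: 10, completion_tokens: 3 });
    assert_eq!(a.input_tokens, 10);
    assert_eq!(a.output_tokens, 3);
}
```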
## Tool Name Compatibility

Claude Code can generate tool names longer than 64 characters (e.g., `mcp__server-name__very-long-tool-name-that-exceeds-the-limit`). OpenAI and many other providers enforce a 64-character limit.
Claudex automatically:
- Truncates names exceeding 64 characters in outgoing requests
- Builds a mapping table of truncated → original names
- Restores original names in provider responses
This roundtrip is fully transparent.
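The three steps above can be sketched as a pair of functions sharing a mapping table. The 64-character limit is from the doc; the naive prefix truncation here is an assumption — a real implementation must also keep truncated names unique when two long names share a prefix.

```rust
use std::collections::HashMap;

const MAX_TOOL_NAME: usize = 64;

/// Outgoing request: shorten an over-long name and record
/// truncated → original in the mapping table.
fn truncate_tool_name(name: &str, map: &mut HashMap<String, String>) -> String {
    if name.len() <= MAX_TOOL_NAME {
        return name.to_string();
    }
    let short: String = name.chars().take(MAX_TOOL_NAME).collect();
    map.insert(short.clone(), name.to_string());
    short
}

/// Provider response: restore the original name; names that were
/// never truncated pass through unchanged.
fn restore_tool_name(name: &str, map: &HashMap<String, String>) -> String {
    map.get(name).cloned().unwrap_or_else(|| name.to_string())
}

fn main() {
    let mut map = HashMap::new();
    let long = "mcp__server__".to_string() + &"x".repeat(80);
    let short = truncate_tool_name(&long, &mut map);
    assert_eq!(short.len(), 64);
    assert_eq!(restore_tool_name(&short, &map), long);
}
```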
## Streaming Translation

Claudex fully supports Server-Sent Events (SSE) streaming, translating OpenAI stream chunks into Anthropic stream events in real time:
| OpenAI SSE | Anthropic SSE |
|---|---|
| First chunk | `message_start` + `content_block_start` |
| `choices[0].delta.content` | `content_block_delta` (`text_delta`) |
| `choices[0].delta.tool_calls` | `content_block_delta` (`input_json_delta`) |
| `finish_reason` present | `content_block_stop` + `message_delta` + `message_stop` |
The streaming translator maintains a state machine to properly handle tool call accumulation and content block boundaries.
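A heavily simplified sketch of that state machine: the first chunk opens the message and a content block, deltas map one-to-one, and `finish_reason` closes everything. The event names come from the table above; the state handling is an assumption and is reduced to a single content block with no tool-call accumulation or multi-block boundaries.

```rust
#[derive(PartialEq)]
enum State { BeforeStart, Open }

struct StreamTranslator { state: State }

impl StreamTranslator {
    fn new() -> Self { StreamTranslator { state: State::BeforeStart } }

    /// Translate one OpenAI chunk kind into zero or more Anthropic
    /// SSE event names, in order.
    fn on_chunk(&mut self, kind: &str) -> Vec<&'static str> {
        let mut events = Vec::new();
        // First chunk of any kind opens the message and a block.
        if self.state == State::BeforeStart {
            events.push("message_start");
            events.push("content_block_start");
            self.state = State::Open;
        }
        match kind {
            "content" => events.push("content_block_delta"),    // text_delta
            "tool_calls" => events.push("content_block_delta"), // input_json_delta
            "finish" => {
                events.push("content_block_stop");
                events.push("message_delta");
                events.push("message_stop");
            }
            _ => {} // e.g. empty keep-alive chunks
        }
        events
    }
}

fn main() {
    let mut t = StreamTranslator::new();
    let first = t.on_chunk("content");
    assert_eq!(first, vec!["message_start", "content_block_start", "content_block_delta"]);
    assert_eq!(t.on_chunk("content"), vec!["content_block_delta"]);
    assert_eq!(t.on_chunk("finish"), vec!["content_block_stop", "message_delta", "message_stop"]);
}
```

The real translator additionally has to buffer partial `tool_calls` argument fragments and open/close a separate content block per tool call, which is where the state machine earns its keep.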
## Azure OpenAI Support

Azure OpenAI uses a different authentication and URL scheme:

- Authentication: `api-key` header instead of `Authorization: Bearer`
- URL format: `https://{resource}.openai.azure.com/openai/deployments/{deployment}`
- API version: required via `query_params`

Claudex auto-detects Azure by checking whether `base_url` contains `openai.azure.com` and adjusts authentication accordingly.
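The detection rule is simple enough to sketch directly. The header names and the `openai.azure.com` substring check come from the doc; the function itself is illustrative, not Claudex's actual code.

```rust
/// Choose the auth header (name, value) for a request based on the
/// profile's base URL: Azure gets `api-key`, everyone else gets
/// a standard Bearer token.
fn auth_header(base_url: &str, key: &str) -> (String, String) {
    if base_url.contains("openai.azure.com") {
        ("api-key".to_string(), key.to_string())
    } else {
        ("Authorization".to_string(), format!("Bearer {key}"))
    }
}

fn main() {
    let (name, _) = auth_header(
        "https://myres.openai.azure.com/openai/deployments/gpt-4o",
        "secret",
    );
    println!("Azure header: {name}");
}
```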
## Supported Providers

| Provider | Type | Base URL |
|---|---|---|
| Anthropic | DirectAnthropic | https://api.anthropic.com |
| MiniMax | DirectAnthropic | https://api.minimax.io/anthropic |
| Google Vertex AI | DirectAnthropic | https://REGION-aiplatform.googleapis.com/v1/projects/... |
| OpenRouter | OpenAICompatible | https://openrouter.ai/api/v1 |
| Grok (xAI) | OpenAICompatible | https://api.x.ai/v1 |
| OpenAI | OpenAICompatible | https://api.openai.com/v1 |
| DeepSeek | OpenAICompatible | https://api.deepseek.com |
| Kimi/Moonshot | OpenAICompatible | https://api.moonshot.ai/v1 |
| GLM (Zhipu) | OpenAICompatible | https://api.z.ai/api/paas/v4 |
| Groq | OpenAICompatible | https://api.groq.com/openai/v1 |
| Mistral AI | OpenAICompatible | https://api.mistral.ai/v1 |
| Together AI | OpenAICompatible | https://api.together.xyz/v1 |
| Perplexity | OpenAICompatible | https://api.perplexity.ai |
| Cerebras | OpenAICompatible | https://api.cerebras.ai/v1 |
| Azure OpenAI | OpenAICompatible | https://{resource}.openai.azure.com/... |
| GitHub Copilot | OpenAICompatible | https://api.githubcopilot.com |
| GitLab Duo | OpenAICompatible | https://gitlab.com/api/v4/ai/llm/proxy |
| Ollama | OpenAICompatible | http://localhost:11434/v1 |
| vLLM | OpenAICompatible | http://localhost:8000/v1 |
| LM Studio | OpenAICompatible | http://localhost:1234/v1 |
| ChatGPT/Codex sub | OpenAIResponses | https://chatgpt.com/backend-api/codex |
## Models Endpoint

The proxy exposes a `/v1/models` endpoint that lists all enabled profiles. Each entry includes custom fields:

- `x-claudex-profile`: profile name
- `x-claudex-provider`: provider type (`anthropic`, `openai-compatible`, `openai-responses`)
Claude Code queries this endpoint to discover available models.
## Proxy Management

```sh
# Start proxy as a daemon
claudex proxy start -d

# Check proxy status
claudex proxy status

# Stop proxy daemon
claudex proxy stop

# Start on a custom port
claudex proxy start -p 8080
```

When you run `claudex run <profile>`, the proxy is started automatically in the background if it is not already running.