# Provider Setup Guide

This guide covers detailed setup instructions for every provider supported by Claudex, including API key providers, OAuth subscriptions, cloud platforms, and local models.

## API Key Providers

### Anthropic

1. Sign up at console.anthropic.com
2. Navigate to API Keys and create a new key (format: `sk-ant-*`)

Add the profile:

```toml
[[profiles]]
name = "anthropic"
provider_type = "DirectAnthropic"
base_url = "https://api.anthropic.com"
api_key = "sk-ant-..."
default_model = "claude-sonnet-4-20250514"
```

Test it:

```sh
claudex profile test anthropic
```
### MiniMax

1. Sign up at platform.minimaxi.com
2. Navigate to API Keys in the console

Add the profile:

```toml
[[profiles]]
name = "minimax"
provider_type = "DirectAnthropic"
base_url = "https://api.minimax.io/anthropic"
api_key = "..."
default_model = "claude-sonnet-4-20250514"
```

Test it:

```sh
claudex profile test minimax
```
### OpenRouter

1. Sign up at openrouter.ai
2. Get your API key at openrouter.ai/keys

Add the profile:

```toml
[[profiles]]
name = "openrouter"
provider_type = "OpenAICompatible"
base_url = "https://openrouter.ai/api/v1"
api_key = "sk-or-..."
default_model = "anthropic/claude-sonnet-4"
```

Test it:

```sh
claudex profile test openrouter
```
### xAI (Grok)

1. Sign up at console.x.ai
2. Create an API key in the console

Add the profile:

```toml
[[profiles]]
name = "grok"
provider_type = "OpenAICompatible"
base_url = "https://api.x.ai/v1"
api_key = "xai-..."
default_model = "grok-3-beta"
```

Test it:

```sh
claudex profile test grok
```
### OpenAI

1. Sign up at platform.openai.com
2. Create an API key at platform.openai.com/api-keys

Add the profile:

```toml
[[profiles]]
name = "chatgpt"
provider_type = "OpenAICompatible"
base_url = "https://api.openai.com/v1"
api_key = "sk-..."
default_model = "gpt-4o"
```

Test it:

```sh
claudex profile test chatgpt
```
### DeepSeek

1. Sign up at platform.deepseek.com
2. Create an API key in the console

Add the profile:

```toml
[[profiles]]
name = "deepseek"
provider_type = "OpenAICompatible"
base_url = "https://api.deepseek.com"
api_key = "sk-..."
default_model = "deepseek-chat"
```

Test it:

```sh
claudex profile test deepseek
```
### Moonshot (Kimi)

1. Sign up at platform.moonshot.cn
2. Create an API key in the console

Add the profile:

```toml
[[profiles]]
name = "kimi"
provider_type = "OpenAICompatible"
base_url = "https://api.moonshot.ai/v1"
api_key = "sk-..."
default_model = "kimi-k2-0905-preview"
```

Test it:

```sh
claudex profile test kimi
```
### Zhipu (GLM)

1. Sign up at open.bigmodel.cn
2. Create an API key in the console

Add the profile:

```toml
[[profiles]]
name = "glm"
provider_type = "OpenAICompatible"
base_url = "https://api.z.ai/api/paas/v4"
api_key = "..."
default_model = "glm-4.6"
```

Test it:

```sh
claudex profile test glm
```
### Groq

1. Sign up at console.groq.com
2. Create an API key in Settings > API Keys

Add the profile:

```toml
[[profiles]]
name = "groq"
provider_type = "OpenAICompatible"
base_url = "https://api.groq.com/openai/v1"
api_key = "gsk_..."
default_model = "llama-3.3-70b-versatile"
```

Test it:

```sh
claudex profile test groq
```
### Mistral

1. Sign up at console.mistral.ai
2. Create an API key in the console

Add the profile:

```toml
[[profiles]]
name = "mistral"
provider_type = "OpenAICompatible"
base_url = "https://api.mistral.ai/v1"
api_key = "..."
default_model = "mistral-large-latest"
```

Test it:

```sh
claudex profile test mistral
```
### Together AI

1. Sign up at api.together.ai
2. Create an API key in the dashboard

Add the profile:

```toml
[[profiles]]
name = "together"
provider_type = "OpenAICompatible"
base_url = "https://api.together.xyz/v1"
api_key = "..."
default_model = "meta-llama/Llama-3.3-70B-Instruct-Turbo"
```

Test it:

```sh
claudex profile test together
```
### Perplexity

1. Sign up at perplexity.ai
2. Create an API key at perplexity.ai/settings/api

Add the profile:

```toml
[[profiles]]
name = "perplexity"
provider_type = "OpenAICompatible"
base_url = "https://api.perplexity.ai"
api_key = "pplx-..."
default_model = "sonar-pro"
```

Test it:

```sh
claudex profile test perplexity
```
### Cerebras

1. Sign up at cloud.cerebras.ai
2. Create an API key in the dashboard

Add the profile:

```toml
[[profiles]]
name = "cerebras"
provider_type = "OpenAICompatible"
base_url = "https://api.cerebras.ai/v1"
api_key = "..."
default_model = "llama-3.3-70b"
```

Test it:

```sh
claudex profile test cerebras
```
## Cloud Platforms

### Azure OpenAI

1. Create an Azure OpenAI resource in the Azure Portal
2. Deploy a model and note the resource name and deployment name
3. Get your API key from Keys and Endpoint

Add the profile:

```toml
[[profiles]]
name = "azure-openai"
provider_type = "OpenAICompatible"
base_url = "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT"
api_key = "YOUR_AZURE_KEY"
default_model = "gpt-4o"

[profiles.query_params]
api-version = "2024-12-01-preview"
```

Test it:

```sh
claudex profile test azure-openai
```
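Azure's API rejects requests without an `api-version` query parameter, which is what the `[profiles.query_params]` table supplies. Exactly how Claudex appends those parameters is an assumption here, but the resulting request URL should look like this sketch:

```python
from urllib.parse import urlencode

# Values from the profile above; YOUR_RESOURCE / YOUR_DEPLOYMENT are placeholders.
base_url = "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT"
query_params = {"api-version": "2024-12-01-preview"}

# An OpenAI-compatible client POSTs to <base_url>/chat/completions;
# the query_params table is encoded onto the end of that URL.
url = f"{base_url}/chat/completions?{urlencode(query_params)}"
print(url)
```

Note that the deployment name, not the model name, determines which Azure model serves the request; `default_model` should match what you deployed.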
### Google Vertex AI

1. Enable the Vertex AI API in your GCP project
2. Generate an access token with `gcloud auth print-access-token`

Add the profile:

```toml
[[profiles]]
name = "vertex-ai"
provider_type = "DirectAnthropic"
base_url = "https://us-east5-aiplatform.googleapis.com/v1/projects/YOUR_PROJECT/locations/us-east5/publishers/anthropic/models"
api_key = "YOUR_GCLOUD_TOKEN"
default_model = "claude-sonnet-4@20250514"
```

Test it:

```sh
claudex profile test vertex-ai
```
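Two caveats for Vertex AI: tokens from `gcloud auth print-access-token` typically expire after about an hour, so the `api_key` value needs periodic refreshing, and the `base_url` embeds both your project ID and region. A small sketch of filling in that URL template (the helper is illustrative, not part of Claudex):

```python
# Fill the Vertex AI base_url template from the guide.
# project_id and region are placeholders you must supply; the region
# appears twice, once in the hostname and once in the path.
def vertex_base_url(project_id: str, region: str) -> str:
    return (
        f"https://{region}-aiplatform.googleapis.com/v1"
        f"/projects/{project_id}/locations/{region}"
        "/publishers/anthropic/models"
    )

print(vertex_base_url("my-project", "us-east5"))
```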

### AWS Bedrock

AWS Bedrock is supported through a LiteLLM proxy:

1. Install LiteLLM: `pip install litellm`
2. Start the proxy:

```sh
litellm --model bedrock/anthropic.claude-sonnet-4-20250514-v2:0
```

3. Configure Claudex:

```toml
[[profiles]]
name = "bedrock"
provider_type = "OpenAICompatible"
base_url = "http://localhost:4000/v1"
api_key = "sk-litellm"
default_model = "bedrock/anthropic.claude-sonnet-4-20250514-v2:0"
```
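Before pointing Claudex at the proxy, it can save debugging time to confirm something is actually listening on port 4000. A minimal reachability probe (the `/health` path is an assumption; any response, even an error status, proves the proxy is up):

```python
import urllib.error
import urllib.request

def proxy_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url`, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded (even with 4xx/5xx), so it is listening.
        return True
    except (urllib.error.URLError, OSError):
        return False

print(proxy_reachable("http://localhost:4000/health"))
```

If this prints `False`, check that the `litellm` process is still running and that your AWS credentials are available in its environment.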

## OAuth Subscription Providers

OAuth authentication lets you use existing provider subscriptions (ChatGPT Plus, Claude Max, etc.) instead of separate API keys.

### Claude Max

Use your existing Claude subscription through Claude Code’s native OAuth session.

Prerequisites: Claude Code installed and logged in (the `claude` command works normally).

```toml
[[profiles]]
name = "claude-max"
provider_type = "DirectAnthropic"
base_url = "https://api.claude.ai"
default_model = "claude-sonnet-4-20250514"
auth_type = "oauth"
oauth_provider = "claude"

[profiles.models]
haiku = "claude-haiku-4-20250514"
sonnet = "claude-sonnet-4-20250514"
opus = "claude-opus-4-20250514"
```
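The `[profiles.models]` table presumably maps Claude Code's model tiers (haiku, sonnet, opus) to the concrete model IDs this provider serves, which is what lets non-Anthropic providers later in this guide answer for those tiers too. A sketch of that lookup (the fallback to `default_model` for unknown tiers is an assumption for illustration):

```python
# Tier-to-model mapping from the profile above.
models = {
    "haiku": "claude-haiku-4-20250514",
    "sonnet": "claude-sonnet-4-20250514",
    "opus": "claude-opus-4-20250514",
}
default_model = "claude-sonnet-4-20250514"

def resolve_model(tier: str) -> str:
    # Unknown tiers fall back to the profile's default_model (assumed behavior).
    return models.get(tier, default_model)

print(resolve_model("opus"))
```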

### ChatGPT (Codex)

Use your ChatGPT Plus or Pro subscription via the Codex CLI token.

Prerequisites: install the Codex CLI and authenticate:

```sh
# Install Codex CLI
npm install -g @openai/codex

# Authenticate (opens browser)
codex auth
```

Then configure Claudex:

```toml
[[profiles]]
name = "codex-sub"
provider_type = "OpenAIResponses"
base_url = "https://chatgpt.com/backend-api/codex"
default_model = "gpt-5.3-codex"
auth_type = "oauth"
oauth_provider = "openai"

[profiles.models]
haiku = "codex-mini-latest"
sonnet = "gpt-5.3-codex"
opus = "gpt-5.3-codex"
```

```sh
# Read the token from the Codex CLI
claudex auth login openai --profile codex-sub

# Verify
claudex auth status

# Run
claudex run codex-sub
```

### Google Gemini

Use your Google account via Gemini CLI credentials.

```toml
[[profiles]]
name = "gemini-sub"
provider_type = "OpenAICompatible"
base_url = "https://generativelanguage.googleapis.com/v1beta/openai"
default_model = "gemini-2.5-pro"
auth_type = "oauth"
oauth_provider = "google"

[profiles.models]
haiku = "gemini-2.0-flash"
sonnet = "gemini-2.5-pro"
opus = "gemini-2.5-pro"
```

```sh
# Login reads from Gemini CLI credentials
claudex auth login google --profile gemini-sub

# Verify
claudex auth status
```

### Qwen

Use your Qwen account via the OAuth Device Code flow.

```toml
[[profiles]]
name = "qwen-oauth"
provider_type = "OpenAICompatible"
base_url = "https://chat.qwen.ai/api"
default_model = "qwen3-235b-a22b"
auth_type = "oauth"
oauth_provider = "qwen"
```

```sh
# Start device code flow
claudex auth login qwen --profile qwen-oauth

# Verify
claudex auth status
```

### Kimi

Use your Kimi account via Kimi CLI credentials.

```toml
[[profiles]]
name = "kimi-oauth"
provider_type = "OpenAICompatible"
base_url = "https://api.moonshot.cn/v1"
default_model = "moonshot-v1-128k"
auth_type = "oauth"
oauth_provider = "kimi"
```

```sh
# Login reads from Kimi CLI credentials
claudex auth login kimi --profile kimi-oauth

# Verify
claudex auth status
```

### GitHub Copilot

Use your GitHub Copilot subscription via the OAuth Device Code flow.

```toml
[[profiles]]
name = "copilot"
provider_type = "OpenAICompatible"
base_url = "https://api.githubcopilot.com"
default_model = "gpt-4o"
auth_type = "oauth"
oauth_provider = "github"
```

```sh
# Start device code flow (opens browser for GitHub login)
claudex auth login github --profile copilot

# Verify
claudex auth status
```

### GitLab Duo

Use your GitLab Duo subscription via a Personal Access Token.

```toml
[[profiles]]
name = "gitlab-duo"
provider_type = "OpenAICompatible"
base_url = "https://gitlab.com/api/v4/ai/llm/proxy"
default_model = "claude-sonnet-4-20250514"
auth_type = "oauth"
oauth_provider = "gitlab"
```

```sh
# Set your GitLab token as an environment variable
export GITLAB_TOKEN=glpat-...

# Login
claudex auth login gitlab --profile gitlab-duo

# Verify
claudex auth status
```
## Local Models

### Ollama

1. Install Ollama from ollama.com
2. Pull a model:

```sh
ollama pull qwen2.5:72b
```

3. Configure Claudex:

```toml
[[profiles]]
name = "local-qwen"
provider_type = "OpenAICompatible"
base_url = "http://localhost:11434/v1"
api_key = ""
default_model = "qwen2.5:72b"
```

Test it:

```sh
claudex profile test local-qwen
```
### vLLM

1. Install and start vLLM:

```sh
pip install vllm
vllm serve meta-llama/Llama-3.3-70B-Instruct --port 8000
```

2. Configure Claudex:

```toml
[[profiles]]
name = "local-llama"
provider_type = "OpenAICompatible"
base_url = "http://localhost:8000/v1"
api_key = ""
default_model = "meta-llama/Llama-3.3-70B-Instruct"
```
### LM Studio

1. Download LM Studio from lmstudio.ai
2. Load a model and start the local server (default port: 1234)
3. Configure Claudex:

```toml
[[profiles]]
name = "lm-studio"
provider_type = "OpenAICompatible"
base_url = "http://localhost:1234/v1"
api_key = "lm-studio"
default_model = "local-model"
```

## Verifying Your Setup

After configuring any provider, verify connectivity:

```sh
# Test a specific profile
claudex profile test <profile-name>

# Test all profiles
claudex profile test all

# List all configured profiles
claudex profile list
```