
# Configuration Reference

Claudex uses figment for layered configuration. Sources are merged in the following order (later layers override earlier ones):

1. Built-in defaults
2. Global config (`~/.config/claudex/config.toml` or `.yaml`)
3. Project config (`claudex.toml` or `.claudex/config.toml` in the CWD or a parent directory, or `$CLAUDEX_CONFIG`)
4. Environment variables (`CLAUDEX_` prefix, `__` separator)

Both TOML and YAML formats are supported.

See Configuration for details.
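As a sketch of how the layers combine, a project-level `claudex.toml` only needs to list the keys it overrides; everything else falls through to the global config and the built-in defaults (the values below are illustrative):

```toml
# claudex.toml at the project root — overrides the global config and defaults
proxy_port = 23456
log_level = "debug"
```

An environment variable such as `CLAUDEX_LOG_LEVEL` would then override even this file, since environment variables form the last layer.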

## Core Settings

```toml
# Path to the claude binary (default: "claude" on PATH)
claude_binary = "claude"
# Port the proxy server binds to
proxy_port = 13456
# Address the proxy server binds to
proxy_host = "127.0.0.1"
# Log level: trace, debug, info, warn, error
log_level = "info"
# Terminal hyperlinks (OSC 8): "auto" | true | false
hyperlinks = "auto"
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `claude_binary` | string | `"claude"` | Path to the Claude Code CLI binary |
| `proxy_port` | integer | `13456` | Port the translation proxy listens on |
| `proxy_host` | string | `"127.0.0.1"` | Address the proxy binds to |
| `log_level` | string | `"info"` | Minimum log level |
| `hyperlinks` | string/bool | `"auto"` | Terminal hyperlinks: `"auto"` (detect), `true` (force on), `false` (force off) |

## Model Aliases

Define short names for model identifiers:

```toml
[model_aliases]
grok3 = "grok-3-beta"
gpt4o = "gpt-4o"
ds3 = "deepseek-chat"
claude = "claude-sonnet-4-20250514"
```

Use an alias with `-m`:

```shell
claudex run grok -m grok3
```
## Profiles

```toml
[[profiles]]
name = "grok"
provider_type = "OpenAICompatible"
base_url = "https://api.x.ai/v1"
api_key = "xai-..."
# api_key_keyring = "grok-api-key"
default_model = "grok-3-beta"
auth_type = "api-key" # "api-key" (default) or "oauth"
# oauth_provider = "openai" # required when auth_type = "oauth"
backup_providers = ["deepseek"]
custom_headers = {}
extra_env = {}
priority = 100
enabled = true
max_tokens = 16384 # optional: cap output tokens
strip_params = "auto" # "auto" | "none" | ["temperature", "top_p"]
# URL query parameters (e.g. Azure api-version)
[profiles.query_params]
# api-version = "2024-12-01-preview"
# Model slot mapping (optional)
[profiles.models]
haiku = "grok-3-mini-beta"
sonnet = "grok-3-beta"
opus = "grok-3-beta"
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `name` | string | required | Unique profile identifier |
| `provider_type` | string | `"DirectAnthropic"` | `"DirectAnthropic"`, `"OpenAICompatible"`, or `"OpenAIResponses"` |
| `base_url` | string | required | Provider API endpoint URL |
| `api_key` | string | `""` | Plaintext API key |
| `api_key_keyring` | string | — | OS keyring entry name (overrides `api_key`) |
| `default_model` | string | required | Model identifier used by default |
| `auth_type` | string | `"api-key"` | Authentication method: `"api-key"` or `"oauth"` |
| `oauth_provider` | string | — | OAuth provider name. One of: `claude`, `openai`, `google`, `qwen`, `kimi`, `github`, `gitlab` |
| `backup_providers` | string[] | `[]` | Profile names for failover, tried in order |
| `custom_headers` | map | `{}` | Extra HTTP headers sent with every request |
| `extra_env` | map | `{}` | Environment variables set when launching Claude |
| `priority` | integer | `100` | Priority weight for smart routing (higher wins) |
| `enabled` | boolean | `true` | Whether this profile is active |
| `max_tokens` | integer | — | Cap on output tokens sent to the provider; when set, overrides the request's `max_tokens` |
| `strip_params` | string/array | `"auto"` | Parameters to remove from requests. `"auto"` detects known endpoints; `"none"` sends everything; an array removes the listed parameters |
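The template above shows `custom_headers` and `extra_env` as empty maps; as a sketch, both can be filled with inline TOML tables (the header and variable names here are illustrative assumptions, not defaults):

```toml
# Extra header attached to every proxied request
custom_headers = { "X-Request-Source" = "claudex" }
# Environment variable set when Claude is launched
extra_env = { HTTP_PROXY = "http://127.0.0.1:8888" }
```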
### Query Parameters

```toml
[profiles.query_params]
api-version = "2024-12-01-preview"
```

| Field | Type | Description |
| --- | --- | --- |
| `query_params` | map | Key-value pairs appended to every request URL |

Mainly intended for Azure OpenAI (`api-version`), but works with any provider.

### Model Slot Mapping

The optional `[profiles.models]` table maps Claude Code's `/model` switcher slots to provider-specific model names. When you switch models in Claude Code (e.g. `/model opus`), Claudex translates the request to the mapped model.

```toml
[profiles.models]
haiku = "grok-3-mini-beta" # used for /model haiku
sonnet = "grok-3-beta"     # used for /model sonnet
opus = "grok-3-beta"       # used for /model opus
```
| Field | Type | Description |
| --- | --- | --- |
| `haiku` | string | Model used when Claude Code selects haiku |
| `sonnet` | string | Model used when Claude Code selects sonnet |
| `opus` | string | Model used when Claude Code selects opus |
## Example Profiles

```toml
# Anthropic (DirectAnthropic — no translation needed)
[[profiles]]
name = "anthropic"
provider_type = "DirectAnthropic"
base_url = "https://api.anthropic.com"
api_key = "sk-ant-..."
default_model = "claude-sonnet-4-20250514"

# MiniMax (DirectAnthropic — no translation needed)
[[profiles]]
name = "minimax"
provider_type = "DirectAnthropic"
base_url = "https://api.minimax.io/anthropic"
api_key = "..."
default_model = "claude-sonnet-4-20250514"
backup_providers = ["anthropic"]

# OpenRouter (OpenAICompatible — requires translation)
[[profiles]]
name = "openrouter"
provider_type = "OpenAICompatible"
base_url = "https://openrouter.ai/api/v1"
api_key = "..."
default_model = "anthropic/claude-sonnet-4"

# Grok (OpenAICompatible — requires translation)
[[profiles]]
name = "grok"
provider_type = "OpenAICompatible"
base_url = "https://api.x.ai/v1"
api_key = "xai-..."
default_model = "grok-3-beta"
backup_providers = ["deepseek"]

# OpenAI (OpenAICompatible — requires translation)
[[profiles]]
name = "chatgpt"
provider_type = "OpenAICompatible"
base_url = "https://api.openai.com/v1"
api_key = "sk-..."
default_model = "gpt-4o"

# DeepSeek (OpenAICompatible — requires translation)
[[profiles]]
name = "deepseek"
provider_type = "OpenAICompatible"
base_url = "https://api.deepseek.com"
api_key = "..."
default_model = "deepseek-chat"
backup_providers = ["grok"]

# Kimi / Moonshot (OpenAICompatible — requires translation)
[[profiles]]
name = "kimi"
provider_type = "OpenAICompatible"
base_url = "https://api.moonshot.ai/v1"
api_key = "..."
default_model = "kimi-k2-0905-preview"

# GLM / Zhipu (OpenAICompatible — requires translation)
[[profiles]]
name = "glm"
provider_type = "OpenAICompatible"
base_url = "https://api.z.ai/api/paas/v4"
api_key = "..."
default_model = "glm-4.6"

# Groq (OpenAICompatible — fast inference)
[[profiles]]
name = "groq"
provider_type = "OpenAICompatible"
base_url = "https://api.groq.com/openai/v1"
api_key = "gsk_..."
default_model = "llama-3.3-70b-versatile"

# Mistral AI (OpenAICompatible — requires translation)
[[profiles]]
name = "mistral"
provider_type = "OpenAICompatible"
base_url = "https://api.mistral.ai/v1"
api_key = "..."
default_model = "mistral-large-latest"

# Together AI (OpenAICompatible — requires translation)
[[profiles]]
name = "together"
provider_type = "OpenAICompatible"
base_url = "https://api.together.xyz/v1"
api_key = "..."
default_model = "meta-llama/Llama-3.3-70B-Instruct-Turbo"

# Perplexity (OpenAICompatible — online search + LLM)
[[profiles]]
name = "perplexity"
provider_type = "OpenAICompatible"
base_url = "https://api.perplexity.ai"
api_key = "pplx-..."
default_model = "sonar-pro"

# Cerebras (OpenAICompatible — fast inference)
[[profiles]]
name = "cerebras"
provider_type = "OpenAICompatible"
base_url = "https://api.cerebras.ai/v1"
api_key = "..."
default_model = "llama-3.3-70b"

# Azure OpenAI (OpenAICompatible — api-key header + query_params)
[[profiles]]
name = "azure-openai"
provider_type = "OpenAICompatible"
base_url = "https://YOUR_RESOURCE.openai.azure.com/openai/deployments/YOUR_DEPLOYMENT"
api_key = "YOUR_AZURE_KEY"
default_model = "gpt-4o"
[profiles.query_params]
api-version = "2024-12-01-preview"

# Google Vertex AI (DirectAnthropic)
[[profiles]]
name = "vertex-ai"
provider_type = "DirectAnthropic"
base_url = "https://us-east5-aiplatform.googleapis.com/v1/projects/YOUR_PROJECT/locations/us-east5/publishers/anthropic/models"
api_key = "YOUR_GCLOUD_TOKEN"
default_model = "claude-sonnet-4@20250514"

# Ollama (local, no API key needed)
[[profiles]]
name = "local-qwen"
provider_type = "OpenAICompatible"
base_url = "http://localhost:11434/v1"
api_key = ""
default_model = "qwen2.5:72b"
enabled = false

# vLLM / LM Studio (local)
[[profiles]]
name = "local-llama"
provider_type = "OpenAICompatible"
base_url = "http://localhost:8000/v1"
api_key = ""
default_model = "llama-3.3-70b"
enabled = false

# Claude Max (bypasses the proxy; uses the native OAuth that Claude reads from ~/.claude)
[[profiles]]
name = "claude-max"
provider_type = "DirectAnthropic"
base_url = "https://api.claude.ai"
default_model = "claude-sonnet-4-20250514"
auth_type = "oauth"
oauth_provider = "claude"
[profiles.models]
haiku = "claude-haiku-4-20250514"
sonnet = "claude-sonnet-4-20250514"
opus = "claude-opus-4-20250514"

# ChatGPT/Codex subscription (OpenAIResponses)
[[profiles]]
name = "codex-sub"
provider_type = "OpenAIResponses"
base_url = "https://chatgpt.com/backend-api/codex"
default_model = "gpt-5.3-codex"
auth_type = "oauth"
oauth_provider = "openai"
[profiles.models]
haiku = "codex-mini-latest"
sonnet = "gpt-5.3-codex"
opus = "gpt-5.3-codex"

# Google Gemini via OAuth
[[profiles]]
name = "gemini-sub"
provider_type = "OpenAICompatible"
base_url = "https://generativelanguage.googleapis.com/v1beta/openai"
default_model = "gemini-2.5-pro"
auth_type = "oauth"
oauth_provider = "google"

# Kimi via OAuth
[[profiles]]
name = "kimi-oauth"
provider_type = "OpenAICompatible"
base_url = "https://api.moonshot.cn/v1"
default_model = "moonshot-v1-128k"
auth_type = "oauth"
oauth_provider = "kimi"

# Qwen via OAuth
[[profiles]]
name = "qwen-oauth"
provider_type = "OpenAICompatible"
base_url = "https://chat.qwen.ai/api"
default_model = "qwen3-235b-a22b"
auth_type = "oauth"
oauth_provider = "qwen"

# GitHub Copilot via OAuth
[[profiles]]
name = "copilot"
provider_type = "OpenAICompatible"
base_url = "https://api.githubcopilot.com"
default_model = "gpt-4o"
auth_type = "oauth"
oauth_provider = "github"

# GitLab Duo via GITLAB_TOKEN
[[profiles]]
name = "gitlab-duo"
provider_type = "OpenAICompatible"
base_url = "https://gitlab.com/api/v4/ai/llm/proxy"
default_model = "claude-sonnet-4-20250514"
auth_type = "oauth"
oauth_provider = "gitlab"
```
## Router

```toml
[router]
enabled = false
profile = "local-qwen" # reuse a profile's base_url + api_key
model = "qwen2.5:3b"   # model override (optional)
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `enabled` | boolean | `false` | Enable smart routing |
| `profile` | string | `""` | Profile used for classification (its `base_url` + `api_key`) |
| `model` | string | `""` | Model override for classification (defaults to the profile's `default_model`) |
```toml
[router.rules]
code = "deepseek"
analysis = "grok"
creative = "chatgpt"
search = "kimi"
math = "deepseek"
default = "grok"
```
| Rule | Description |
| --- | --- |
| `code` | Profile for programming tasks |
| `analysis` | Profile for analysis and reasoning tasks |
| `creative` | Profile for creative writing |
| `search` | Profile for search and research |
| `math` | Profile for math and logic |
| `default` | Fallback when the intent is unclassified |
## Context Compression

```toml
[context.compression]
enabled = false
threshold_tokens = 50000
keep_recent = 10
profile = "local-qwen" # reuse a profile's base_url + api_key
model = "qwen2.5:3b"   # model override (optional)
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `enabled` | boolean | `false` | Enable conversation compression |
| `threshold_tokens` | integer | `50000` | Compress once the token count exceeds this value |
| `keep_recent` | integer | `10` | Always keep the last N messages uncompressed |
| `profile` | string | `""` | Profile used for summarization |
| `model` | string | `""` | Model override for summarization |
## Context Sharing

```toml
[context.sharing]
enabled = false
max_context_size = 2000
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `enabled` | boolean | `false` | Enable cross-profile context sharing |
| `max_context_size` | integer | `2000` | Maximum tokens injected from other profiles |
## Local RAG

```toml
[context.rag]
enabled = false
index_paths = ["./src", "./docs"]
profile = "local-qwen" # reuse a profile's base_url + api_key
model = "nomic-embed-text" # embedding model
chunk_size = 512
top_k = 5
```
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `enabled` | boolean | `false` | Enable local RAG |
| `index_paths` | string[] | `[]` | Directories to index |
| `profile` | string | `""` | Profile used for embeddings |
| `model` | string | `""` | Embedding model name |
| `chunk_size` | integer | `512` | Text chunk size in tokens |
| `top_k` | integer | `5` | Number of results to inject |