| summary | read_when | title |
|---|---|---|
| Model providers (LLMs) supported by OpenClaw | | Model Provider Quickstart |
# Model Providers

OpenClaw can use many LLM providers. Pick one, authenticate, then set the default model in `provider/model` form.
## Quick start (two steps)

1. Authenticate with the provider (usually via `openclaw onboard`).
2. Set the default model:

```json5
{
  agents: { defaults: { model: { primary: "anthropic/claude-opus-4-6" } } },
}
```
## Supported providers (starter set)
- Anthropic (API + Claude Code CLI)
- Amazon Bedrock
- Chutes
- Cloudflare AI Gateway
- GLM models
- MiniMax
- Mistral
- Moonshot AI (Kimi + Kimi Coding)
- OpenAI (API + Codex)
- OpenCode (Zen + Go)
- OpenRouter
- Qianfan
- Qwen
- StepFun
- Synthetic
- Vercel AI Gateway
- Venice (Venice AI)
- xAI
- Z.AI
For the full provider catalog (xAI, Groq, Mistral, etc.) and advanced configuration, see Model providers.
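Switching providers is just a matter of changing the `primary` model string to another `provider/model` pair. As a sketch, the fragment below swaps in a different provider from the list above; the model ID is hypothetical, so check the provider catalog for the exact IDs your provider exposes:

```json5
{
  // Same config shape as the quick start above; only the model string changes.
  // "openrouter/some-model-id" is illustrative, not a real model ID.
  agents: { defaults: { model: { primary: "openrouter/some-model-id" } } },
}
```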