mirror of https://github.com/openclaw/openclaw.git (synced 2026-05-06 17:31:06 +00:00)
feat(deepseek): support v4 models
Add DeepSeek V4 Flash/Pro support, update the pi packages to 0.70.2, and strip replayed reasoning content when thinking is disabled or unset (None).
@@ -235,6 +235,7 @@ See [/providers/kilocode](/providers/kilocode) for setup details.
 | BytePlus | `byteplus` / `byteplus-plan` | `BYTEPLUS_API_KEY` | `byteplus-plan/ark-code-latest` |
 | Cerebras | `cerebras` | `CEREBRAS_API_KEY` | `cerebras/zai-glm-4.7` |
 | Cloudflare AI Gateway | `cloudflare-ai-gateway` | `CLOUDFLARE_AI_GATEWAY_API_KEY` | — |
+| DeepSeek | `deepseek` | `DEEPSEEK_API_KEY` | `deepseek/deepseek-v4-flash` |
 | GitHub Copilot | `github-copilot` | `COPILOT_GITHUB_TOKEN` / `GH_TOKEN` / `GITHUB_TOKEN` | — |
 | Groq | `groq` | `GROQ_API_KEY` | — |
 | Hugging Face Inference | `huggingface` | `HUGGINGFACE_HUB_TOKEN` or `HF_TOKEN` | `huggingface/deepseek-ai/DeepSeek-R1` |
@@ -23,10 +23,10 @@ OpenClaw uses the pi SDK to embed an AI coding agent into its messaging gateway

```json
{
-  "@mariozechner/pi-agent-core": "0.68.1",
-  "@mariozechner/pi-ai": "0.68.1",
-  "@mariozechner/pi-coding-agent": "0.68.1",
-  "@mariozechner/pi-tui": "0.68.1"
+  "@mariozechner/pi-agent-core": "0.70.2",
+  "@mariozechner/pi-ai": "0.70.2",
+  "@mariozechner/pi-coding-agent": "0.70.2",
+  "@mariozechner/pi-tui": "0.70.2"
}
```
@@ -26,7 +26,7 @@ read_when:
 openclaw onboard --auth-choice deepseek-api-key
 ```

-This will prompt for your API key and set `deepseek/deepseek-chat` as the default model.
+This will prompt for your API key and set `deepseek/deepseek-v4-flash` as the default model.

 </Step>
 <Step title="Verify models are available">
@@ -60,13 +60,17 @@ is available to that process (for example, in `~/.openclaw/.env` or via

 ## Built-in catalog

-| Model ref                    | Name              | Input | Context | Max output | Notes                                             |
-| ---------------------------- | ----------------- | ----- | ------- | ---------- | ------------------------------------------------- |
-| `deepseek/deepseek-chat`     | DeepSeek Chat     | text  | 131,072 | 8,192      | Default model; DeepSeek V3.2 non-thinking surface |
-| `deepseek/deepseek-reasoner` | DeepSeek Reasoner | text  | 131,072 | 65,536     | Reasoning-enabled V3.2 surface                    |
+| Model ref                    | Name              | Input | Context   | Max output | Notes                                      |
+| ---------------------------- | ----------------- | ----- | --------- | ---------- | ------------------------------------------ |
+| `deepseek/deepseek-v4-flash` | DeepSeek V4 Flash | text  | 1,000,000 | 384,000    | Default model; V4 thinking-capable surface |
+| `deepseek/deepseek-v4-pro`   | DeepSeek V4 Pro   | text  | 1,000,000 | 384,000    | V4 thinking-capable surface                |
+| `deepseek/deepseek-chat`     | DeepSeek Chat     | text  | 131,072   | 8,192      | DeepSeek V3.2 non-thinking surface         |
+| `deepseek/deepseek-reasoner` | DeepSeek Reasoner | text  | 131,072   | 65,536     | Reasoning-enabled V3.2 surface             |

 <Tip>
-Both bundled models currently advertise streaming usage compatibility in source.
+V4 models support DeepSeek's `thinking` control. OpenClaw also replays
+DeepSeek `reasoning_content` on follow-up turns so thinking sessions with tool
+calls can continue.
 </Tip>
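The replay behavior described in the tip can be pictured with a small sketch. The types and the `prepareHistory` helper below are hypothetical, not OpenClaw's actual code: the idea is that replayed `reasoning_content` is carried forward on follow-up turns while thinking is enabled, and stripped from the history when thinking is disabled or unset.

```typescript
// Hypothetical message shape; real pi/OpenClaw types will differ.
type ChatMessage = {
  role: "user" | "assistant" | "tool";
  content: string;
  reasoning_content?: string; // DeepSeek thinking output replayed on later turns
};

// Sketch: before resending history, drop stale reasoning_content unless
// thinking is explicitly enabled for this request.
function prepareHistory(
  messages: ChatMessage[],
  thinking?: "enabled" | "disabled"
): ChatMessage[] {
  const keepReasoning = thinking === "enabled";
  return messages.map((m) => {
    if (!keepReasoning && m.reasoning_content !== undefined) {
      // Strip only the reasoning; the visible content is untouched.
      const { reasoning_content, ...rest } = m;
      return rest;
    }
    return m;
  });
}
```

Keeping the strip step in one place means a thinking session with tool calls can continue cleanly after the user toggles thinking off mid-conversation.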
 ## Config example

@@ -76,7 +80,7 @@ Both bundled models currently advertise streaming usage compatibility in source.
 env: { DEEPSEEK_API_KEY: "sk-..." },
 agents: {
   defaults: {
-    model: { primary: "deepseek/deepseek-chat" },
+    model: { primary: "deepseek/deepseek-v4-flash" },
   },
 },
 }
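Assembled, the post-change config example would read roughly as follows. This is a sketch reconstructed from the hunk; the enclosing object shape outside the diff context is assumed.

```json
{
  env: { DEEPSEEK_API_KEY: "sk-..." },
  agents: {
    defaults: {
      model: { primary: "deepseek/deepseek-v4-flash" },
    },
  },
}
```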