feat(openai): add websocket warm-up with configurable toggle

Peter Steinberger
2026-03-01 22:44:57 +00:00
parent bc9f357ad7
commit d1615eb35f
7 changed files with 296 additions and 5 deletions

@@ -68,6 +68,9 @@ You can set `agents.defaults.models.<provider/model>.params.transport`:
- `"websocket"`: force WebSocket
- `"auto"`: try WebSocket, then fall back to SSE
For `openai/*` (Responses API), OpenClaw also enables WebSocket warm-up by
default (`openaiWsWarmup: true`) when WebSocket transport is used.
```json5
{
agents: {
@@ -85,6 +88,47 @@ You can set `agents.defaults.models.<provider/model>.params.transport`:
}
```
### OpenAI WebSocket warm-up
OpenAI's docs describe warm-up as optional. OpenClaw enables it by default for
`openai/*` models to reduce first-turn latency when WebSocket transport is in use.
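As a combined sketch (using the `transport` param documented above), forcing WebSocket transport is enough to get warm-up; no explicit `openaiWsWarmup` is needed since it defaults to `true` for `openai/*`:
```json5
{
  agents: {
    defaults: {
      models: {
        "openai/gpt-5": {
          params: {
            // "websocket" forces WS; "auto" tries WS and falls back to SSE
            transport: "websocket",
            // openaiWsWarmup defaults to true here, so it can be omitted
          },
        },
      },
    },
  },
}
```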
### Disable warm-up
```json5
{
agents: {
defaults: {
models: {
"openai/gpt-5": {
params: {
openaiWsWarmup: false,
},
},
},
},
},
}
```
### Enable warm-up explicitly
```json5
{
agents: {
defaults: {
models: {
"openai/gpt-5": {
params: {
openaiWsWarmup: true,
},
},
},
},
},
}
```
### OpenAI Responses server-side compaction
For direct OpenAI Responses models (`openai/*` using `api: "openai-responses"` with