docs(openai): document GPT-5.5 defaults

Peter Steinberger
2026-04-23 20:01:02 +01:00
parent cd5bc2fc93
commit 89051c6bf6
33 changed files with 128 additions and 126 deletions


@@ -43,9 +43,9 @@ OpenClaw has separate routes for OpenAI and Codex-shaped access:
| Model ref | Runtime path | Use when |
| ---------------------- | -------------------------------------------- | ----------------------------------------------------------------------- |
-| `openai/gpt-5.4` | OpenAI provider through OpenClaw/PI plumbing | You want direct OpenAI Platform API access with `OPENAI_API_KEY`. |
-| `openai-codex/gpt-5.4` | OpenAI Codex OAuth provider through PI | You want ChatGPT/Codex OAuth without the Codex app-server harness. |
-| `codex/gpt-5.4` | Bundled Codex provider plus Codex harness | You want native Codex app-server execution for the embedded agent turn. |
+| `openai/gpt-5.5` | OpenAI provider through OpenClaw/PI plumbing | You want direct OpenAI Platform API access with `OPENAI_API_KEY`. |
+| `openai-codex/gpt-5.5` | OpenAI Codex OAuth provider through PI | You want ChatGPT/Codex OAuth without the Codex app-server harness. |
+| `codex/gpt-5.5` | Bundled Codex provider plus Codex harness | You want native Codex app-server execution for the embedded agent turn. |
The Codex harness only claims `codex/*` model refs. Existing `openai/*`,
`openai-codex/*`, Anthropic, Gemini, xAI, local, and custom provider refs keep
@@ -82,7 +82,7 @@ uses.
## Minimal config
-Use `codex/gpt-5.4`, enable the bundled plugin, and force the `codex` harness:
+Use `codex/gpt-5.5`, enable the bundled plugin, and force the `codex` harness:
```json5
{
@@ -95,7 +95,7 @@ Use `codex/gpt-5.4`, enable the bundled plugin, and force the `codex` harness:
},
agents: {
defaults: {
-model: "codex/gpt-5.4",
+model: "codex/gpt-5.5",
embeddedHarness: {
runtime: "codex",
fallback: "none",
@@ -141,13 +141,13 @@ everything else:
agents: {
defaults: {
model: {
-primary: "codex/gpt-5.4",
-fallbacks: ["openai/gpt-5.4", "anthropic/claude-opus-4-6"],
+primary: "codex/gpt-5.5",
+fallbacks: ["openai/gpt-5.5", "anthropic/claude-opus-4-6"],
},
models: {
-"codex/gpt-5.4": { alias: "codex" },
+"codex/gpt-5.5": { alias: "codex" },
"codex/gpt-5.4-mini": { alias: "codex-mini" },
-"openai/gpt-5.4": { alias: "gpt" },
+"openai/gpt-5.5": { alias: "gpt" },
"anthropic/claude-opus-4-6": { alias: "opus" },
},
embeddedHarness: {
@@ -161,8 +161,8 @@ everything else:
With this shape:
-- `/model codex` or `/model codex/gpt-5.4` uses the Codex app-server harness.
-- `/model gpt` or `/model openai/gpt-5.4` uses the OpenAI provider path.
+- `/model codex` or `/model codex/gpt-5.5` uses the Codex app-server harness.
+- `/model gpt` or `/model openai/gpt-5.5` uses the OpenAI provider path.
- `/model opus` uses the Anthropic provider path.
- If a non-Codex model is selected, PI remains the compatibility harness.
@@ -175,7 +175,7 @@ the Codex harness:
{
agents: {
defaults: {
-model: "codex/gpt-5.4",
+model: "codex/gpt-5.5",
embeddedHarness: {
runtime: "codex",
fallback: "none",
@@ -220,7 +220,7 @@ auto-selection:
{
id: "codex",
name: "Codex",
-model: "codex/gpt-5.4",
+model: "codex/gpt-5.5",
embeddedHarness: {
runtime: "codex",
fallback: "none",
@@ -241,7 +241,7 @@ and lets the next turn resolve the harness from current config again.
By default, the Codex plugin asks the app-server for available models. If
discovery fails or times out, it uses the bundled fallback catalog:
-- `codex/gpt-5.4`
+- `codex/gpt-5.5`
- `codex/gpt-5.4-mini`
- `codex/gpt-5.2`
@@ -459,7 +459,7 @@ Remote app-server with explicit headers:
Model switching stays OpenClaw-controlled. When an OpenClaw session is attached
to an existing Codex thread, the next turn sends the currently selected
`codex/*` model, provider, approval policy, sandbox, and service tier to
-app-server again. Switching from `codex/gpt-5.4` to `codex/gpt-5.2` keeps the
+app-server again. Switching from `codex/gpt-5.5` to `codex/gpt-5.2` keeps the
thread binding but asks Codex to continue with the newly selected model.
## Codex command