mirror of https://github.com/openclaw/openclaw.git
synced 2026-05-06 17:10:49 +00:00

docs(openai): document GPT-5.5 defaults
@@ -43,9 +43,9 @@ OpenClaw has separate routes for OpenAI and Codex-shaped access:

 | Model ref              | Runtime path                                 | Use when                                                                |
 | ---------------------- | -------------------------------------------- | ----------------------------------------------------------------------- |
-| `openai/gpt-5.4`       | OpenAI provider through OpenClaw/PI plumbing | You want direct OpenAI Platform API access with `OPENAI_API_KEY`.       |
-| `openai-codex/gpt-5.4` | OpenAI Codex OAuth provider through PI       | You want ChatGPT/Codex OAuth without the Codex app-server harness.      |
-| `codex/gpt-5.4`        | Bundled Codex provider plus Codex harness    | You want native Codex app-server execution for the embedded agent turn. |
+| `openai/gpt-5.5`       | OpenAI provider through OpenClaw/PI plumbing | You want direct OpenAI Platform API access with `OPENAI_API_KEY`.       |
+| `openai-codex/gpt-5.5` | OpenAI Codex OAuth provider through PI       | You want ChatGPT/Codex OAuth without the Codex app-server harness.      |
+| `codex/gpt-5.5`        | Bundled Codex provider plus Codex harness    | You want native Codex app-server execution for the embedded agent turn. |

 The Codex harness only claims `codex/*` model refs. Existing `openai/*`,
 `openai-codex/*`, Anthropic, Gemini, xAI, local, and custom provider refs keep
@@ -82,7 +82,7 @@ uses.

 ## Minimal config

-Use `codex/gpt-5.4`, enable the bundled plugin, and force the `codex` harness:
+Use `codex/gpt-5.5`, enable the bundled plugin, and force the `codex` harness:

 ```json5
 {
@@ -95,7 +95,7 @@ Use `codex/gpt-5.4`, enable the bundled plugin, and force the `codex` harness:
   },
   agents: {
     defaults: {
-      model: "codex/gpt-5.4",
+      model: "codex/gpt-5.5",
       embeddedHarness: {
         runtime: "codex",
         fallback: "none",
@@ -141,13 +141,13 @@ everything else:
   agents: {
     defaults: {
       model: {
-        primary: "codex/gpt-5.4",
-        fallbacks: ["openai/gpt-5.4", "anthropic/claude-opus-4-6"],
+        primary: "codex/gpt-5.5",
+        fallbacks: ["openai/gpt-5.5", "anthropic/claude-opus-4-6"],
       },
       models: {
-        "codex/gpt-5.4": { alias: "codex" },
+        "codex/gpt-5.5": { alias: "codex" },
         "codex/gpt-5.4-mini": { alias: "codex-mini" },
-        "openai/gpt-5.4": { alias: "gpt" },
+        "openai/gpt-5.5": { alias: "gpt" },
         "anthropic/claude-opus-4-6": { alias: "opus" },
       },
       embeddedHarness: {
@@ -161,8 +161,8 @@ everything else:

 With this shape:

-- `/model codex` or `/model codex/gpt-5.4` uses the Codex app-server harness.
-- `/model gpt` or `/model openai/gpt-5.4` uses the OpenAI provider path.
+- `/model codex` or `/model codex/gpt-5.5` uses the Codex app-server harness.
+- `/model gpt` or `/model openai/gpt-5.5` uses the OpenAI provider path.
 - `/model opus` uses the Anthropic provider path.
 - If a non-Codex model is selected, PI remains the compatibility harness.
@@ -175,7 +175,7 @@ the Codex harness:
 {
   agents: {
     defaults: {
-      model: "codex/gpt-5.4",
+      model: "codex/gpt-5.5",
       embeddedHarness: {
         runtime: "codex",
         fallback: "none",
@@ -220,7 +220,7 @@ auto-selection:
     {
       id: "codex",
       name: "Codex",
-      model: "codex/gpt-5.4",
+      model: "codex/gpt-5.5",
       embeddedHarness: {
         runtime: "codex",
         fallback: "none",
@@ -241,7 +241,7 @@ and lets the next turn resolve the harness from current config again.
 By default, the Codex plugin asks the app-server for available models. If
 discovery fails or times out, it uses the bundled fallback catalog:

-- `codex/gpt-5.4`
+- `codex/gpt-5.5`
 - `codex/gpt-5.4-mini`
 - `codex/gpt-5.2`
@@ -459,7 +459,7 @@ Remote app-server with explicit headers:
 Model switching stays OpenClaw-controlled. When an OpenClaw session is attached
 to an existing Codex thread, the next turn sends the currently selected
 `codex/*` model, provider, approval policy, sandbox, and service tier to
-app-server again. Switching from `codex/gpt-5.4` to `codex/gpt-5.2` keeps the
+app-server again. Switching from `codex/gpt-5.5` to `codex/gpt-5.2` keeps the
 thread binding but asks Codex to continue with the newly selected model.

 ## Codex command
@@ -483,7 +483,7 @@ Each channel entry can include:
 ## modelSupport reference

 Use `modelSupport` when OpenClaw should infer your provider plugin from
-shorthand model ids like `gpt-5.4` or `claude-sonnet-4.6` before plugin runtime
+shorthand model ids like `gpt-5.5` or `claude-sonnet-4.6` before plugin runtime
 loads.

 ```json
@@ -122,7 +122,7 @@ OpenClaw. The harness then claims that provider in `supports(...)`.
 The bundled Codex plugin follows this pattern:

 - provider id: `codex`
-- user model refs: `codex/gpt-5.4`, `codex/gpt-5.2`, or another model returned
+- user model refs: `codex/gpt-5.5`, `codex/gpt-5.2`, or another model returned
   by the Codex app server
 - harness id: `codex`
 - auth: synthetic provider availability, because the Codex harness owns the
@@ -189,7 +189,7 @@ For Codex-only embedded runs:
 {
   "agents": {
     "defaults": {
-      "model": "codex/gpt-5.4",
+      "model": "codex/gpt-5.5",
       "embeddedHarness": {
         "runtime": "codex",
         "fallback": "none"
@@ -230,7 +230,7 @@ Per-agent overrides use the same shape:
     "list": [
       {
         "id": "codex-only",
-        "model": "codex/gpt-5.4",
+        "model": "codex/gpt-5.5",
         "embeddedHarness": {
           "runtime": "codex",
           "fallback": "none"