docs(openai): canonicalize GPT model refs

This commit is contained in:
Peter Steinberger
2026-04-23 20:38:45 +01:00
parent 17830983ce
commit a8173276bf
14 changed files with 104 additions and 118 deletions


@@ -1257,7 +1257,7 @@ Codex app-server harness.
{
agents: {
defaults: {
-model: "codex/gpt-5.5",
+model: "openai/gpt-5.5",
embeddedHarness: {
runtime: "codex",
fallback: "none",
@@ -1270,7 +1270,7 @@ Codex app-server harness.
- `runtime`: `"auto"`, `"pi"`, or a registered plugin harness id. The bundled Codex plugin registers `codex`.
- `fallback`: `"pi"` or `"none"`. `"pi"` keeps the built-in PI harness as the compatibility fallback when no plugin harness is selected. `"none"` makes missing or unsupported plugin harness selection fail instead of silently using PI. Selected plugin harness failures always surface directly.
- Environment overrides: `OPENCLAW_AGENT_RUNTIME=<id|auto|pi>` overrides `runtime`; `OPENCLAW_AGENT_HARNESS_FALLBACK=none` disables PI fallback for that process.
-- For Codex-only deployments, set `model: "codex/gpt-5.5"`, `embeddedHarness.runtime: "codex"`, and `embeddedHarness.fallback: "none"`.
+- For Codex-only deployments, set `model: "openai/gpt-5.5"`, `embeddedHarness.runtime: "codex"`, and `embeddedHarness.fallback: "none"`.
- Harness choice is pinned per session id after the first embedded run. Config/env changes affect new or reset sessions, not an existing transcript. Legacy sessions with transcript history but no recorded pin are treated as PI-pinned. `/status` shows non-PI harness ids such as `codex` next to `Fast`.
- This only controls the embedded chat harness. Media generation, vision, PDF, music, video, and TTS still use their provider/model settings.
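Taken together, the Codex-only settings described in the bullets above would look roughly like the following config sketch. This mirrors the field names shown in the diff's own snippet and assumes the same JSON5-style config syntax; it is an illustration, not a verbatim excerpt from the docs.

```json5
{
  agents: {
    defaults: {
      // Canonical provider-prefixed model id (the change this commit makes)
      model: "openai/gpt-5.5",
      embeddedHarness: {
        runtime: "codex",  // always select the bundled Codex plugin harness
        fallback: "none",  // fail fast rather than silently falling back to PI
      },
    },
  },
}
```

Per the environment-override bullet, `OPENCLAW_AGENT_RUNTIME=codex` together with `OPENCLAW_AGENT_HARNESS_FALLBACK=none` should achieve the same effect for a single process without editing the config file.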