docs: clarify Codex subscription runtime (#75910)

This commit is contained in:
pashpashpash
2026-05-01 19:33:20 -07:00
committed by GitHub
parent f6cb44a5a3
commit 9fb90f3d29
10 changed files with 143 additions and 101 deletions


@@ -37,17 +37,17 @@ There are two runtime families:
through Claude CLI." `claude-cli` is not an embedded harness id and must not
be passed to AgentHarness selection.
## Codex surfaces
Most confusion comes from several different surfaces sharing the Codex name:
| Surface | OpenClaw name/config | What it does |
| ---------------------------------------------------- | ------------------------------------------ | ---------------------------------------------------------------------------------------------------------- |
| Native Codex app-server runtime | `openai/*` plus `agentRuntime.id: "codex"` | Runs the embedded agent turn through Codex app-server. This is the usual ChatGPT/Codex subscription setup. |
| Codex OAuth provider route | `openai-codex/*` model refs | Uses ChatGPT/Codex subscription OAuth through the normal OpenClaw PI runner. |
| Codex ACP adapter | `runtime: "acp"`, `agentId: "codex"` | Runs Codex through the external ACP/acpx control plane. Use only when ACP/acpx is explicitly asked. |
| Native Codex chat-control command set | `/codex ...` | Binds, resumes, steers, stops, and inspects Codex app-server threads from chat. |
| OpenAI Platform API route for GPT/Codex-style models | `openai/*` model refs | Uses OpenAI API-key auth unless a runtime override, such as `agentRuntime.id: "codex"`, runs the turn. |
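To make the table concrete, here is a minimal config sketch contrasting the two subscription routes; the `gpt-5.5` model name simply follows the examples used elsewhere in these docs:

```json5
{
  agents: {
    defaults: {
      // Native Codex app-server runtime (the usual subscription setup):
      // keep the canonical openai/* model ref and force the codex runtime.
      model: { primary: "openai/gpt-5.5" },
      agentRuntime: { id: "codex" },
      // For the Codex OAuth route through the normal PI runner instead,
      // set model.primary to "openai-codex/gpt-5.5" and omit agentRuntime.
    },
  },
}
```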
Those surfaces are intentionally independent. Enabling the `codex` plugin makes
the native app-server features available; it does not rewrite
@@ -55,7 +55,8 @@ the native app-server features available; it does not rewrite
not make ACP the Codex default. Selecting `openai-codex/*` means "use the Codex
OAuth provider route" unless you separately force a runtime.
The common ChatGPT/Codex subscription setup uses Codex OAuth for auth, but keeps
the model ref as `openai/*` and selects the `codex` runtime:
```json5
{
@@ -71,8 +72,9 @@ The common Codex setup uses the `openai` provider with the `codex` runtime:
```
runtime to run the embedded agent turn. It does not mean "use API billing," and
it does not mean the channel, model provider catalog, or OpenClaw session store
becomes Codex.
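Spelled out, that common shape is a small config sketch:

```json5
{
  agents: {
    defaults: {
      // Canonical OpenAI model ref; does not imply API-key billing here.
      model: { primary: "openai/gpt-5.5" },
      // Run the embedded turn through the native Codex app-server.
      agentRuntime: { id: "codex", fallback: "none" },
    },
  },
}
```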
When the bundled `codex` plugin is enabled, natural-language Codex control
should use the native `/codex` command surface (`/codex bind`, `/codex threads`,
@@ -85,7 +87,8 @@ This is the agent-facing decision tree:
1. If the user asks for **Codex bind/control/thread/resume/steer/stop**, use the
native `/codex` command surface when the bundled `codex` plugin is enabled.
2. If the user asks for **Codex as the embedded runtime** or wants the normal
subscription-backed Codex agent experience, use
`openai/<model>` with `agentRuntime.id: "codex"`.
3. If the user asks for **Codex OAuth/subscription auth on the normal OpenClaw
runner**, use `openai-codex/<model>` and leave the runtime as PI.
@@ -142,10 +145,10 @@ OpenClaw chooses an embedded runtime after provider and model resolution:
`fallback: "none"` to make unmatched `auto`-mode selection fail instead.
Explicit plugin runtimes fail closed by default. For example,
`agentRuntime.id: "codex"` means Codex or a clear selection error unless you set
`fallback: "pi"` in the same override scope. A runtime override does not inherit
a broader fallback setting, so an agent-level `agentRuntime.id: "codex"` is not
silently routed back to PI just because defaults used `fallback: "pi"`.
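As a sketch of the fail-closed default:

```json5
{
  agents: {
    defaults: {
      // Explicit plugin runtime: fails closed (Codex or a clear error)...
      agentRuntime: { id: "codex" },
      // ...unless fallback is set in this same override scope:
      // agentRuntime: { id: "codex", fallback: "pi" },
    },
  },
}
```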
CLI backend aliases are different from embedded harness ids. The preferred
Claude CLI form is:


@@ -29,15 +29,15 @@ Reference for **LLM/model providers** (not chat channels like WhatsApp/Telegram)
<Accordion title="OpenAI provider/runtime split">
OpenAI-family routes are prefix-specific:
- `openai/<model>` plus `agents.defaults.agentRuntime.id: "codex"` uses the native Codex app-server harness. This is the usual ChatGPT/Codex subscription setup.
- `openai-codex/<model>` uses Codex OAuth in PI.
- `openai/<model>` without a Codex runtime override uses the direct OpenAI API-key provider in PI.
See [OpenAI](/providers/openai) and [Codex harness](/plugins/codex-harness). If the provider/runtime split is confusing, read [Agent runtimes](/concepts/agent-runtimes) first.
Plugin auto-enable follows the same boundary: `openai-codex/<model>` belongs to the OpenAI plugin, while the Codex plugin is enabled by `agentRuntime.id: "codex"` or legacy `codex/<model>` refs.
GPT-5.5 is available through the native Codex app-server harness when `agentRuntime.id: "codex"` is set, through `openai-codex/gpt-5.5` in PI for Codex OAuth, and through `openai/gpt-5.5` in PI for direct API-key traffic when your account exposes it.
</Accordion>
<Accordion title="CLI runtimes">
@@ -148,11 +148,18 @@ Anthropic staff told us OpenClaw-style Claude CLI usage is allowed again, so Ope
- Shares the same `/fast` toggle and `params.fastMode` config as direct `openai/*`; OpenClaw maps that to `service_tier=priority`
- `openai-codex/gpt-5.5` uses the Codex catalog native `contextWindow = 400000` and default runtime `contextTokens = 272000`; override the runtime cap with `models.providers.openai-codex.models[].contextTokens`
- Policy note: OpenAI Codex OAuth is explicitly supported for external tools/workflows like OpenClaw.
- For the common subscription plus native Codex runtime route, sign in with `openai-codex` auth but configure `openai/gpt-5.5` plus `agents.defaults.agentRuntime.id: "codex"`.
- Use `openai-codex/gpt-5.5` only when you want the Codex OAuth/subscription route through PI; use `openai/gpt-5.5` without the Codex runtime override when your API-key setup and local catalog expose the public API route.
```json5
{
agents: {
defaults: {
model: { primary: "openai/gpt-5.5" },
agentRuntime: { id: "codex", fallback: "none" },
},
},
}
```
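If you need a different runtime context cap on the PI OAuth route, the `contextTokens` override mentioned above can be sketched like this; the per-entry `id` field name is an assumption, so check your local config schema:

```json5
{
  models: {
    providers: {
      "openai-codex": {
        models: [
          // Raise the default runtime cap of 272000 toward the native
          // 400000 contextWindow; "id" is an assumed field name here.
          { id: "gpt-5.5", contextTokens: 320000 },
        ],
      },
    },
  },
}
```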


@@ -23,7 +23,7 @@ sidebarTitle: "Models CLI"
</Card>
</CardGroup>
Model refs choose a provider and model. They do not usually choose the low-level agent runtime. For example, `openai/gpt-5.5` can run through the normal OpenAI provider path or through the Codex app-server runtime, depending on `agents.defaults.agentRuntime.id`. In Codex runtime mode, the `openai/gpt-*` ref does not imply API-key billing; auth can come from a Codex account or `openai-codex` auth profile. See [Agent runtimes](/concepts/agent-runtimes).
## How model selection works


@@ -260,7 +260,7 @@ That stages grounded durable candidates into the short-term dreaming store while
Doctor does not repair this automatically because both routes are valid:
- `openai-codex/*` + PI means "use Codex OAuth/subscription auth through the normal OpenClaw runner."
- `openai/*` + `agentRuntime.id: "codex"` means "run the embedded turn through native Codex app-server."
- `/codex ...` means "control or bind a native Codex conversation from chat."
- `/acp ...` or `runtime: "acp"` means "use the external ACP/acpx adapter."


@@ -594,10 +594,11 @@ and troubleshooting see the main [FAQ](/help/faq).
<Accordion title="How does Codex auth work?">
OpenClaw supports **OpenAI Code (Codex)** via OAuth (ChatGPT sign-in). Use
`openai/gpt-5.5` with `agentRuntime.id: "codex"` for the common setup:
ChatGPT/Codex subscription auth plus native Codex app-server execution. Use
`openai-codex/gpt-5.5` only when you want Codex OAuth through the default
PI runner. Use `openai/gpt-5.5` without the Codex runtime override for
direct OpenAI API-key access.
See [Model providers](/concepts/model-providers) and [Onboarding (CLI)](/start/wizard).
</Accordion>
@@ -605,15 +606,17 @@ and troubleshooting see the main [FAQ](/help/faq).
`openai-codex` is the provider and auth-profile id for ChatGPT/Codex OAuth.
It is also the explicit PI model prefix for Codex OAuth:
- `openai/gpt-5.5` + `agentRuntime.id: "codex"` = ChatGPT/Codex subscription auth with native Codex runtime
- `openai-codex/gpt-5.5` = Codex OAuth route in PI
- `openai/gpt-5.5` without a Codex runtime override = direct OpenAI API-key route in PI
- `openai-codex:...` = auth profile id, not a model ref
If you want the direct OpenAI Platform billing/limit path, set
`OPENAI_API_KEY`. If you want ChatGPT/Codex subscription auth, sign in with
`openclaw models auth login --provider openai-codex`. For native Codex
runtime, keep the model ref as `openai/gpt-5.5` and set
`agentRuntime.id: "codex"`. Use `openai-codex/*` model refs only for PI
runs.
</Accordion>


@@ -145,11 +145,12 @@ troubleshooting, see the main [FAQ](/help/faq).
</Accordion>
<Accordion title="Can I use GPT 5.5 for daily tasks and Codex 5.5 for coding?">
Yes. Treat model choice and runtime choice separately:
- **Native Codex coding agent:** set `agents.defaults.model.primary` to `openai/gpt-5.5` and `agents.defaults.agentRuntime.id` to `"codex"`. Sign in with `openclaw models auth login --provider openai-codex` when you want ChatGPT/Codex subscription auth.
- **Direct OpenAI API tasks through PI:** use `/model openai/gpt-5.5` without a Codex runtime override and configure `OPENAI_API_KEY`.
- **Codex OAuth through PI:** use `/model openai-codex/gpt-5.5` only when you intentionally want the normal PI runner with Codex OAuth.
- **Sub-agents:** route coding tasks to a Codex-only agent with its own model and `agentRuntime` default.
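One way the sub-agent split could look; the `overrides.coder` scope below is purely illustrative, since the exact per-agent key path depends on your OpenClaw version:

```json5
{
  agents: {
    // Daily-driver default: normal PI runner with an OpenAI model.
    defaults: { model: { primary: "openai/gpt-5.5" } },
    // Hypothetical named-agent scope: a coding agent pinned to the
    // native Codex runtime, failing closed if the harness is missing.
    overrides: {
      coder: {
        model: { primary: "openai/gpt-5.5" },
        agentRuntime: { id: "codex", fallback: "none" },
      },
    },
  },
}
```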
See [Models](/concepts/models) and [Slash commands](/tools/slash-commands).


@@ -33,9 +33,19 @@ Discord, Slack, or another channel remains the communication surface.
## Quick config
Most users who want "Codex in OpenClaw" want this route: sign in with a
ChatGPT/Codex subscription, then run embedded agent turns through the native
Codex app-server runtime. The model ref still stays canonical as
`openai/gpt-*`; subscription auth comes from the Codex account/profile, not
from an `openai-codex/*` model prefix.
First sign in with Codex OAuth if you have not already:
```bash
openclaw models auth login --provider openai-codex
```
Then enable the bundled `codex` plugin and force the Codex runtime:
```json5
{
@@ -73,9 +83,9 @@ If your config uses `plugins.allow`, include `codex` there too:
}
```
Do not use `openai-codex/gpt-*` when you mean native Codex runtime. That prefix
is the explicit "Codex OAuth through PI" route. Config changes apply to new or
reset sessions; existing sessions keep their recorded runtime.
## What this plugin changes
@@ -140,13 +150,13 @@ native app-server execution stays an explicit runtime choice.
Use this table before changing config:
| Desired behavior | Model ref | Runtime config | Auth/profile route | Expected status label |
| ---------------------------------------------------- | -------------------------- | -------------------------------------- | ---------------------------- | ------------------------------ |
| ChatGPT/Codex subscription with native Codex runtime | `openai/gpt-*` | `agentRuntime.id: "codex"` | Codex OAuth or Codex account | `Runtime: OpenAI Codex` |
| OpenAI API through normal OpenClaw runner | `openai/gpt-*` | omitted or `runtime: "pi"` | OpenAI API key | `Runtime: OpenClaw Pi Default` |
| ChatGPT/Codex subscription through PI | `openai-codex/gpt-*` | omitted or `runtime: "pi"` | OpenAI Codex OAuth provider | `Runtime: OpenClaw Pi Default` |
| Mixed providers with conservative auto mode | provider-specific refs | `agentRuntime.id: "auto"` | Per selected provider | Depends on selected runtime |
| Explicit Codex ACP adapter session | ACP prompt/model dependent | `sessions_spawn` with `runtime: "acp"` | ACP backend auth | ACP task/session status |
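For the conservative auto-mode row, a minimal sketch:

```json5
{
  agents: {
    defaults: {
      // Let plugin runtimes claim matching providers automatically, but
      // fail instead of silently dropping to PI when nothing matches.
      agentRuntime: { id: "auto", fallback: "none" },
    },
  },
}
```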
The important split is provider versus runtime:
@@ -159,20 +169,20 @@ The important split is provider versus runtime:
## Pick the right model prefix
OpenAI-family routes are prefix-specific. For the common subscription plus
native Codex runtime setup, use `openai/*` with `agentRuntime.id: "codex"`.
Use `openai-codex/*` only when you intentionally want Codex OAuth through PI:
| Model ref | Runtime path | Use when |
| --------------------------------------------- | -------------------------------------------- | ------------------------------------------------------------------------- |
| `openai/gpt-5.4` | OpenAI provider through OpenClaw/PI plumbing | You want current direct OpenAI Platform API access with `OPENAI_API_KEY`. |
| `openai-codex/gpt-5.5` | OpenAI Codex OAuth through OpenClaw/PI | You want ChatGPT/Codex subscription auth with the default PI runner. |
| `openai/gpt-5.5` + `agentRuntime.id: "codex"` | Codex app-server harness | You want ChatGPT/Codex subscription auth with native Codex execution. |
GPT-5.5 can appear on both direct OpenAI API-key and Codex subscription routes
when your account exposes them. Use `openai/gpt-5.5` with the Codex app-server
harness for native Codex runtime, `openai-codex/gpt-5.5` for PI OAuth, or
`openai/gpt-5.5` without a Codex runtime override for direct API-key traffic.
Legacy `codex/gpt-*` refs remain accepted as compatibility aliases. Doctor
compatibility migration rewrites legacy primary runtime refs to canonical model
@@ -314,17 +324,17 @@ With this shape:
Agents should route user requests by intent, not by the word "Codex" alone:
| User asks for... | Agent should use... |
| ------------------------------------------------------ | ------------------------------------------------ |
| "Bind this chat to Codex" | `/codex bind` |
| "Resume Codex thread `<id>` here" | `/codex resume <id>` |
| "Show Codex threads" | `/codex threads` |
| "File a support report for a bad Codex run" | `/diagnostics [note]` |
| "Only send Codex feedback for this attached thread" | `/codex diagnostics [note]` |
| "Use my ChatGPT/Codex subscription with Codex runtime" | `openai/*` plus `agentRuntime.id: "codex"` |
| "Use my ChatGPT/Codex subscription through PI" | `openai-codex/*` model refs |
| "Run Codex through ACP/acpx" | ACP `sessions_spawn({ runtime: "acp", ... })` |
| "Start Claude Code/Gemini/OpenCode/Cursor in a thread" | ACP/acpx, not `/codex` and not native sub-agents |
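For the ACP row, only `runtime` and `agentId` are documented here; any other spawn fields would come from your ACP/acpx setup. A sketch of the argument shape:

```json5
// Argument object for sessions_spawn({ ... }); fields beyond runtime
// and agentId are intentionally left out.
{
  runtime: "acp",
  agentId: "codex",
}
```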
OpenClaw only advertises ACP spawn guidance to agents when ACP is enabled,
dispatchable, and backed by a loaded runtime backend. If ACP is not available,


@@ -216,10 +216,10 @@ to PI.
In `auto` mode, set `fallback: "none"` when you need missing plugin harness
selection to fail instead of using PI. Explicit plugin runtimes such as
`agentRuntime.id: "codex"` already fail closed by default, unless
`fallback: "pi"` is set in the same config or environment override scope.
Selected plugin harness failures always fail hard. This does not block an
explicit `agentRuntime.id: "pi"` or `OPENCLAW_AGENT_RUNTIME=pi`.
For Codex-only embedded runs:


@@ -11,13 +11,14 @@ OpenAI provides developer APIs for GPT models, and Codex is also available as a
ChatGPT-plan coding agent through OpenAI's Codex clients. OpenClaw keeps those
surfaces separate so config stays predictable.
OpenClaw supports three OpenAI-family routes. Most ChatGPT/Codex subscribers
who want Codex behavior should use the native Codex app-server runtime. The
model prefix selects the provider/model name; a separate runtime setting selects
who executes the embedded agent loop:
- **API key** - direct OpenAI Platform access with usage-based billing (`openai/*` models)
- **Codex subscription with native Codex runtime** - ChatGPT/Codex sign-in plus Codex app-server execution (`openai/*` models plus `agents.defaults.agentRuntime.id: "codex"`)
- **Codex subscription through PI** - ChatGPT/Codex sign-in with the normal OpenClaw PI runner (`openai-codex/*` models)
OpenAI explicitly supports subscription OAuth usage in external tools and workflows like OpenClaw.
@@ -27,13 +28,13 @@ changing config.
## Quick choice
| Goal | Use | Notes |
| ---------------------------------------------------- | ------------------------------------------------ | ------------------------------------------------------------------------- |
| ChatGPT/Codex subscription with native Codex runtime | `openai/gpt-5.5` plus `agentRuntime.id: "codex"` | Recommended Codex setup for most users. Sign in with `openai-codex` auth. |
| Direct API-key billing | `openai/gpt-5.5` | Set `OPENAI_API_KEY` or run OpenAI API-key onboarding. |
| ChatGPT/Codex subscription auth through PI | `openai-codex/gpt-5.5` | Use only when you intentionally want the normal PI runner. |
| Image generation or editing | `openai/gpt-image-2` | Works with either `OPENAI_API_KEY` or OpenAI Codex OAuth. |
| Transparent-background images | `openai/gpt-image-1.5` | Use `outputFormat=png` or `webp` and `openai.background=transparent`. |
## Naming map
@@ -55,10 +56,10 @@ combination so you can confirm it is intentional; it does not rewrite it.
<Note>
GPT-5.5 is available through both direct OpenAI Platform API-key access and
subscription/OAuth routes. For ChatGPT/Codex subscription plus native Codex
execution, use `openai/gpt-5.5` with `agentRuntime.id: "codex"`. Use
`openai-codex/gpt-5.5` only for Codex OAuth through PI, or `openai/gpt-5.5`
without a Codex runtime override for direct `OPENAI_API_KEY` traffic.
</Note>
<Note>
@@ -171,7 +172,7 @@ Choose your preferred auth method and follow the setup steps.
</Tab>
<Tab title="Codex subscription">
**Best for:** using your ChatGPT/Codex subscription with native Codex app-server execution instead of a separate API key. Codex cloud requires ChatGPT sign-in.
<Steps>
<Step title="Run Codex OAuth">
@@ -191,15 +192,20 @@ Choose your preferred auth method and follow the setup steps.
openclaw models auth login --provider openai-codex --device-code
```
</Step>
<Step title="Use the native Codex runtime">
```bash
openclaw config set plugins.entries.codex '{ enabled: true }' --strict-json --merge
openclaw config set agents.defaults.model.primary openai/gpt-5.5
openclaw config set agents.defaults.agentRuntime '{ id: "codex", fallback: "none" }' --strict-json
```
</Step>
<Step title="Verify Codex auth is available">
```bash
openclaw models list --provider openai-codex
```
After the gateway is running, send `/codex status` or `/codex models`
in chat to verify the native app-server runtime.
</Step>
</Steps>
@@ -207,25 +213,37 @@ Choose your preferred auth method and follow the setup steps.
| Model ref | Runtime config | Route | Auth |
|-----------|----------------|-------|------|
| `openai/gpt-5.5` | `agentRuntime.id: "codex"` | Native Codex app-server harness | Codex sign-in or selected `openai-codex` profile |
| `openai-codex/gpt-5.5` | omitted / `runtime: "pi"` | ChatGPT/Codex OAuth through PI | Codex sign-in |
| `openai-codex/gpt-5.4-mini` | omitted / `runtime: "pi"` | ChatGPT/Codex OAuth through PI | Codex sign-in |
| `openai-codex/gpt-5.5` | `runtime: "auto"` | Still PI unless a plugin explicitly claims `openai-codex` | Codex sign-in |
<Note>
Keep using the `openai-codex` provider id for auth/profile commands. The
`openai-codex/*` model prefix is also the explicit PI route for Codex OAuth.
It does not select or auto-enable the bundled Codex app-server harness. For
the common subscription plus native runtime setup, sign in with
`openai-codex` but keep the model ref as `openai/gpt-5.5` and set
`agentRuntime.id: "codex"`.
</Note>
### Config example
```json5
{
agents: {
defaults: {
model: { primary: "openai/gpt-5.5" },
agentRuntime: { id: "codex", fallback: "none" },
},
},
}
```
To keep Codex OAuth on the normal PI runner instead, use
`openai-codex/gpt-5.5` and omit the Codex runtime override.
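That PI-only alternative is the smaller config shape:

```json5
{
  // Codex OAuth through the normal PI runner: explicit openai-codex
  // model prefix, no agentRuntime override.
  agents: { defaults: { model: { primary: "openai-codex/gpt-5.5" } } },
}
```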
<Note>
Onboarding no longer imports OAuth material from `~/.codex`. Sign in with browser OAuth (default) or the device-code flow above — OpenClaw manages the resulting credentials in its own agent auth store.
</Note>
@@ -241,12 +259,11 @@ Choose your preferred auth method and follow the setup steps.
### Doctor warning
If the bundled `codex` plugin is enabled while an `openai-codex/*` route is
selected, `openclaw doctor` warns that the model still resolves through PI.
Keep the config unchanged only when that PI subscription-auth route is
intentional. Switch to `openai/<model>` plus `agentRuntime.id: "codex"` when
you want native Codex app-server execution.
### Context window cap