mirror of
https://github.com/openclaw/openclaw.git
synced 2026-05-06 18:10:45 +00:00
docs: clarify codex runtime routing
@@ -27,6 +27,24 @@ harness implements the `codex` runtime. The config key is still named
`embeddedHarness` for compatibility, but user-facing docs and status output
should generally say runtime.

## Three things named Codex

Most confusion comes from three different surfaces sharing the Codex name:

| Surface | OpenClaw name/config | What it does |
| ---------------------------------------------------- | ------------------------------------ | --------------------------------------------------------------------------------------------------- |
| Codex OAuth provider route | `openai-codex/*` model refs | Uses ChatGPT/Codex subscription OAuth through the normal OpenClaw PI runner. |
| Native Codex app-server runtime | `embeddedHarness.runtime: "codex"` | Runs the embedded agent turn through the bundled Codex app-server harness. |
| Codex ACP adapter | `runtime: "acp"`, `agentId: "codex"` | Runs Codex through the external ACP/acpx control plane. Use only when ACP/acpx is explicitly requested. |
| Native Codex chat-control command set | `/codex ...` | Binds, resumes, steers, stops, and inspects Codex app-server threads from chat. |
| OpenAI Platform API route for GPT/Codex-style models | `openai/*` model refs | Uses OpenAI API-key auth unless a runtime override, such as `runtime: "codex"`, runs the turn. |

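Read as config fragments, the rows above look roughly like this (a hedged sketch: only the quoted keys come from this table; the `model` key and the surrounding shape are assumptions, and the three objects are separate fragments, not one document):

```json5
// Codex OAuth provider route: subscription auth on the normal PI runner.
{ model: "openai-codex/<model>" }

// Native Codex app-server runtime: the embedded turn runs on the bundled harness.
{ model: "openai/<model>", embeddedHarness: { runtime: "codex" } }

// Codex ACP adapter: only when ACP/acpx is explicitly requested.
{ runtime: "acp", agentId: "codex" }
```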
Those surfaces are intentionally independent. Enabling the `codex` plugin makes
the native app-server features available; it does not rewrite
`openai-codex/*` into `openai/*`, does not change existing sessions, and does
not make ACP the Codex default. Selecting `openai-codex/*` means "use the Codex
OAuth provider route" unless you separately force a runtime.

The common Codex setup uses the `openai` provider with the `codex` runtime:

```json5
{
  // Sketch only: the commit's full example is elided by this diff hunk.
  // Keys below are the ones quoted elsewhere in this doc; the surrounding
  // shape is an assumption.
  model: "openai/<model>",
  embeddedHarness: {
    runtime: "codex",
  },
}
```

@@ -53,6 +71,19 @@ Codex only when the user explicitly asks for ACP/acpx or is testing the ACP
adapter path. Claude Code, Gemini CLI, OpenCode, Cursor, and similar external
harnesses still use ACP.

This is the agent-facing decision tree:

1. If the user asks for **Codex bind/control/thread/resume/steer/stop**, use the
native `/codex` command surface when the bundled `codex` plugin is enabled.
2. If the user asks for **Codex as the embedded runtime**, use
`openai/<model>` with `embeddedHarness.runtime: "codex"`.
3. If the user asks for **Codex OAuth/subscription auth on the normal OpenClaw
runner**, use `openai-codex/<model>` and leave the runtime as PI.
4. If the user explicitly says **ACP**, **acpx**, or **Codex ACP adapter**, use
ACP with `runtime: "acp"` and `agentId: "codex"`.
5. If the request is for **Claude Code, Gemini CLI, OpenCode, Cursor, Droid, or
   another external harness**, use ACP/acpx, not the native sub-agent runtime.

| You mean... | Use... |
| --------------------------------------- | -------------------------------------------- |
| Codex app-server chat/thread control | `/codex ...` from the bundled `codex` plugin |

@@ -106,6 +137,18 @@ Explicit plugin runtimes fail closed by default. For example,
a broader fallback setting, so an agent-level `runtime: "codex"` is not silently
routed back to PI just because defaults used `fallback: "pi"`.

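A minimal sketch of the fail-closed case (the `defaults`/`agents` nesting and the agent name are assumptions; only `fallback: "pi"` and `runtime: "codex"` come from this doc):

```json5
{
  defaults: {
    // A broad fallback that applies where no explicit runtime is set.
    fallback: "pi",
  },
  agents: {
    // Hypothetical agent: its explicit runtime fails closed if the codex
    // runtime is unavailable, rather than being silently routed back to PI.
    reviewer: { runtime: "codex" },
  },
}
```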
`auto` mode is intentionally conservative. Plugin runtimes can claim
provider/model pairs they understand, but the Codex plugin does not claim the
`openai-codex` provider in `auto` mode. That keeps
`openai-codex/*` as the explicit PI Codex OAuth route and avoids silently
moving subscription-auth configs onto the native app-server harness.

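Concretely, with the `codex` plugin enabled and runtime selection left to `auto` (a sketch; the `model` key is an assumption):

```json5
// Still the explicit PI Codex OAuth route: in `auto` mode the codex plugin
// does not claim the `openai-codex` provider, so nothing migrates this
// config onto the native app-server harness.
{ model: "openai-codex/<model>" }
```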
If `openclaw doctor` warns that the `codex` plugin is enabled while
`openai-codex/*` still routes through PI, treat that as a diagnosis, not a
migration. Keep the config unchanged when PI Codex OAuth is what you want.
Switch to `openai/<model>` plus `runtime: "codex"` only when you want native
Codex app-server execution.

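The keep-versus-switch choice above, as config fragments (a sketch; the `model` key is an assumption):

```json5
// Keep: PI Codex OAuth is what you want; the doctor warning is a diagnosis.
{ model: "openai-codex/<model>" }

// Switch: you want native Codex app-server execution.
{ model: "openai/<model>", runtime: "codex" }
```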
## Compatibility contract

When a runtime is not PI, it should document what OpenClaw surfaces it supports.