mirror of
https://github.com/openclaw/openclaw.git
synced 2026-05-06 12:10:42 +00:00
docs: clarify codex plugin auto-enable boundary
@@ -21,6 +21,9 @@ For model selection rules, see [/concepts/models](/concepts/models).
and `openai/<model>` plus `agents.defaults.embeddedHarness.runtime: "codex"`
uses the native Codex app-server harness. See [OpenAI](/providers/openai)
and [Codex harness](/plugins/codex-harness).
- Plugin auto-enable follows that same boundary: `openai-codex/<model>` belongs
  to the OpenAI plugin, while the Codex plugin is enabled by
  `embeddedHarness.runtime: "codex"` or legacy `codex/<model>` refs.
- GPT-5.5 is currently available through subscription/OAuth routes:
  `openai-codex/gpt-5.5` in PI or `openai/gpt-5.5` with the Codex app-server
  harness. The direct API-key route for `openai/gpt-5.5` is supported once
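The two routes on either side of that boundary can be sketched as config fragments (JSON5-style for readability; only the dotted key paths `agents.defaults.embeddedHarness.runtime` and the model refs appear in the text, so the surrounding nesting and the `model` key name are assumptions):

```json5
// Route A (hypothetical layout): PI route via the OpenAI plugin.
// An `openai-codex/<model>` ref alone does NOT enable the Codex app-server plugin.
{
  agents: {
    defaults: {
      model: "openai-codex/gpt-5.5",
    },
  },
}

// Route B (hypothetical layout): native Codex app-server harness.
// The harness runtime is what enables the bundled Codex plugin.
{
  agents: {
    defaults: {
      model: "openai/gpt-5.5",
      embeddedHarness: {
        runtime: "codex",
      },
    },
  },
}
```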
@@ -110,6 +113,9 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
- PI model ref: `openai-codex/gpt-5.5`
- Native Codex app-server harness ref: `openai/gpt-5.5` with `agents.defaults.embeddedHarness.runtime: "codex"`
- Legacy model refs: `codex/gpt-*`
- Plugin boundary: `openai-codex/*` loads the OpenAI plugin; the native Codex
  app-server plugin is selected only by the Codex harness runtime or legacy
  `codex/*` refs.
- CLI: `openclaw onboard --auth-choice openai-codex` or `openclaw models auth login --provider openai-codex`
- Default transport is `auto` (WebSocket-first, SSE fallback)
- Override per PI model via `agents.defaults.models["openai-codex/<model>"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
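The per-model transport override maps onto config roughly as follows (JSON5-style sketch; the nesting follows the dotted path `agents.defaults.models["openai-codex/<model>"].params.transport` quoted above, everything else is an assumption):

```json5
{
  agents: {
    defaults: {
      models: {
        // Per-PI-model transport override. Valid values per the text:
        // "sse", "websocket", or "auto" (default: WebSocket-first, SSE fallback).
        "openai-codex/gpt-5.5": {
          params: {
            transport: "websocket",
          },
        },
      },
    },
  },
}
```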
@@ -23,6 +23,13 @@ supported once OpenAI enables GPT-5.5 on the public API; until then use an
API-enabled model such as `openai/gpt-5.4` for `OPENAI_API_KEY` setups.
</Note>

<Note>
Enabling the OpenAI plugin, or selecting an `openai-codex/*` model, does not
enable the bundled Codex app-server plugin. OpenClaw enables that plugin only
when you explicitly select the native Codex harness with
`embeddedHarness.runtime: "codex"` or use a legacy `codex/*` model ref.
</Note>

## OpenClaw feature coverage

| OpenAI capability | OpenClaw surface | Status |
@@ -141,6 +148,7 @@ Choose your preferred auth method and follow the setup steps.
<Note>
Keep using the `openai-codex` provider id for auth/profile commands. The
`openai-codex/*` model prefix is also the explicit PI route for Codex OAuth.
It does not select or auto-enable the bundled Codex app-server harness.
</Note>

### Config example
@@ -189,6 +189,13 @@ OpenClaw scans for plugins in this order (first match wins):
- Workspace-origin plugins are **disabled by default** (must be explicitly enabled)
- Bundled plugins follow the built-in default-on set unless overridden
- Exclusive slots can force-enable the selected plugin for that slot
- Some bundled opt-in plugins are enabled automatically when config names a
  plugin-owned surface, such as a provider model ref, channel config, or harness
  runtime
- OpenAI-family Codex routes keep separate plugin boundaries:
  `openai-codex/*` belongs to the OpenAI plugin, while the bundled Codex
  app-server plugin is selected by `embeddedHarness.runtime: "codex"` or legacy
  `codex/*` model refs
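As a hedged illustration of that last bullet, each fragment below names a different plugin-owned surface (JSON5-style; the nesting and the `model` key are assumptions inferred from the dotted paths elsewhere in these docs):

```json5
// Names an `openai-codex/*` model ref: auto-enables only the OpenAI plugin.
{ agents: { defaults: { model: "openai-codex/gpt-5.5" } } }

// Either of these selects the bundled Codex app-server plugin:
{ agents: { defaults: { embeddedHarness: { runtime: "codex" } } } } // explicit harness runtime
{ agents: { defaults: { model: "codex/gpt-5.5" } } }                // legacy `codex/*` ref
```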
## Plugin slots (exclusive categories)