diff --git a/docs/concepts/model-providers.md b/docs/concepts/model-providers.md
index 56513fd9032..04a6292661b 100644
--- a/docs/concepts/model-providers.md
+++ b/docs/concepts/model-providers.md
@@ -21,6 +21,9 @@ For model selection rules, see [/concepts/models](/concepts/models).
   and `openai/` plus `agents.defaults.embeddedHarness.runtime: "codex"` uses
   the native Codex app-server harness. See [OpenAI](/providers/openai) and
   [Codex harness](/plugins/codex-harness).
+- Plugin auto-enable follows that same boundary: `openai-codex/*` belongs
+  to the OpenAI plugin, while the bundled Codex app-server plugin is enabled
+  by `embeddedHarness.runtime: "codex"` or legacy `codex/*` refs.
 - GPT-5.5 is currently available through subscription/OAuth routes:
   `openai-codex/gpt-5.5` in PI or `openai/gpt-5.5` with the Codex app-server
   harness. The direct API-key route for `openai/gpt-5.5` is supported once
@@ -110,6 +113,9 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
 - PI model ref: `openai-codex/gpt-5.5`
 - Native Codex app-server harness ref: `openai/gpt-5.5` with `agents.defaults.embeddedHarness.runtime: "codex"`
 - Legacy model refs: `codex/gpt-*`
+- Plugin boundary: `openai-codex/*` loads the OpenAI plugin; the bundled Codex
+  app-server plugin is selected only by the Codex harness runtime or legacy
+  `codex/*` refs.
 - CLI: `openclaw onboard --auth-choice openai-codex` or `openclaw models auth login --provider openai-codex`
 - Default transport is `auto` (WebSocket-first, SSE fallback)
 - Override per PI model via `agents.defaults.models["openai-codex/"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
diff --git a/docs/providers/openai.md b/docs/providers/openai.md
index ce1c6b7e0d4..d75fa6de0f4 100644
--- a/docs/providers/openai.md
+++ b/docs/providers/openai.md
@@ -23,6 +23,13 @@ supported once OpenAI enables GPT-5.5 on the public API; until then use
 an API-enabled model such as `openai/gpt-5.4` for `OPENAI_API_KEY` setups.
 
+
+Enabling the OpenAI plugin, or selecting an `openai-codex/*` model, does not
+enable the bundled Codex app-server plugin. OpenClaw enables that plugin only
+when you explicitly select the native Codex harness with
+`embeddedHarness.runtime: "codex"` or use a legacy `codex/*` model ref.
+
+
 
 ## OpenClaw feature coverage
 
 | OpenAI capability | OpenClaw surface | Status |
@@ -141,6 +148,7 @@ Choose your preferred auth method and follow the setup steps.
 
 Keep using the `openai-codex` provider id for auth/profile commands. The
 `openai-codex/*` model prefix is also the explicit PI route for Codex OAuth.
+It does not select or auto-enable the bundled Codex app-server harness.
 
 ### Config example
 
diff --git a/docs/tools/plugin.md b/docs/tools/plugin.md
index 3da2fb28bb0..ef64601fbdd 100644
--- a/docs/tools/plugin.md
+++ b/docs/tools/plugin.md
@@ -189,6 +189,13 @@ OpenClaw scans for plugins in this order (first match wins):
 - Workspace-origin plugins are **disabled by default** (must be explicitly enabled)
 - Bundled plugins follow the built-in default-on set unless overridden
 - Exclusive slots can force-enable the selected plugin for that slot
+- Some bundled opt-in plugins are enabled automatically when config names a
+  plugin-owned surface, such as a provider model ref, channel config, or harness
+  runtime
+- OpenAI-family Codex routes keep separate plugin boundaries:
+  `openai-codex/*` belongs to the OpenAI plugin, while the bundled Codex
+  app-server plugin is selected by `embeddedHarness.runtime: "codex"` or legacy
+  `codex/*` model refs
 
 ## Plugin slots (exclusive categories)
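
Reviewer note: the plugin boundary these hunks document can be summarized in a single config sketch. The YAML layout below is an illustrative assumption (only the key paths `agents.defaults.models["openai-codex/"].params.transport` and `agents.defaults.embeddedHarness.runtime` come from the docs themselves); it may be worth adding something like this to the `### Config example` section of `docs/providers/openai.md`:

```yaml
# Hypothetical OpenClaw config sketch; file shape is an assumption,
# key paths are taken from the docs above.
agents:
  defaults:
    models:
      # PI route: an `openai-codex/*` ref loads the OpenAI plugin only.
      "openai-codex/":
        params:
          transport: auto   # "sse", "websocket", or "auto"
    # Only this setting (or a legacy `codex/*` model ref) enables the
    # bundled Codex app-server plugin.
    embeddedHarness:
      runtime: codex
```

With `embeddedHarness.runtime: "codex"` omitted, only the OpenAI plugin is active even though an `openai-codex/*` model is configured.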