mirror of https://github.com/openclaw/openclaw.git
synced 2026-05-06 17:00:50 +00:00
Refresh the Codex runtime docs
Refresh the Codex runtime docs and cross-link the Codex harness, OpenAI provider, agent runtime, plugin hook, ACP agent, and status pages.
@@ -46,6 +46,10 @@ That means OpenClaw selects an OpenAI model ref, then asks the Codex app-server
runtime to run the embedded agent turn. It does not mean the channel, model
provider catalog, or OpenClaw session store becomes Codex.

For the OpenAI-family prefix split, see [OpenAI](/providers/openai) and
[Model providers](/concepts/model-providers). For the Codex runtime support
contract, see [Codex harness](/plugins/codex-harness#v1-support-contract).

## Runtime ownership

Different runtimes own different amounts of the loop.
@@ -84,7 +88,9 @@ OpenClaw chooses an embedded runtime after provider and model resolution:

Explicit plugin runtimes fail closed by default. For example,
`runtime: "codex"` means Codex or a clear selection error unless you set
`fallback: "pi"` in the same override scope. A runtime override does not inherit
a broader fallback setting, so an agent-level `runtime: "codex"` is not silently
routed back to PI just because defaults used `fallback: "pi"`.
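As a sketch of that scoping rule (assuming a YAML-style config file; the exact file name, and whether `fallback` sits next to `runtime`, are assumptions based on the keys this page quotes):

```yaml
# Hypothetical config sketch — key names follow the dotted paths quoted in
# these docs; the concrete file format may differ in your install.
agents:
  defaults:
    embeddedHarness:
      runtime: "pi"
      fallback: "pi"      # applies only at the defaults scope
  myAgent:                # hypothetical agent-level override
    embeddedHarness:
      runtime: "codex"    # fails closed: Codex or a clear selection error
      # The defaults-level fallback is NOT inherited here. To allow PI
      # fallback for this agent, set `fallback: "pi"` in this same block.
```

The point of the fail-closed design is that an explicit runtime choice never silently degrades: you opt into fallback per override scope, or you get an error you can see.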
## Compatibility contract
@@ -122,6 +128,8 @@ session systems.

## Related

- [Codex harness](/plugins/codex-harness)
- [OpenAI](/providers/openai)
- [Agent harness plugins](/plugins/sdk-agent-harness)
- [Agent loop](/concepts/agent-loop)
- [Models](/concepts/models)
- [Status](/cli/status)
@@ -20,7 +20,8 @@ For model selection rules, see [/concepts/models](/concepts/models).
OpenAI API-key provider in PI, `openai-codex/<model>` uses Codex OAuth in PI,
and `openai/<model>` plus `agents.defaults.embeddedHarness.runtime: "codex"`
uses the native Codex app-server harness. See [OpenAI](/providers/openai)
and [Codex harness](/plugins/codex-harness). If the provider/runtime split is
confusing, read [Agent runtimes](/concepts/agent-runtimes) first.
- Plugin auto-enable follows that same boundary: `openai-codex/<model>` belongs
  to the OpenAI plugin, while the Codex plugin is enabled by
  `embeddedHarness.runtime: "codex"` or legacy `codex/<model>` refs.
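The three ref forms and the plugin each one activates can be summarized in one hedged config sketch (YAML layout assumed; the ref-to-plugin mapping below restates what this page says rather than documenting a verified schema):

```yaml
# Hypothetical sketch of the provider/runtime boundary.
agents:
  defaults:
    embeddedHarness:
      runtime: "codex"   # this key, not a model ref, enables the Codex plugin

# Model ref forms and what they select (per these docs):
#   openai/<model>        -> OpenAI API-key provider in PI; with
#                            embeddedHarness.runtime: "codex", the native
#                            Codex app-server harness runs the turn instead
#   openai-codex/<model>  -> Codex OAuth in PI; loads the OpenAI plugin
#   codex/<model>         -> legacy refs; also enable the Codex plugin
```

The useful mental model: the prefix picks the provider and auth path, while `embeddedHarness.runtime` picks who runs the agent loop.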
@@ -75,6 +76,8 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
- Optional rotation: `OPENAI_API_KEYS`, `OPENAI_API_KEY_1`, `OPENAI_API_KEY_2`, plus `OPENCLAW_LIVE_OPENAI_KEY` (single override)
- Example models: `openai/gpt-5.4`, `openai/gpt-5.4-mini`
- GPT-5.5 direct API support is future-ready here once OpenAI exposes GPT-5.5 on the API
- Verify direct API availability with `openclaw models list --provider openai`
  before using `openai/gpt-5.5` without the Codex app-server runtime
- CLI: `openclaw onboard --auth-choice openai-api-key`
- Default transport is `auto` (WebSocket-first, SSE fallback)
- Override per model via `agents.defaults.models["openai/<model>"].params.transport` (`"sse"`, `"websocket"`, or `"auto"`)
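The per-model transport override in the last bullet might look like this (a sketch only — the bracketed key path is taken from the bullet above, but the surrounding YAML layout and the example model name are assumptions):

```yaml
# Hypothetical per-model transport override.
agents:
  defaults:
    models:
      "openai/gpt-5.4":        # quoted because the ref contains a slash
        params:
          transport: "websocket"   # or "sse" / "auto"; the default is
                                   # "auto" (WebSocket-first, SSE fallback)
```

Pinning `"sse"` can be useful behind proxies that break WebSocket upgrades; otherwise the `"auto"` default already degrades gracefully.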
@@ -118,6 +121,7 @@ OpenClaw ships with the pi‑ai catalog. These providers require **no**
- Auth: OAuth (ChatGPT)
- PI model ref: `openai-codex/gpt-5.5`
- Native Codex app-server harness ref: `openai/gpt-5.5` with `agents.defaults.embeddedHarness.runtime: "codex"`
- Native Codex app-server harness docs: [Codex harness](/plugins/codex-harness)
- Legacy model refs: `codex/gpt-*`
- Plugin boundary: `openai-codex/*` loads the OpenAI plugin; the native Codex
  app-server plugin is selected only by the Codex harness runtime or legacy
  `codex/<model>` refs.