mirror of
https://github.com/openclaw/openclaw.git
synced 2026-05-06 21:10:43 +00:00
fix(models): separate Codex harness from model choices (#71193)
* fix: separate Codex harness from model choices
* docs: note Codex harness model choice fix
committed by GitHub
parent dcf01ce72f
commit bc0f54bd04
@@ -369,7 +369,7 @@ Time format in system prompt. Default: `auto` (OS preference).
 - For direct OpenAI Responses models, server-side compaction is enabled automatically. Use `params.responsesServerCompaction: false` to stop injecting `context_management`, or `params.responsesCompactThreshold` to override the threshold. See [OpenAI server-side compaction](/providers/openai#server-side-compaction-responses-api).
 - `params`: global default provider parameters applied to all models. Set at `agents.defaults.params` (e.g. `{ cacheRetention: "long" }`).
 - `params` merge precedence (config): `agents.defaults.params` (global base) is overridden by `agents.defaults.models["provider/model"].params` (per-model), then `agents.list[].params` (matching agent id) overrides by key. See [Prompt Caching](/reference/prompt-caching) for details.
-- `embeddedHarness`: default low-level embedded agent runtime policy. Use `runtime: "auto"` to let registered plugin harnesses claim supported models, `runtime: "pi"` to force the built-in PI harness, or a registered harness id such as `runtime: "codex"`. Set `fallback: "none"` to disable automatic PI fallback.
+- `embeddedHarness`: default low-level embedded agent runtime policy. Use `runtime: "auto"` to let registered plugin harnesses claim supported models, `runtime: "pi"` to force the built-in PI harness, or a registered harness id such as `runtime: "codex"`. Set `fallback: "none"` to disable automatic PI fallback. New Codex harness configs should keep model refs canonical as `openai/*` and select the harness here rather than using legacy `codex/*` model refs.
 - Config writers that mutate these fields (for example `/models set`, `/models set-image`, and fallback add/remove commands) save canonical object form and preserve existing fallback lists when possible.
 - `maxConcurrent`: max parallel agent runs across sessions (each session still serialized). Default: 4.
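Put together, the fields in this hunk could combine into a defaults block like the following. This is a sketch only: the overall file shape, the `openai/gpt-5.5` model key, and the comment wording are assumptions; the field names and values come from the bullets above.

```jsonc
{
  "agents": {
    "defaults": {
      // Global base params; per-model params, then agents.list[].params,
      // override these by key (see the merge-precedence bullet above).
      "params": { "cacheRetention": "long" },
      "models": {
        "openai/gpt-5.5": {
          // Per-model override: disable server-side compaction injection.
          "params": { "responsesServerCompaction": false }
        }
      },
      "embeddedHarness": {
        "runtime": "auto",  // "auto" | "pi" | registered harness id such as "codex"
        "fallback": "none"  // disable automatic PI fallback
      },
      "maxConcurrent": 4    // max parallel agent runs across sessions
    }
  }
}
```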
@@ -3,7 +3,7 @@ summary: "Run OpenClaw embedded agent turns through the bundled Codex app-server
 title: "Codex harness"
 read_when:
   - You want to use the bundled Codex app-server harness
-  - You need Codex model refs and config examples
+  - You need Codex harness config examples
   - You want to disable PI fallback for Codex-only deployments
 ---
@@ -35,7 +35,8 @@ The harness is off by default. New configs should keep OpenAI model refs
 canonical as `openai/gpt-*` and explicitly force
 `embeddedHarness.runtime: "codex"` or `OPENCLAW_AGENT_RUNTIME=codex` when they
 want native app-server execution. Legacy `codex/*` model refs still auto-select
-the harness for compatibility.
+the harness for compatibility, but they are not shown as normal model/provider
+choices.
 
 ## Pick the right model prefix
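The recommended shape from this hunk, forcing the harness in config rather than via a legacy model ref, might look like this sketch. The `agents.defaults.model` key and the model id are assumptions; the `embeddedHarness` fields and canonical `openai/*` ref are from the doc text above.

```jsonc
{
  "agents": {
    "defaults": {
      // Keep the model ref canonical; do not use a legacy codex/* ref.
      "model": "openai/gpt-5.5",
      "embeddedHarness": {
        "runtime": "codex",  // force the native Codex app-server harness
        "fallback": "none"   // Codex-only deployment: no automatic PI fallback
      }
    }
  }
}
```

Per the docs above, setting `OPENCLAW_AGENT_RUNTIME=codex` in the environment is an alternative way to force the same runtime.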
@@ -54,10 +55,11 @@ GPT-5.5 is currently subscription/OAuth-only in OpenClaw. Use
 app-server harness. Direct API-key access for `openai/gpt-5.5` is supported
 once OpenAI enables GPT-5.5 on the public API.
 
-Legacy `codex/gpt-*` refs remain accepted as compatibility aliases. New PI
-Codex OAuth configs should use `openai-codex/gpt-*`; new native app-server
-harness configs should use `openai/gpt-*` plus `embeddedHarness.runtime:
-"codex"`.
+Legacy `codex/gpt-*` refs remain accepted as compatibility aliases. Doctor
+compatibility migration rewrites legacy primary `codex/*` refs to `openai/*`
+and records the Codex harness policy separately. New PI Codex OAuth configs
+should use `openai-codex/gpt-*`; new native app-server harness configs should
+use `openai/gpt-*` plus `embeddedHarness.runtime: "codex"`.
 
 `agents.defaults.imageModel` follows the same prefix split. Use
 `openai-codex/gpt-*` when image understanding should run through the OpenAI
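The prefix split described in this hunk could be sketched as two contrasting config fragments. The `agents.defaults.model` key and the concrete `gpt-5.5` ids are assumptions for illustration; the prefixes, the `imageModel` field, and the harness selection are from the diff above.

```jsonc
// New PI Codex OAuth config: openai-codex/ prefix, PI harness.
{
  "agents": {
    "defaults": { "model": "openai-codex/gpt-5.5" }
  }
}
```

```jsonc
// New native app-server config: canonical openai/ ref plus explicit harness.
{
  "agents": {
    "defaults": {
      "model": "openai/gpt-5.5",
      "imageModel": "openai-codex/gpt-5.5", // same prefix split for image understanding
      "embeddedHarness": { "runtime": "codex" }
    }
  }
}
```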