mirror of
https://github.com/openclaw/openclaw.git
synced 2026-05-06 11:50:43 +00:00
fix(models): separate Codex harness from model choices (#71193)
* fix: separate Codex harness from model choices
* docs: note Codex harness model choice fix
committed by GitHub
parent dcf01ce72f
commit bc0f54bd04
@@ -3,7 +3,7 @@ summary: "Run OpenClaw embedded agent turns through the bundled Codex app-server
 title: "Codex harness"
 read_when:
   - You want to use the bundled Codex app-server harness
-  - You need Codex model refs and config examples
+  - You need Codex harness config examples
   - You want to disable PI fallback for Codex-only deployments
 ---
 
@@ -35,7 +35,8 @@ The harness is off by default. New configs should keep OpenAI model refs
 canonical as `openai/gpt-*` and explicitly force
 `embeddedHarness.runtime: "codex"` or `OPENCLAW_AGENT_RUNTIME=codex` when they
 want native app-server execution. Legacy `codex/*` model refs still auto-select
-the harness for compatibility.
+the harness for compatibility, but they are not shown as normal model/provider
+choices.
 
 ## Pick the right model prefix
 
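The explicit-override pattern in the hunk above can be sketched as a config fragment. This is a sketch under assumptions: only the `openai/gpt-*` prefix, `embeddedHarness.runtime: "codex"`, and `OPENCLAW_AGENT_RUNTIME=codex` appear in the diff; the surrounding key layout and the YAML format are hypothetical.

```yaml
# Hypothetical OpenClaw config sketch, not a verified file format.
# Keep the model ref canonical on the openai/ prefix...
agents:
  defaults:
    model: openai/gpt-5.5   # assumed key path; the diff itself only names imageModel
# ...and explicitly force native app-server execution:
embeddedHarness:
  runtime: "codex"
```

Per the diff, setting the environment variable `OPENCLAW_AGENT_RUNTIME=codex` is an equivalent way to force the harness.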
@@ -54,10 +55,11 @@ GPT-5.5 is currently subscription/OAuth-only in OpenClaw. Use
 app-server harness. Direct API-key access for `openai/gpt-5.5` is supported
 once OpenAI enables GPT-5.5 on the public API.
 
-Legacy `codex/gpt-*` refs remain accepted as compatibility aliases. New PI
-Codex OAuth configs should use `openai-codex/gpt-*`; new native app-server
-harness configs should use `openai/gpt-*` plus `embeddedHarness.runtime:
-"codex"`.
+Legacy `codex/gpt-*` refs remain accepted as compatibility aliases. Doctor
+compatibility migration rewrites legacy primary `codex/*` refs to `openai/*`
+and records the Codex harness policy separately. New PI Codex OAuth configs
+should use `openai-codex/gpt-*`; new native app-server harness configs should
+use `openai/gpt-*` plus `embeddedHarness.runtime: "codex"`.
 
 `agents.defaults.imageModel` follows the same prefix split. Use
 `openai-codex/gpt-*` when image understanding should run through the OpenAI
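The doctor migration behavior described in the hunk above (rewrite a legacy primary `codex/*` ref to `openai/*`, record the harness choice as separate policy) can be illustrated with a small sketch. The helper name and the policy dict shape are hypothetical; this is not OpenClaw's actual migration code.

```python
def migrate_model_ref(ref: str) -> tuple[str, dict]:
    """Illustrative sketch of the doctor-style migration from the diff:
    rewrite a legacy primary ``codex/*`` ref to ``openai/*`` and record
    the Codex harness policy separately. Hypothetical, not OpenClaw code.
    """
    legacy_prefix = "codex/"
    if ref.startswith(legacy_prefix):
        # Model ref stays canonical on the openai/ prefix...
        migrated = "openai/" + ref[len(legacy_prefix):]
        # ...while the harness choice is recorded as separate policy,
        # matching the explicit embeddedHarness.runtime override.
        policy = {"embeddedHarness": {"runtime": "codex"}}
        return migrated, policy
    # Non-legacy refs pass through unchanged, with no harness policy implied.
    return ref, {}
```

A legacy ref such as `codex/gpt-5.5` would come out as `openai/gpt-5.5` plus a separate `{"embeddedHarness": {"runtime": "codex"}}` policy record, while `openai/*` and `openai-codex/*` refs are left untouched.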