openclaw/extensions/openai/openai-codex-catalog.ts
methazoo 8a2d7f2541 fix(openai-codex): use /backend-api/codex/ base URL
OpenAI removed the /backend-api/responses alias on chatgpt.com server-side.
The OpenAI SDK appends /responses to the configured baseUrl, so OpenClaw's
current baseUrl ("https://chatgpt.com/backend-api") now resolves to
/backend-api/responses and hits a Cloudflare HTML 403 block page. The
provider's 403+HTML error classifier then surfaces this as an auth-scope
failure, triggering fruitless OAuth re-login loops for every GPT-5.4
sub-agent call.
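
For illustration, a minimal sketch of the failure mode. The constant and
function names here are assumptions for readability, not the actual OpenClaw
code:

// Sketch only: mirrors the commit's description of the bug, not the real classifier.
const LEGACY_BASE = "https://chatgpt.com/backend-api";      // old configured baseUrl
const CODEX_BASE = "https://chatgpt.com/backend-api/codex"; // fixed baseUrl

// The OpenAI SDK joins its endpoint path onto the configured baseUrl:
const legacyEndpoint = `${LEGACY_BASE}/responses`; // alias removed -> Cloudflare 403 HTML
const codexEndpoint = `${CODEX_BASE}/responses`;   // live endpoint -> 200 + SSE

// Hypothetical shape of the misfiring check: a 403 with an HTML body was
// read as an auth-scope failure, prompting a pointless OAuth re-login.
function looksLikeAuthScopeFailure(status: number, contentType: string): boolean {
  return status === 403 && contentType.includes("text/html");
}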

- Point OPENAI_CODEX_BASE_URL at https://chatgpt.com/backend-api/codex
  (both the catalog constant and the sibling local constant in the provider).
- Extend isOpenAICodexBaseUrl to accept the new /codex segment while keeping
  the legacy path recognized, so pre-existing user configs and persisted
  model metadata still round-trip through the normalizer correctly (sketched
  after this list).
- Add positive-case test coverage for the new base URL; update existing
  normalization tests whose expected canonical output now includes /codex.
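
A sketch of the widened predicate, assuming it previously matched only the
legacy base URL; the real helper in the provider may instead parse the URL or
normalize differently:

// Sketch, assuming a simple string predicate over normalized URLs.
const LEGACY_CODEX_BASE_URL = "https://chatgpt.com/backend-api";
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

export function isOpenAICodexBaseUrl(url: string): boolean {
  const trimmed = url.replace(/\/+$/, ""); // tolerate trailing slashes
  // Accept the new /codex base URL, but keep recognizing the legacy base so
  // persisted configs and model metadata still normalize correctly.
  return trimmed === OPENAI_CODEX_BASE_URL || trimmed === LEGACY_CODEX_BASE_URL;
}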

Verified with live curl using the exact OAuth access token stored by
OpenClaw: the /codex/responses path returns HTTP 200 with streaming SSE,
while the old /responses alias returns HTTP 403 HTML regardless of auth
headers. Scoped tests (base-url, openai-codex-provider, transport-policy,
openai-provider, index) pass; pnpm tsgo and pnpm build are clean.
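
The live check can be reproduced with a small script; this sketch uses fetch
rather than curl, and the request body is a placeholder since the payload
OpenClaw actually sends is not shown here:

// Sketch of the verification, assuming the endpoint accepts a POST with a
// Bearer token (Node 18+ for global fetch).
async function probe(baseUrl: string, token: string): Promise<void> {
  const res = await fetch(`${baseUrl}/responses`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({}), // placeholder payload
  });
  // Expected per the commit message: 200 + text/event-stream for the /codex
  // base, 403 + text/html for the legacy base, regardless of auth headers.
  console.log(baseUrl, res.status, res.headers.get("content-type"));
}

// e.g. await probe("https://chatgpt.com/backend-api/codex", accessToken);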
2026-04-21 03:19:58 +01:00

12 lines · 337 B · TypeScript

import type { ModelProviderConfig } from "openclaw/plugin-sdk/provider-model-shared";

// Codex backend on chatgpt.com. The trailing /codex segment matters: the
// OpenAI SDK appends /responses to this value, and the bare
// /backend-api/responses alias now returns a Cloudflare 403 HTML block page.
export const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

export function buildOpenAICodexProvider(): ModelProviderConfig {
  return {
    baseUrl: OPENAI_CODEX_BASE_URL,
    api: "openai-codex-responses",
    models: [],
  };
}