From 9606bb27d54ac12369d6adf32268c3a7a601bb2e Mon Sep 17 00:00:00 2001
From: FullerStackDev <263060202+fuller-stack-dev@users.noreply.github.com>
Date: Sat, 2 May 2026 18:22:20 -0600
Subject: [PATCH] docs: credit message latency fix contributor

---
 CHANGELOG.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 7773bebeca5..2e906e4317a 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -98,7 +98,7 @@ Docs: https://docs.openclaw.ai
 ### Fixes

-- CLI/message: skip eager model context warmup and preserve channel-declared gateway execution for Discord and Telegram message actions, avoiding Codex app-server/model discovery during simple send/read commands.
+- CLI/message: skip eager model context warmup and preserve channel-declared gateway execution for Discord and Telegram message actions, avoiding Codex app-server/model discovery during simple send/read commands. Thanks @fuller-stack-dev.
 - Codex/app-server: resolve managed binaries from bundled `dist` chunks and from the `@openai/codex` package bin when installs do not provide a nearby `.bin/codex` shim, avoiding false missing-binary startup failures.
 - Plugins/ClawHub: use the ClawHub artifact resolver response as the install decision before downloading, keeping legacy ZIP fallback and future ClawPack npm-pack installs on the same explicit resolver path. Thanks @vincentkoc.
 - Plugins/ClawHub: keep bare plugin package specs on npm for the launch cutover and reserve ClawHub resolution for explicit `clawhub:` specs until ClawHub pack readiness is deployed. Thanks @vincentkoc.