openclaw/extensions/amazon-bedrock
anirudhmarc 24266af1ce fix(amazon-bedrock): inject cache points for application inference profile ARNs (#69953)
* fix(amazon-bedrock): inject cache points for application inference profile ARNs

pi-ai's internal supportsPromptCaching checks model.id for specific
Claude model name patterns (e.g. "-4-", "claude-3-7-sonnet"). That
check fails for application inference profile ARNs, which don't
contain the model name, so prompt caching silently breaks for Bedrock
users who route requests through application inference profiles.
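
For context, the upstream check is roughly equivalent to the
following (a paraphrase built from the patterns named above, not
pi-ai's actual source; the ARN is a made-up example):

```ts
// Paraphrase of pi-ai's model-id pattern check (not the actual source).
const supportsPromptCaching = (modelId: string): boolean =>
  modelId.includes("-4-") || modelId.includes("claude-3-7-sonnet");

supportsPromptCaching("anthropic.claude-3-7-sonnet-20250219-v1:0"); // true

// An application inference profile ARN carries only an opaque profile
// id, so the pattern check returns false and caching is skipped.
supportsPromptCaching(
  "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123",
); // false
```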

Work around this by detecting when pi-ai would skip cache point
injection (via the piAiWouldInjectCachePoints mirror) and patching the
Converse API payload in onPayload to add cachePoint blocks to the
system prompt and the last user message, matching the format pi-ai
uses natively.
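
Concretely, the patch looks roughly like this (a minimal sketch: the
type names and the exact onPayload signature are illustrative, but the
cachePoint block shape is the Bedrock Converse API's):

```ts
// Shapes follow the Bedrock Converse API; type names are illustrative.
type Block = { text?: string; cachePoint?: { type: "default" } };
type Message = { role: "user" | "assistant"; content: Block[] };
type ConversePayload = { system?: Block[]; messages?: Message[] };

// Mirrors pi-ai's internal pattern check sketched above.
const piAiWouldInjectCachePoints = (modelId: string): boolean =>
  modelId.includes("-4-") || modelId.includes("claude-3-7-sonnet");

const hasCachePoint = (blocks: Block[] = []): boolean =>
  blocks.some((b) => b.cachePoint !== undefined);

function onPayload(payload: ConversePayload, modelId: string): ConversePayload {
  // If pi-ai recognizes the model id, it injects cache points itself,
  // so this workaround is a no-op (and stays one once #2925 is fixed).
  if (piAiWouldInjectCachePoints(modelId)) return payload;

  // Mark the system prompt cacheable, unless a point is already there.
  if (payload.system && !hasCachePoint(payload.system)) {
    payload.system.push({ cachePoint: { type: "default" } });
  }

  // Likewise for the last user message's content blocks.
  const lastUser = payload.messages?.filter((m) => m.role === "user").at(-1);
  if (lastUser && !hasCachePoint(lastUser.content)) {
    lastUser.content.push({ cachePoint: { type: "default" } });
  }
  return payload;
}
```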

The fix is safe (the retention guard is sketched after this list):
- Checks for existing cache points to avoid double-injection
- Respects cacheRetention: "none"
- Defaults to "short" retention (matching pi-ai default)
- Becomes a no-op once upstream pi-mono#2925 is fixed
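
The retention guard reduces to a single check (the cacheRetention
option name is taken from the description above; "short" is pi-ai's
default):

```ts
// "none" opts out entirely; anything else falls back to pi-ai's
// default of "short" retention.
const shouldInjectCachePoints = (cacheRetention?: string): boolean =>
  (cacheRetention ?? "short") !== "none";
```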

Fixes #19279
Upstream: https://github.com/badlogic/pi-mono/issues/2925

* fix(amazon-bedrock): tighten app-profile cache injection

---------

Co-authored-by: Your Name <you@example.com>
Co-authored-by: Vincent Koc <vincentkoc@ieee.org>
2026-04-22 12:19:29 -07:00