docs: remove legacy cache retention notes

Peter Steinberger
2026-04-04 15:26:08 +09:00
parent cb6d0576be
commit 195e380e05
2 changed files with 5 additions and 26 deletions


@@ -95,11 +95,11 @@ OpenClaw supports Anthropic's prompt caching feature. This is **API-only**; subs
Use the `cacheRetention` parameter in your model config:
-| Value   | Cache Duration | Description                         |
-| ------- | -------------- | ----------------------------------- |
-| `none`  | No caching     | Disable prompt caching              |
-| `short` | 5 minutes      | Default for API Key auth            |
-| `long`  | 1 hour         | Extended cache (requires beta flag) |
+| Value   | Cache Duration | Description              |
+| ------- | -------------- | ------------------------ |
+| `none`  | No caching     | Disable prompt caching   |
+| `short` | 5 minutes      | Default for API Key auth |
+| `long`  | 1 hour         | Extended cache           |
```json5
{
@@ -155,18 +155,6 @@ This lets one agent keep a long-lived cache while another agent on the same mode
- Non-Anthropic Bedrock models are forced to `cacheRetention: "none"` at runtime.
- Anthropic API-key smart defaults also seed `cacheRetention: "short"` for Claude-on-Bedrock model refs when no explicit value is set.
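A minimal sketch of the Bedrock behavior described above (the model refs are hypothetical examples, not names from this commit):

```json5
{
  agents: {
    defaults: {
      models: {
        // Claude-on-Bedrock ref: smart defaults seed cacheRetention: "short"
        // when no explicit value is set
        "bedrock/anthropic.claude-sonnet-4": { params: {} },
        // Non-Anthropic Bedrock model: forced to cacheRetention: "none"
        // at runtime, even if configured otherwise
        "bedrock/meta.llama3-70b": { params: { cacheRetention: "long" } }
      }
    }
  }
}
```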
-### Legacy parameter
-The older `cacheControlTtl` parameter is still supported for backwards compatibility:
-- `"5m"` maps to `short`
-- `"1h"` maps to `long`
-We recommend migrating to the new `cacheRetention` parameter.
-OpenClaw includes the `extended-cache-ttl-2025-04-11` beta flag for Anthropic API
-requests; keep it if you override provider headers (see [/gateway/configuration](/gateway/configuration)).
## 1M context window (Anthropic beta)
Anthropic's 1M context window is beta-gated. In OpenClaw, enable it per model


@@ -62,15 +62,6 @@ Config merge order:
2. `agents.defaults.models["provider/model"].params` (per-model override)
3. `agents.list[].params` (matching agent id; overrides by key)
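A sketch of merge steps 2 and 3 from the list above (the agent id and model ref are hypothetical):

```json5
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-sonnet-4": {
          params: { cacheRetention: "long" } // 2. per-model override
        }
      }
    },
    list: [
      // 3. matching agent id overrides by key, so this agent
      //    ends up with cacheRetention: "short"
      { id: "researcher", params: { cacheRetention: "short" } }
    ]
  }
}
```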
-### Legacy `cacheControlTtl`
-Legacy values are still accepted and mapped:
-- `5m` -> `short`
-- `1h` -> `long`
-Prefer `cacheRetention` for new config.
### `contextPruning.mode: "cache-ttl"`
Prunes old tool-result context after cache TTL windows so post-idle requests do not re-cache oversized history.
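A hedged sketch of enabling this mode, assuming `contextPruning` sits under `params` alongside `cacheRetention` (this hunk does not show its exact location or sibling fields):

```json5
{
  agents: {
    defaults: {
      params: {
        // Prune old tool-result context once the cache TTL window lapses,
        // so post-idle requests do not re-cache oversized history
        contextPruning: { mode: "cache-ttl" }
      }
    }
  }
}
```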