mirror of https://github.com/openclaw/openclaw.git
synced 2026-03-12 07:20:45 +00:00

Session/Cron maintenance hardening and cleanup UX (#24753)

Merged via /review-pr -> /prepare-pr -> /merge-pr.
Prepared head SHA: 7533b85156

Co-authored-by: gumadeiras <5599352+gumadeiras@users.noreply.github.com>
Co-authored-by: shakkernerd <165377636+shakkernerd@users.noreply.github.com>
Reviewed-by: @shakkernerd

Committed by GitHub. Parent: 29b19455e3. Commit: eff3c5c707.
@@ -11,6 +11,7 @@ Docs: https://docs.openclaw.ai

- Providers/Vercel AI Gateway: accept Claude shorthand model refs (`vercel-ai-gateway/claude-*`) by normalizing to canonical Anthropic-routed model ids. (#23985) Thanks @sallyom, @markbooch, and @vincentkoc.
- Docs/Prompt caching: add a dedicated prompt-caching reference covering `cacheRetention`, per-agent `params` merge precedence, Bedrock/OpenRouter behavior, and cache-ttl + heartbeat tuning. Thanks @svenssonaxel.
- Gateway/HTTP security headers: add optional `gateway.http.securityHeaders.strictTransportSecurity` support to emit `Strict-Transport-Security` for direct HTTPS deployments, with runtime wiring, validation, tests, and hardening docs.
- Sessions/Cron: harden session maintenance with `openclaw sessions cleanup`, per-agent store targeting, disk-budget controls (`session.maintenance.maxDiskBytes` / `highWaterBytes`), and safer transcript/archive cleanup + run-log retention behavior. (#24753) Thanks @gumadeiras.

### Breaking

@@ -349,7 +349,8 @@ Notes:

## Storage & history

- Job store: `~/.openclaw/cron/jobs.json` (Gateway-managed JSON).
- Run history: `~/.openclaw/cron/runs/<jobId>.jsonl` (JSONL, auto-pruned by size and line count).
- Isolated cron run sessions in `sessions.json` are pruned by `cron.sessionRetention` (default `24h`; set `false` to disable).
- Override store path: `cron.store` in config.

## Configuration

@@ -362,10 +363,21 @@ Notes:
    maxConcurrentRuns: 1, // default 1
    webhook: "https://example.invalid/legacy", // deprecated fallback for stored notify:true jobs
    webhookToken: "replace-with-dedicated-webhook-token", // optional bearer token for webhook mode
    sessionRetention: "24h", // duration string or false
    runLog: {
      maxBytes: "2mb", // default 2_000_000 bytes
      keepLines: 2000, // default 2000
    },
  },
}
```

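Size values like `"2mb"` above are human-readable byte strings. A minimal parser sketch follows; the decimal multipliers and the `parseByteSize` name are assumptions for illustration, not OpenClaw's actual implementation:

```typescript
// Minimal sketch of parsing human-readable byte sizes like "2mb" or "500kb".
// Decimal (power-of-ten) multipliers are an assumption; the real parser may differ.
const BYTE_UNITS: Record<string, number> = { b: 1, kb: 1_000, mb: 1_000_000, gb: 1_000_000_000 };

function parseByteSize(raw: string): number {
  const m = /^(\d+(?:\.\d+)?)(b|kb|mb|gb)?$/.exec(raw.trim().toLowerCase());
  if (!m) throw new Error(`invalid size: ${raw}`);
  // Bare numbers are treated as bytes.
  return Math.round(Number(m[1]) * BYTE_UNITS[m[2] ?? "b"]);
}
```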
Run-log pruning behavior:

- `cron.runLog.maxBytes`: max run-log file size before pruning.
- `cron.runLog.keepLines`: when pruning, keep only the newest N lines.
- Both apply to `cron/runs/<jobId>.jsonl` files.

Webhook behavior:

- Preferred: set `delivery.mode: "webhook"` with `delivery.to: "https://..."` per job.

@@ -380,6 +392,85 @@ Disable cron entirely:

- `cron.enabled: false` (config)
- `OPENCLAW_SKIP_CRON=1` (env)

## Maintenance

Cron has two built-in maintenance paths: isolated run-session retention and run-log pruning.

### Defaults

- `cron.sessionRetention`: `24h` (set `false` to disable run-session pruning)
- `cron.runLog.maxBytes`: `2_000_000` bytes
- `cron.runLog.keepLines`: `2000`

### How it works

- Isolated runs create session entries (`...:cron:<jobId>:run:<uuid>`) and transcript files.
- The reaper removes expired run-session entries older than `cron.sessionRetention`.
- When a removed run session is no longer referenced by the session store, OpenClaw archives its transcript files and purges old deleted archives on the same retention window.
- After each run append, `cron/runs/<jobId>.jsonl` is size-checked:
  - if the file size exceeds `runLog.maxBytes`, it is trimmed to the newest `runLog.keepLines` lines.

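The append-then-trim step above can be sketched as follows; `appendRunLog` and its signature are illustrative assumptions, not OpenClaw's actual internals:

```typescript
// Illustrative sketch of the run-log append + prune step described above.
// The function name and signature are assumptions, not OpenClaw's real API.
import { appendFileSync, readFileSync, statSync, writeFileSync } from "node:fs";

function appendRunLog(path: string, entry: object, maxBytes: number, keepLines: number): void {
  appendFileSync(path, JSON.stringify(entry) + "\n");
  if (statSync(path).size <= maxBytes) return; // under budget: nothing to prune
  // Over budget: keep only the newest `keepLines` lines.
  const lines = readFileSync(path, "utf8").split("\n").filter((line) => line.length > 0);
  writeFileSync(path, lines.slice(-keepLines).join("\n") + "\n");
}
```

Note the trim is triggered by size but bounded by line count, matching the two knobs above.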
### Performance caveat for high-volume schedulers

High-frequency cron setups can generate large run-session and run-log footprints. Maintenance is built in, but loose limits can still create avoidable IO and cleanup work.

What to watch:

- long `cron.sessionRetention` windows with many isolated runs
- high `cron.runLog.keepLines` combined with large `runLog.maxBytes`
- many noisy recurring jobs writing to the same `cron/runs/<jobId>.jsonl`

What to do:

- keep `cron.sessionRetention` as short as your debugging/audit needs allow
- keep run logs bounded with moderate `runLog.maxBytes` and `runLog.keepLines`
- move noisy background jobs to isolated mode with delivery rules that avoid unnecessary chatter
- review growth periodically with `openclaw cron runs` and adjust retention before logs become large

### Customize examples

Keep run sessions for a week and allow bigger run logs:

```json5
{
  cron: {
    sessionRetention: "7d",
    runLog: {
      maxBytes: "10mb",
      keepLines: 5000,
    },
  },
}
```

Disable isolated run-session pruning but keep run-log pruning:

```json5
{
  cron: {
    sessionRetention: false,
    runLog: {
      maxBytes: "5mb",
      keepLines: 3000,
    },
  },
}
```

Tune for high-volume cron usage (example):

```json5
{
  cron: {
    sessionRetention: "12h",
    runLog: {
      maxBytes: "3mb",
      keepLines: 1500,
    },
  },
}
```

## CLI quickstart

One-shot reminder (UTC ISO, auto-delete after success):

@@ -23,6 +23,11 @@ Note: one-shot (`--at`) jobs delete after success by default. Use `--keep-after-

Note: recurring jobs now use exponential retry backoff after consecutive errors (30s → 1m → 5m → 15m → 60m), then return to normal schedule after the next successful run.

Note: retention/pruning is controlled in config:

- `cron.sessionRetention` (default `24h`) prunes completed isolated run sessions.
- `cron.runLog.maxBytes` + `cron.runLog.keepLines` prune `~/.openclaw/cron/runs/<jobId>.jsonl`.

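The documented backoff ladder (30s → 1m → 5m → 15m → 60m) can be sketched as a simple lookup; `retryDelayMs` is an illustrative helper name, not OpenClaw's internal API:

```typescript
// Sketch of the documented retry-backoff ladder for recurring jobs.
// The helper name is hypothetical; only the delay values come from the docs.
const BACKOFF_MS = [30_000, 60_000, 300_000, 900_000, 3_600_000];

function retryDelayMs(consecutiveErrors: number): number {
  // First error → 30s; further errors walk up the ladder and cap at 60m.
  const idx = Math.min(consecutiveErrors - 1, BACKOFF_MS.length - 1);
  return BACKOFF_MS[Math.max(idx, 0)];
}
```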
## Common edits

Update delivery settings without changing the message:

@@ -27,6 +27,7 @@ Notes:

- Interactive prompts (like keychain/OAuth fixes) only run when stdin is a TTY and `--non-interactive` is **not** set. Headless runs (cron, Telegram, no terminal) will skip prompts.
- `--fix` (alias for `--repair`) writes a backup to `~/.openclaw/openclaw.json.bak` and drops unknown config keys, listing each removal.
- State integrity checks now detect orphan transcript files in the sessions directory and can archive them as `.deleted.<timestamp>` to reclaim space safely.

## macOS: `launchctl` env overrides

@@ -11,6 +11,94 @@ List stored conversation sessions.

```bash
openclaw sessions
openclaw sessions --agent work
openclaw sessions --all-agents
openclaw sessions --active 120
openclaw sessions --json
```

Scope selection:

- default: configured default agent store
- `--agent <id>`: one configured agent store
- `--all-agents`: aggregate all configured agent stores
- `--store <path>`: explicit store path (cannot be combined with `--agent` or `--all-agents`)

JSON examples:

`openclaw sessions --all-agents --json`:

```json
{
  "path": null,
  "stores": [
    { "agentId": "main", "path": "/home/user/.openclaw/agents/main/sessions/sessions.json" },
    { "agentId": "work", "path": "/home/user/.openclaw/agents/work/sessions/sessions.json" }
  ],
  "allAgents": true,
  "count": 2,
  "activeMinutes": null,
  "sessions": [
    { "agentId": "main", "key": "agent:main:main", "model": "gpt-5" },
    { "agentId": "work", "key": "agent:work:main", "model": "claude-opus-4-5" }
  ]
}
```

## Cleanup maintenance

Run maintenance now (instead of waiting for the next write cycle):

```bash
openclaw sessions cleanup --dry-run
openclaw sessions cleanup --agent work --dry-run
openclaw sessions cleanup --all-agents --dry-run
openclaw sessions cleanup --enforce
openclaw sessions cleanup --enforce --active-key "agent:main:telegram:dm:123"
openclaw sessions cleanup --json
```

`openclaw sessions cleanup` uses `session.maintenance` settings from config:

- Scope note: `openclaw sessions cleanup` maintains session stores/transcripts only. It does not prune cron run logs (`cron/runs/<jobId>.jsonl`), which are managed by `cron.runLog.maxBytes` and `cron.runLog.keepLines` in [Cron configuration](/automation/cron-jobs#configuration) and explained in [Cron maintenance](/automation/cron-jobs#maintenance).
- `--dry-run`: preview how many entries would be pruned/capped without writing.
  - In text mode, dry-run prints a per-session action table (`Action`, `Key`, `Age`, `Model`, `Flags`) so you can see what would be kept vs removed.
- `--enforce`: apply maintenance even when `session.maintenance.mode` is `warn`.
- `--active-key <key>`: protect a specific active key from disk-budget eviction.
- `--agent <id>`: run cleanup for one configured agent store.
- `--all-agents`: run cleanup for all configured agent stores.
- `--store <path>`: run against a specific `sessions.json` file.
- `--json`: print a JSON summary. With `--all-agents`, output includes one summary per store.

`openclaw sessions cleanup --all-agents --dry-run --json`:

```json
{
  "allAgents": true,
  "mode": "warn",
  "dryRun": true,
  "stores": [
    {
      "agentId": "main",
      "storePath": "/home/user/.openclaw/agents/main/sessions/sessions.json",
      "beforeCount": 120,
      "afterCount": 80,
      "pruned": 40,
      "capped": 0
    },
    {
      "agentId": "work",
      "storePath": "/home/user/.openclaw/agents/work/sessions/sessions.json",
      "beforeCount": 18,
      "afterCount": 18,
      "pruned": 0,
      "capped": 0
    }
  ]
}
```

Related:

- Session config: [Configuration reference](/gateway/configuration-reference#session)

@@ -71,6 +71,109 @@ All session state is **owned by the gateway** (the “master” OpenClaw). UI cl

- Session entries include `origin` metadata (label + routing hints) so UIs can explain where a session came from.
- OpenClaw does **not** read legacy Pi/Tau session folders.

## Maintenance

OpenClaw applies session-store maintenance to keep `sessions.json` and transcript artifacts bounded over time.

### Defaults

- `session.maintenance.mode`: `warn`
- `session.maintenance.pruneAfter`: `30d`
- `session.maintenance.maxEntries`: `500`
- `session.maintenance.rotateBytes`: `10mb`
- `session.maintenance.resetArchiveRetention`: defaults to `pruneAfter` (`30d`)
- `session.maintenance.maxDiskBytes`: unset (disabled)
- `session.maintenance.highWaterBytes`: defaults to `80%` of `maxDiskBytes` when budgeting is enabled

### How it works

Maintenance runs during session-store writes, and you can trigger it on demand with `openclaw sessions cleanup`.

- `mode: "warn"`: reports what would be evicted but does not mutate entries/transcripts.
- `mode: "enforce"`: applies cleanup in this order:
  1. prune stale entries older than `pruneAfter`
  2. cap entry count to `maxEntries` (oldest first)
  3. archive transcript files for removed entries that are no longer referenced
  4. purge old `*.deleted.<timestamp>` and `*.reset.<timestamp>` archives by retention policy
  5. rotate `sessions.json` when it exceeds `rotateBytes`
  6. if `maxDiskBytes` is set, enforce disk budget toward `highWaterBytes` (oldest artifacts first, then oldest sessions)

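The first two enforce-mode steps (prune by age, then cap by count) can be sketched as a pure function; the `Entry` shape and `enforceMaintenance` name are hypothetical, and the file-level steps 3-6 are omitted:

```typescript
// Illustrative sketch of enforce-mode steps 1-2 described above.
// Types and function names are assumptions, not OpenClaw internals.
type Entry = { key: string; updatedAt: number };

function enforceMaintenance(
  entries: Entry[],
  now: number,
  pruneAfterMs: number,
  maxEntries: number,
): Entry[] {
  // 1. prune stale entries older than pruneAfter
  let kept = entries.filter((e) => now - e.updatedAt <= pruneAfterMs);
  // 2. cap entry count to maxEntries, evicting oldest first
  if (kept.length > maxEntries) {
    kept = [...kept].sort((a, b) => b.updatedAt - a.updatedAt).slice(0, maxEntries);
  }
  // Steps 3-6 (archive, purge, rotate, disk budget) operate on files and are omitted here.
  return kept;
}
```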
### Performance caveat for large stores

Large session stores are common in high-volume setups. Maintenance work is write-path work, so very large stores can increase write latency.

What increases cost most:

- very high `session.maintenance.maxEntries` values
- long `pruneAfter` windows that keep stale entries around
- many transcript/archive artifacts in `~/.openclaw/agents/<agentId>/sessions/`
- enabling disk budgets (`maxDiskBytes`) without reasonable pruning/cap limits

What to do:

- use `mode: "enforce"` in production so growth is bounded automatically
- set both time and count limits (`pruneAfter` + `maxEntries`), not just one
- set `maxDiskBytes` + `highWaterBytes` for hard upper bounds in large deployments
- keep `highWaterBytes` meaningfully below `maxDiskBytes` (default is 80%)
- run `openclaw sessions cleanup --dry-run --json` after config changes to verify projected impact before enforcing
- for frequent active sessions, pass `--active-key` when running manual cleanup

### Customize examples

Use a conservative enforce policy:

```json5
{
  session: {
    maintenance: {
      mode: "enforce",
      pruneAfter: "45d",
      maxEntries: 800,
      rotateBytes: "20mb",
      resetArchiveRetention: "14d",
    },
  },
}
```

Enable a hard disk budget for the sessions directory:

```json5
{
  session: {
    maintenance: {
      mode: "enforce",
      maxDiskBytes: "1gb",
      highWaterBytes: "800mb",
    },
  },
}
```

Tune for larger installs (example):

```json5
{
  session: {
    maintenance: {
      mode: "enforce",
      pruneAfter: "14d",
      maxEntries: 2000,
      rotateBytes: "25mb",
      maxDiskBytes: "2gb",
      highWaterBytes: "1.6gb",
    },
  },
}
```

Preview or force maintenance from CLI:

```bash
openclaw sessions cleanup --dry-run
openclaw sessions cleanup --enforce
```

## Session pruning

OpenClaw trims **old tool results** from the in-memory context right before LLM calls by default.

@@ -169,6 +169,9 @@ Save to `~/.openclaw/openclaw.json` and you can DM the bot from that number.
    pruneAfter: "30d",
    maxEntries: 500,
    rotateBytes: "10mb",
    resetArchiveRetention: "30d", // duration or false
    maxDiskBytes: "500mb", // optional
    highWaterBytes: "400mb", // optional (defaults to 80% of maxDiskBytes)
  },
  typingIntervalSeconds: 5,
  sendPolicy: {
@@ -355,6 +358,10 @@ Save to `~/.openclaw/openclaw.json` and you can DM the bot from that number.
  store: "~/.openclaw/cron/cron.json",
  maxConcurrentRuns: 2,
  sessionRetention: "24h",
  runLog: {
    maxBytes: "2mb",
    keepLines: 2000,
  },
},

// Webhooks

@@ -1246,6 +1246,9 @@ See [Multi-Agent Sandbox & Tools](/tools/multi-agent-sandbox-tools) for preceden
    pruneAfter: "30d",
    maxEntries: 500,
    rotateBytes: "10mb",
    resetArchiveRetention: "30d", // duration or false
    maxDiskBytes: "500mb", // optional hard budget
    highWaterBytes: "400mb", // optional cleanup target
  },
  threadBindings: {
    enabled: true,
@@ -1273,7 +1276,14 @@ See [Multi-Agent Sandbox & Tools](/tools/multi-agent-sandbox-tools) for preceden

- **`resetByType`**: per-type overrides (`direct`, `group`, `thread`). Legacy `dm` accepted as alias for `direct`.
- **`mainKey`**: legacy field. Runtime now always uses `"main"` for the main direct-chat bucket.
- **`sendPolicy`**: match by `channel`, `chatType` (`direct|group|channel`, with legacy `dm` alias), `keyPrefix`, or `rawKeyPrefix`. First deny wins.
- **`maintenance`**: session-store cleanup + retention controls.
  - `mode`: `warn` emits warnings only; `enforce` applies cleanup.
  - `pruneAfter`: age cutoff for stale entries (default `30d`).
  - `maxEntries`: maximum number of entries in `sessions.json` (default `500`).
  - `rotateBytes`: rotate `sessions.json` when it exceeds this size (default `10mb`).
  - `resetArchiveRetention`: retention for `*.reset.<timestamp>` transcript archives. Defaults to `pruneAfter`; set `false` to disable.
  - `maxDiskBytes`: optional sessions-directory disk budget. In `warn` mode it logs warnings; in `enforce` mode it removes oldest artifacts/sessions first.
  - `highWaterBytes`: optional target after budget cleanup. Defaults to `80%` of `maxDiskBytes`.
- **`threadBindings`**: global defaults for thread-bound session features.
  - `enabled`: master default switch (providers can override; Discord uses `channels.discord.threadBindings.enabled`)
  - `ttlHours`: default auto-unfocus TTL in hours (`0` disables; providers can override)

@@ -2459,11 +2469,17 @@ Current builds no longer include the TCP bridge. Nodes connect over the Gateway
    webhook: "https://example.invalid/legacy", // deprecated fallback for stored notify:true jobs
    webhookToken: "replace-with-dedicated-token", // optional bearer token for outbound webhook auth
    sessionRetention: "24h", // duration string or false
    runLog: {
      maxBytes: "2mb", // default 2_000_000 bytes
      keepLines: 2000, // default 2000
    },
  },
}
```

- `sessionRetention`: how long to keep completed isolated cron run sessions before pruning from `sessions.json`. Also controls cleanup of archived deleted cron transcripts. Default: `24h`; set `false` to disable.
- `runLog.maxBytes`: max size per run log file (`cron/runs/<jobId>.jsonl`) before pruning. Default: `2_000_000` bytes.
- `runLog.keepLines`: newest lines retained when run-log pruning is triggered. Default: `2000`.
- `webhookToken`: bearer token used for cron webhook POST delivery (`delivery.mode = "webhook"`); if omitted, no auth header is sent.
- `webhook`: deprecated legacy fallback webhook URL (http/https), used only for stored jobs that still have `notify: true`.

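The bearer-token behavior above (header only when `webhookToken` is set) can be sketched as a small header builder; the function name is hypothetical, not OpenClaw's actual code:

```typescript
// Hedged sketch of the webhook auth-header rule described above:
// a Bearer header is attached only when a token is configured.
function buildWebhookHeaders(token?: string): Record<string, string> {
  const headers: Record<string, string> = { "content-type": "application/json" };
  if (token) headers.authorization = `Bearer ${token}`; // omitted token → no auth header
  return headers;
}
```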
@@ -251,11 +251,17 @@ When validation fails:
  enabled: true,
  maxConcurrentRuns: 2,
  sessionRetention: "24h",
  runLog: {
    maxBytes: "2mb",
    keepLines: 2000,
  },
},
}
```

- `sessionRetention`: prune completed isolated run sessions from `sessions.json` (default `24h`; set `false` to disable).
- `runLog`: prune `cron/runs/<jobId>.jsonl` by size and retained lines.
- See [Cron jobs](/automation/cron-jobs) for feature overview and CLI examples.

</Accordion>

@@ -65,6 +65,44 @@ OpenClaw resolves these via `src/config/sessions.ts`.

---

## Store maintenance and disk controls

Session persistence has automatic maintenance controls (`session.maintenance`) for `sessions.json` and transcript artifacts:

- `mode`: `warn` (default) or `enforce`
- `pruneAfter`: stale-entry age cutoff (default `30d`)
- `maxEntries`: cap entries in `sessions.json` (default `500`)
- `rotateBytes`: rotate `sessions.json` when oversized (default `10mb`)
- `resetArchiveRetention`: retention for `*.reset.<timestamp>` transcript archives (default: same as `pruneAfter`; `false` disables cleanup)
- `maxDiskBytes`: optional sessions-directory budget
- `highWaterBytes`: optional target after cleanup (default `80%` of `maxDiskBytes`)

Enforcement order for disk budget cleanup (`mode: "enforce"`):

1. Remove oldest archived or orphan transcript artifacts first.
2. If still above the target, evict oldest session entries and their transcript files.
3. Keep going until usage is at or below `highWaterBytes`.

In `mode: "warn"`, OpenClaw reports potential evictions but does not mutate the store/files.

Run maintenance on demand:

```bash
openclaw sessions cleanup --dry-run
openclaw sessions cleanup --enforce
```

---

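The oldest-first budget loop above can be sketched as a pure planning function; the `Artifact` shape and `planEvictions` name are illustrative assumptions:

```typescript
// Hedged sketch of the disk-budget loop described above: evict oldest
// artifacts until usage is at or below the high-water target.
// Names and types are illustrative, not OpenClaw internals.
type Artifact = { path: string; bytes: number; mtime: number };

function planEvictions(artifacts: Artifact[], totalBytes: number, highWaterBytes: number): string[] {
  const evict: string[] = [];
  let usage = totalBytes;
  for (const a of [...artifacts].sort((x, y) => x.mtime - y.mtime)) {
    if (usage <= highWaterBytes) break; // at or below target: stop
    evict.push(a.path);
    usage -= a.bytes;
  }
  return evict;
}
```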
## Cron sessions and run logs

Isolated cron runs also create session entries/transcripts, and they have dedicated retention controls:

- `cron.sessionRetention` (default `24h`) prunes old isolated cron run sessions from the session store (`false` disables).
- `cron.runLog.maxBytes` + `cron.runLog.keepLines` prune `~/.openclaw/cron/runs/<jobId>.jsonl` files (defaults: `2_000_000` bytes and `2000` lines).

---

## Session keys (`sessionKey`)

A `sessionKey` identifies _which conversation bucket_ you’re in (routing + isolation).

@@ -100,9 +100,16 @@ describe("parseDurationMs", () => {
      ["parses hours suffix", "2h", 7_200_000],
      ["parses days suffix", "2d", 172_800_000],
      ["supports decimals", "0.5s", 500],
      ["parses composite hours+minutes", "1h30m", 5_400_000],
      ["parses composite with milliseconds", "2m500ms", 120_500],
    ] as const;
    for (const [name, input, expected] of cases) {
      expect(parseDurationMs(input), name).toBe(expected);
    }
  });

  it("rejects invalid composite strings", () => {
    expect(() => parseDurationMs("1h30")).toThrow();
    expect(() => parseDurationMs("1h-30m")).toThrow();
  });
});

@@ -2,6 +2,14 @@ export type DurationMsParseOptions = {
  defaultUnit?: "ms" | "s" | "m" | "h" | "d";
};

const DURATION_MULTIPLIERS: Record<string, number> = {
  ms: 1,
  s: 1000,
  m: 60_000,
  h: 3_600_000,
  d: 86_400_000,
};

export function parseDurationMs(raw: string, opts?: DurationMsParseOptions): number {
  const trimmed = String(raw ?? "")
    .trim()
@@ -10,28 +18,51 @@ export function parseDurationMs(raw: string, opts?: DurationMsParseOptions): num
    throw new Error("invalid duration (empty)");
  }

  // Fast path for a single token (supports default unit for bare numbers).
  const single = /^(\d+(?:\.\d+)?)(ms|s|m|h|d)?$/.exec(trimmed);
  if (single) {
    const value = Number(single[1]);
    if (!Number.isFinite(value) || value < 0) {
      throw new Error(`invalid duration: ${raw}`);
    }
    const unit = (single[2] ?? opts?.defaultUnit ?? "ms") as "ms" | "s" | "m" | "h" | "d";
    const ms = Math.round(value * DURATION_MULTIPLIERS[unit]);
    if (!Number.isFinite(ms)) {
      throw new Error(`invalid duration: ${raw}`);
    }
    return ms;
  }

  // Composite form (e.g. "1h30m", "2m500ms"); each token must include a unit.
  let totalMs = 0;
  let consumed = 0;
  const tokenRe = /(\d+(?:\.\d+)?)(ms|s|m|h|d)/g;
  for (const match of trimmed.matchAll(tokenRe)) {
    const [full, valueRaw, unitRaw] = match;
    const index = match.index ?? -1;
    if (!full || !valueRaw || !unitRaw || index < 0) {
      throw new Error(`invalid duration: ${raw}`);
    }
    if (index !== consumed) {
      throw new Error(`invalid duration: ${raw}`);
    }
    const value = Number(valueRaw);
    if (!Number.isFinite(value) || value < 0) {
      throw new Error(`invalid duration: ${raw}`);
    }
    const multiplier = DURATION_MULTIPLIERS[unitRaw];
    if (!multiplier) {
      throw new Error(`invalid duration: ${raw}`);
    }
    totalMs += value * multiplier;
    consumed += full.length;
  }

  if (consumed !== trimmed.length || consumed === 0) {
    throw new Error(`invalid duration: ${raw}`);
  }

  const ms = Math.round(totalMs);
  if (!Number.isFinite(ms)) {
    throw new Error(`invalid duration: ${raw}`);
  }

@@ -68,6 +68,7 @@ describe("command-registry", () => {
    expect(names).toContain("memory");
    expect(names).toContain("agents");
    expect(names).toContain("browser");
    expect(names).toContain("sessions");
    expect(names).not.toContain("agent");
    expect(names).not.toContain("status");
    expect(names).not.toContain("doctor");

@@ -181,7 +181,7 @@ const coreEntries: CoreCliEntry[] = [
    {
      name: "sessions",
      description: "List stored conversation sessions",
      hasSubcommands: true,
    },
  ],
  register: async ({ program }) => {

@@ -4,6 +4,7 @@ import { beforeAll, beforeEach, describe, expect, it, vi } from "vitest";
const statusCommand = vi.fn();
const healthCommand = vi.fn();
const sessionsCommand = vi.fn();
const sessionsCleanupCommand = vi.fn();
const setVerbose = vi.fn();

const runtime = {
@@ -24,6 +25,10 @@ vi.mock("../../commands/sessions.js", () => ({
  sessionsCommand,
}));

vi.mock("../../commands/sessions-cleanup.js", () => ({
  sessionsCleanupCommand,
}));

vi.mock("../../globals.js", () => ({
  setVerbose,
}));
@@ -50,6 +55,7 @@ describe("registerStatusHealthSessionsCommands", () => {
    statusCommand.mockResolvedValue(undefined);
    healthCommand.mockResolvedValue(undefined);
    sessionsCommand.mockResolvedValue(undefined);
    sessionsCleanupCommand.mockResolvedValue(undefined);
  });

  it("runs status command with timeout and debug-derived verbose", async () => {
@@ -133,4 +139,65 @@ describe("registerStatusHealthSessionsCommands", () => {
      runtime,
    );
  });

  it("runs sessions command with --agent forwarding", async () => {
    await runCli(["sessions", "--agent", "work"]);

    expect(sessionsCommand).toHaveBeenCalledWith(
      expect.objectContaining({
        agent: "work",
        allAgents: false,
      }),
      runtime,
    );
  });

  it("runs sessions command with --all-agents forwarding", async () => {
    await runCli(["sessions", "--all-agents"]);

    expect(sessionsCommand).toHaveBeenCalledWith(
      expect.objectContaining({
        allAgents: true,
      }),
      runtime,
    );
  });

  it("runs sessions cleanup subcommand with forwarded options", async () => {
    await runCli([
      "sessions",
      "cleanup",
      "--store",
      "/tmp/sessions.json",
      "--dry-run",
      "--enforce",
      "--active-key",
      "agent:main:main",
      "--json",
    ]);

    expect(sessionsCleanupCommand).toHaveBeenCalledWith(
      expect.objectContaining({
        store: "/tmp/sessions.json",
        agent: undefined,
        allAgents: false,
        dryRun: true,
        enforce: true,
        activeKey: "agent:main:main",
        json: true,
      }),
      runtime,
    );
  });

  it("forwards parent-level all-agents to cleanup subcommand", async () => {
    await runCli(["sessions", "--all-agents", "cleanup", "--dry-run"]);

    expect(sessionsCleanupCommand).toHaveBeenCalledWith(
      expect.objectContaining({
        allAgents: true,
      }),
      runtime,
    );
  });
});

@@ -1,5 +1,6 @@
import type { Command } from "commander";
import { healthCommand } from "../../commands/health.js";
import { sessionsCleanupCommand } from "../../commands/sessions-cleanup.js";
import { sessionsCommand } from "../../commands/sessions.js";
import { statusCommand } from "../../commands/status.js";
import { setVerbose } from "../../globals.js";
@@ -111,18 +112,22 @@ export function registerStatusHealthSessionsCommands(program: Command) {
      });
    });

  const sessionsCmd = program
    .command("sessions")
    .description("List stored conversation sessions")
    .option("--json", "Output as JSON", false)
    .option("--verbose", "Verbose logging", false)
    .option("--store <path>", "Path to session store (default: resolved from config)")
    .option("--agent <id>", "Agent id to inspect (default: configured default agent)")
    .option("--all-agents", "Aggregate sessions across all configured agents", false)
    .option("--active <minutes>", "Only show sessions updated within the past N minutes")
    .addHelpText(
      "after",
      () =>
        `\n${theme.heading("Examples:")}\n${formatHelpExamples([
          ["openclaw sessions", "List all sessions."],
          ["openclaw sessions --agent work", "List sessions for one agent."],
          ["openclaw sessions --all-agents", "Aggregate sessions across agents."],
          ["openclaw sessions --active 120", "Only last 2 hours."],
          ["openclaw sessions --json", "Machine-readable output."],
          ["openclaw sessions --store ./tmp/sessions.json", "Use a specific session store."],
@@ -141,9 +146,61 @@ export function registerStatusHealthSessionsCommands(program: Command) {
        {
          json: Boolean(opts.json),
          store: opts.store as string | undefined,
          agent: opts.agent as string | undefined,
          allAgents: Boolean(opts.allAgents),
          active: opts.active as string | undefined,
        },
        defaultRuntime,
      );
    });
  sessionsCmd.enablePositionalOptions();

  sessionsCmd
    .command("cleanup")
    .description("Run session-store maintenance now")
    .option("--store <path>", "Path to session store (default: resolved from config)")
    .option("--agent <id>", "Agent id to maintain (default: configured default agent)")
    .option("--all-agents", "Run maintenance across all configured agents", false)
    .option("--dry-run", "Preview maintenance actions without writing", false)
    .option("--enforce", "Apply maintenance even when configured mode is warn", false)
    .option("--active-key <key>", "Protect this session key from budget-eviction")
    .option("--json", "Output JSON", false)
    .addHelpText(
      "after",
      () =>
        `\n${theme.heading("Examples:")}\n${formatHelpExamples([
          ["openclaw sessions cleanup --dry-run", "Preview stale/cap cleanup."],
          ["openclaw sessions cleanup --enforce", "Apply maintenance now."],
          ["openclaw sessions cleanup --agent work --dry-run", "Preview one agent store."],
          ["openclaw sessions cleanup --all-agents --dry-run", "Preview all agent stores."],
          [
            "openclaw sessions cleanup --enforce --store ./tmp/sessions.json",
            "Use a specific store.",
          ],
        ])}`,
    )
    .action(async (opts, command) => {
      const parentOpts = command.parent?.opts() as
        | {
            store?: string;
            agent?: string;
            allAgents?: boolean;
            json?: boolean;
          }
        | undefined;
      await runCommandWithRuntime(defaultRuntime, async () => {
        await sessionsCleanupCommand(
          {
            store: (opts.store as string | undefined) ?? parentOpts?.store,
            agent: (opts.agent as string | undefined) ?? parentOpts?.agent,
            allAgents: Boolean(opts.allAgents || parentOpts?.allAgents),
            dryRun: Boolean(opts.dryRun),
            enforce: Boolean(opts.enforce),
            activeKey: opts.activeKey as string | undefined,
            json: Boolean(opts.json || parentOpts?.json),
          },
          defaultRuntime,
        );
      });
    });
}

@@ -30,6 +30,14 @@ describe("program routes", () => {
    await expectRunFalse(["sessions"], ["node", "openclaw", "sessions", "--active"]);
  });

  it("returns false for sessions route when --agent value is missing", async () => {
    await expectRunFalse(["sessions"], ["node", "openclaw", "sessions", "--agent"]);
  });

  it("does not fast-route sessions subcommands", () => {
    expect(findRoutedCommand(["sessions", "cleanup"])).toBeNull();
  });

  it("does not match unknown routes", () => {
    expect(findRoutedCommand(["definitely-not-real"])).toBeNull();
  });

@@ -43,9 +43,16 @@ const routeStatus: RouteSpec = {
};

const routeSessions: RouteSpec = {
-  match: (path) => path[0] === "sessions",
+  // Fast-path only bare `sessions`; subcommands (e.g. `sessions cleanup`)
+  // must fall through to Commander so nested handlers run.
+  match: (path) => path[0] === "sessions" && !path[1],
  run: async (argv) => {
    const json = hasFlag(argv, "--json");
    const allAgents = hasFlag(argv, "--all-agents");
    const agent = getFlagValue(argv, "--agent");
    if (agent === null) {
      return false;
    }
    const store = getFlagValue(argv, "--store");
    if (store === null) {
      return false;
@@ -55,7 +62,7 @@ const routeSessions: RouteSpec = {
      return false;
    }
    const { sessionsCommand } = await import("../../commands/sessions.js");
-    await sessionsCommand({ json, store, active }, defaultRuntime);
+    await sessionsCommand({ json, store, agent, allAgents, active }, defaultRuntime);
    return true;
  },
};

@@ -124,4 +124,51 @@ describe("doctor state integrity oauth dir checks", () => {
    expect(confirmSkipInNonInteractive).toHaveBeenCalledWith(OAUTH_PROMPT_MATCHER);
    expect(stateIntegrityText()).toContain("CRITICAL: OAuth dir missing");
  });

  it("detects orphan transcripts and offers archival remediation", async () => {
    const cfg: OpenClawConfig = {};
    setupSessionState(cfg, process.env, process.env.HOME ?? "");
    const sessionsDir = resolveSessionTranscriptsDirForAgent("main", process.env, () => tempHome);
    fs.writeFileSync(path.join(sessionsDir, "orphan-session.jsonl"), '{"type":"session"}\n');
    const confirmSkipInNonInteractive = vi.fn(async (params: { message: string }) =>
      params.message.includes("orphan transcript file"),
    );
    await noteStateIntegrity(cfg, { confirmSkipInNonInteractive });
    expect(stateIntegrityText()).toContain("orphan transcript file");
    expect(confirmSkipInNonInteractive).toHaveBeenCalledWith(
      expect.objectContaining({
        message: expect.stringContaining("orphan transcript file"),
      }),
    );
    const files = fs.readdirSync(sessionsDir);
    expect(files.some((name) => name.startsWith("orphan-session.jsonl.deleted."))).toBe(true);
  });

  it("prints openclaw-only verification hints when recent sessions are missing transcripts", async () => {
    const cfg: OpenClawConfig = {};
    setupSessionState(cfg, process.env, process.env.HOME ?? "");
    const storePath = resolveStorePath(cfg.session?.store, { agentId: "main" });
    fs.writeFileSync(
      storePath,
      JSON.stringify(
        {
          "agent:main:main": {
            sessionId: "missing-transcript",
            updatedAt: Date.now(),
          },
        },
        null,
        2,
      ),
    );

    await noteStateIntegrity(cfg, { confirmSkipInNonInteractive: vi.fn(async () => false) });

    const text = stateIntegrityText();
    expect(text).toContain("recent sessions are missing transcripts");
    expect(text).toMatch(/openclaw sessions --store ".*sessions\.json"/);
    expect(text).toMatch(/openclaw sessions cleanup --store ".*sessions\.json" --dry-run/);
    expect(text).not.toContain("--active");
    expect(text).not.toContain(" ls ");
  });
});

@@ -2,9 +2,12 @@ import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { resolveDefaultAgentId } from "../agents/agent-scope.js";
import { formatCliCommand } from "../cli/command-format.js";
import type { OpenClawConfig } from "../config/config.js";
import { resolveOAuthDir, resolveStateDir } from "../config/paths.js";
import {
  formatSessionArchiveTimestamp,
  isPrimarySessionTranscriptFileName,
  loadSessionStore,
  resolveMainSessionKey,
  resolveSessionFilePath,
@@ -202,6 +205,7 @@ export async function noteStateIntegrity(
  const sessionsDir = resolveSessionTranscriptsDirForAgent(agentId, env, homedir);
  const storePath = resolveStorePath(cfg.session?.store, { agentId });
  const storeDir = path.dirname(storePath);
  const absoluteStorePath = path.resolve(storePath);
  const displayStateDir = shortenHomePath(stateDir);
  const displayOauthDir = shortenHomePath(oauthDir);
  const displaySessionsDir = shortenHomePath(sessionsDir);
@@ -408,7 +412,11 @@ export async function noteStateIntegrity(
  });
  if (missing.length > 0) {
    warnings.push(
-      `- ${missing.length}/${recent.length} recent sessions are missing transcripts. Check for deleted session files or split state dirs.`,
+      [
+        `- ${missing.length}/${recent.length} recent sessions are missing transcripts.`,
+        `  Verify sessions in store: ${formatCliCommand(`openclaw sessions --store "${absoluteStorePath}"`)}`,
+        `  Preview cleanup impact: ${formatCliCommand(`openclaw sessions cleanup --store "${absoluteStorePath}" --dry-run`)}`,
+      ].join("\n"),
    );
  }

@@ -435,6 +443,54 @@ export async function noteStateIntegrity(
    }
  }

  if (existsDir(sessionsDir)) {
    const referencedTranscriptPaths = new Set<string>();
    for (const [, entry] of entries) {
      if (!entry?.sessionId) {
        continue;
      }
      try {
        referencedTranscriptPaths.add(
          path.resolve(resolveSessionFilePath(entry.sessionId, entry, sessionPathOpts)),
        );
      } catch {
        // ignore invalid legacy paths
      }
    }
    const sessionDirEntries = fs.readdirSync(sessionsDir, { withFileTypes: true });
    const orphanTranscriptPaths = sessionDirEntries
      .filter((entry) => entry.isFile() && isPrimarySessionTranscriptFileName(entry.name))
      .map((entry) => path.resolve(path.join(sessionsDir, entry.name)))
      .filter((filePath) => !referencedTranscriptPaths.has(filePath));
    if (orphanTranscriptPaths.length > 0) {
      warnings.push(
        `- Found ${orphanTranscriptPaths.length} orphan transcript file(s) in ${displaySessionsDir}. They are not referenced by sessions.json and can consume disk over time.`,
      );
      const archiveOrphans = await prompter.confirmSkipInNonInteractive({
        message: `Archive ${orphanTranscriptPaths.length} orphan transcript file(s) in ${displaySessionsDir}?`,
        initialValue: false,
      });
      if (archiveOrphans) {
        let archived = 0;
        const archivedAt = formatSessionArchiveTimestamp();
        for (const orphanPath of orphanTranscriptPaths) {
          const archivedPath = `${orphanPath}.deleted.${archivedAt}`;
          try {
            fs.renameSync(orphanPath, archivedPath);
            archived += 1;
          } catch (err) {
            warnings.push(
              `- Failed to archive orphan transcript ${shortenHomePath(orphanPath)}: ${String(err)}`,
            );
          }
        }
        if (archived > 0) {
          changes.push(`- Archived ${archived} orphan transcript file(s) in ${displaySessionsDir}`);
        }
      }
    }
  }

  if (warnings.length > 0) {
    note(warnings.join("\n"), "State integrity");
  }

79
src/commands/session-store-targets.test.ts
Normal file
@@ -0,0 +1,79 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import { resolveSessionStoreTargets } from "./session-store-targets.js";

const resolveStorePathMock = vi.hoisted(() => vi.fn());
const resolveDefaultAgentIdMock = vi.hoisted(() => vi.fn());
const listAgentIdsMock = vi.hoisted(() => vi.fn());

vi.mock("../config/sessions.js", () => ({
  resolveStorePath: resolveStorePathMock,
}));

vi.mock("../agents/agent-scope.js", () => ({
  resolveDefaultAgentId: resolveDefaultAgentIdMock,
  listAgentIds: listAgentIdsMock,
}));

describe("resolveSessionStoreTargets", () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it("resolves the default agent store when no selector is provided", () => {
    resolveDefaultAgentIdMock.mockReturnValue("main");
    resolveStorePathMock.mockReturnValue("/tmp/main-sessions.json");

    const targets = resolveSessionStoreTargets({}, {});

    expect(targets).toEqual([{ agentId: "main", storePath: "/tmp/main-sessions.json" }]);
    expect(resolveStorePathMock).toHaveBeenCalledWith(undefined, { agentId: "main" });
  });

  it("resolves all configured agent stores", () => {
    listAgentIdsMock.mockReturnValue(["main", "work"]);
    resolveStorePathMock
      .mockReturnValueOnce("/tmp/main-sessions.json")
      .mockReturnValueOnce("/tmp/work-sessions.json");

    const targets = resolveSessionStoreTargets(
      {
        session: { store: "~/.openclaw/agents/{agentId}/sessions/sessions.json" },
      },
      { allAgents: true },
    );

    expect(targets).toEqual([
      { agentId: "main", storePath: "/tmp/main-sessions.json" },
      { agentId: "work", storePath: "/tmp/work-sessions.json" },
    ]);
  });

  it("dedupes shared store paths for --all-agents", () => {
    listAgentIdsMock.mockReturnValue(["main", "work"]);
    resolveStorePathMock.mockReturnValue("/tmp/shared-sessions.json");

    const targets = resolveSessionStoreTargets(
      {
        session: { store: "/tmp/shared-sessions.json" },
      },
      { allAgents: true },
    );

    expect(targets).toEqual([{ agentId: "main", storePath: "/tmp/shared-sessions.json" }]);
    expect(resolveStorePathMock).toHaveBeenCalledTimes(2);
  });

  it("rejects unknown agent ids", () => {
    listAgentIdsMock.mockReturnValue(["main", "work"]);
    expect(() => resolveSessionStoreTargets({}, { agent: "ghost" })).toThrow(/Unknown agent id/);
  });

  it("rejects conflicting selectors", () => {
    expect(() => resolveSessionStoreTargets({}, { agent: "main", allAgents: true })).toThrow(
      /cannot be used together/i,
    );
    expect(() =>
      resolveSessionStoreTargets({}, { store: "/tmp/sessions.json", allAgents: true }),
    ).toThrow(/cannot be combined/i);
  });
});
80
src/commands/session-store-targets.ts
Normal file
@@ -0,0 +1,80 @@
import { listAgentIds, resolveDefaultAgentId } from "../agents/agent-scope.js";
import { resolveStorePath } from "../config/sessions.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { normalizeAgentId } from "../routing/session-key.js";

export type SessionStoreSelectionOptions = {
  store?: string;
  agent?: string;
  allAgents?: boolean;
};

export type SessionStoreTarget = {
  agentId: string;
  storePath: string;
};

function dedupeTargetsByStorePath(targets: SessionStoreTarget[]): SessionStoreTarget[] {
  const deduped = new Map<string, SessionStoreTarget>();
  for (const target of targets) {
    if (!deduped.has(target.storePath)) {
      deduped.set(target.storePath, target);
    }
  }
  return [...deduped.values()];
}

export function resolveSessionStoreTargets(
  cfg: OpenClawConfig,
  opts: SessionStoreSelectionOptions,
): SessionStoreTarget[] {
  const defaultAgentId = resolveDefaultAgentId(cfg);
  const hasAgent = Boolean(opts.agent?.trim());
  const allAgents = opts.allAgents === true;
  if (hasAgent && allAgents) {
    throw new Error("--agent and --all-agents cannot be used together");
  }
  if (opts.store && (hasAgent || allAgents)) {
    throw new Error("--store cannot be combined with --agent or --all-agents");
  }

  if (opts.store) {
    return [
      {
        agentId: defaultAgentId,
        storePath: resolveStorePath(opts.store, { agentId: defaultAgentId }),
      },
    ];
  }

  if (allAgents) {
    const targets = listAgentIds(cfg).map((agentId) => ({
      agentId,
      storePath: resolveStorePath(cfg.session?.store, { agentId }),
    }));
    return dedupeTargetsByStorePath(targets);
  }

  if (hasAgent) {
    const knownAgents = listAgentIds(cfg);
    const requested = normalizeAgentId(opts.agent ?? "");
    if (!knownAgents.includes(requested)) {
      throw new Error(
        `Unknown agent id "${opts.agent}". Use "openclaw agents list" to see configured agents.`,
      );
    }
    return [
      {
        agentId: requested,
        storePath: resolveStorePath(cfg.session?.store, { agentId: requested }),
      },
    ];
  }

  return [
    {
      agentId: defaultAgentId,
      storePath: resolveStorePath(cfg.session?.store, { agentId: defaultAgentId }),
    },
  ];
}
246
src/commands/sessions-cleanup.test.ts
Normal file
@@ -0,0 +1,246 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import type { SessionEntry } from "../config/sessions.js";
import type { RuntimeEnv } from "../runtime.js";

const mocks = vi.hoisted(() => ({
  loadConfig: vi.fn(),
  resolveSessionStoreTargets: vi.fn(),
  resolveMaintenanceConfig: vi.fn(),
  loadSessionStore: vi.fn(),
  pruneStaleEntries: vi.fn(),
  capEntryCount: vi.fn(),
  updateSessionStore: vi.fn(),
  enforceSessionDiskBudget: vi.fn(),
}));

vi.mock("../config/config.js", () => ({
  loadConfig: mocks.loadConfig,
}));

vi.mock("./session-store-targets.js", () => ({
  resolveSessionStoreTargets: mocks.resolveSessionStoreTargets,
}));

vi.mock("../config/sessions.js", () => ({
  resolveMaintenanceConfig: mocks.resolveMaintenanceConfig,
  loadSessionStore: mocks.loadSessionStore,
  pruneStaleEntries: mocks.pruneStaleEntries,
  capEntryCount: mocks.capEntryCount,
  updateSessionStore: mocks.updateSessionStore,
  enforceSessionDiskBudget: mocks.enforceSessionDiskBudget,
}));

import { sessionsCleanupCommand } from "./sessions-cleanup.js";

function makeRuntime(): { runtime: RuntimeEnv; logs: string[] } {
  const logs: string[] = [];
  return {
    runtime: {
      log: (msg: unknown) => logs.push(String(msg)),
      error: () => {},
      exit: () => {},
    },
    logs,
  };
}

describe("sessionsCleanupCommand", () => {
  beforeEach(() => {
    vi.clearAllMocks();
    mocks.loadConfig.mockReturnValue({ session: { store: "/cfg/sessions.json" } });
    mocks.resolveSessionStoreTargets.mockReturnValue([
      { agentId: "main", storePath: "/resolved/sessions.json" },
    ]);
    mocks.resolveMaintenanceConfig.mockReturnValue({
      mode: "warn",
      pruneAfterMs: 7 * 24 * 60 * 60 * 1000,
      maxEntries: 500,
      rotateBytes: 10_485_760,
      resetArchiveRetentionMs: 7 * 24 * 60 * 60 * 1000,
      maxDiskBytes: null,
      highWaterBytes: null,
    });
    mocks.pruneStaleEntries.mockImplementation(
      (
        store: Record<string, SessionEntry>,
        _maxAgeMs: number,
        opts?: { onPruned?: (params: { key: string; entry: SessionEntry }) => void },
      ) => {
        if (store.stale) {
          opts?.onPruned?.({ key: "stale", entry: store.stale });
          delete store.stale;
          return 1;
        }
        return 0;
      },
    );
    mocks.capEntryCount.mockImplementation(() => 0);
    mocks.updateSessionStore.mockResolvedValue(undefined);
    mocks.enforceSessionDiskBudget.mockResolvedValue({
      totalBytesBefore: 1000,
      totalBytesAfter: 700,
      removedFiles: 1,
      removedEntries: 1,
      freedBytes: 300,
      maxBytes: 900,
      highWaterBytes: 700,
      overBudget: true,
    });
  });

  it("emits a single JSON object for non-dry runs and applies maintenance", async () => {
    mocks.loadSessionStore
      .mockReturnValueOnce({
        stale: { sessionId: "stale", updatedAt: 1 },
        fresh: { sessionId: "fresh", updatedAt: 2 },
      })
      .mockReturnValueOnce({
        fresh: { sessionId: "fresh", updatedAt: 2 },
      });
    mocks.updateSessionStore.mockImplementation(
      async (
        _storePath: string,
        mutator: (store: Record<string, SessionEntry>) => Promise<void> | void,
        opts?: {
          onMaintenanceApplied?: (report: {
            mode: "warn" | "enforce";
            beforeCount: number;
            afterCount: number;
            pruned: number;
            capped: number;
            diskBudget: Record<string, unknown> | null;
          }) => Promise<void> | void;
        },
      ) => {
        await mutator({});
        await opts?.onMaintenanceApplied?.({
          mode: "enforce",
          beforeCount: 3,
          afterCount: 1,
          pruned: 0,
          capped: 2,
          diskBudget: {
            totalBytesBefore: 1200,
            totalBytesAfter: 800,
            removedFiles: 0,
            removedEntries: 0,
            freedBytes: 400,
            maxBytes: 1000,
            highWaterBytes: 800,
            overBudget: true,
          },
        });
      },
    );

    const { runtime, logs } = makeRuntime();
    await sessionsCleanupCommand(
      {
        json: true,
        enforce: true,
        activeKey: "agent:main:main",
      },
      runtime,
    );

    expect(logs).toHaveLength(1);
    const payload = JSON.parse(logs[0] ?? "{}") as Record<string, unknown>;
    expect(payload.applied).toBe(true);
    expect(payload.mode).toBe("enforce");
    expect(payload.beforeCount).toBe(3);
    expect(payload.appliedCount).toBe(1);
    expect(payload.pruned).toBe(0);
    expect(payload.capped).toBe(2);
    expect(payload.diskBudget).toEqual(
      expect.objectContaining({
        removedFiles: 0,
        removedEntries: 0,
      }),
    );
    expect(mocks.updateSessionStore).toHaveBeenCalledWith(
      "/resolved/sessions.json",
      expect.any(Function),
      expect.objectContaining({
        activeSessionKey: "agent:main:main",
        maintenanceOverride: { mode: "enforce" },
        onMaintenanceApplied: expect.any(Function),
      }),
    );
  });

  it("returns dry-run JSON without mutating the store", async () => {
    mocks.loadSessionStore.mockReturnValue({
      stale: { sessionId: "stale", updatedAt: 1 },
      fresh: { sessionId: "fresh", updatedAt: 2 },
    });

    const { runtime, logs } = makeRuntime();
    await sessionsCleanupCommand(
      {
        json: true,
        dryRun: true,
      },
      runtime,
    );

    expect(logs).toHaveLength(1);
    const payload = JSON.parse(logs[0] ?? "{}") as Record<string, unknown>;
    expect(payload.dryRun).toBe(true);
    expect(payload.applied).toBeUndefined();
    expect(mocks.updateSessionStore).not.toHaveBeenCalled();
    expect(payload.diskBudget).toEqual(
      expect.objectContaining({
        removedFiles: 1,
        removedEntries: 1,
      }),
    );
  });

  it("renders a dry-run action table with keep/prune actions", async () => {
    mocks.enforceSessionDiskBudget.mockResolvedValue(null);
    mocks.loadSessionStore.mockReturnValue({
      stale: { sessionId: "stale", updatedAt: 1, model: "pi:opus" },
      fresh: { sessionId: "fresh", updatedAt: 2, model: "pi:opus" },
    });

    const { runtime, logs } = makeRuntime();
    await sessionsCleanupCommand(
      {
        dryRun: true,
      },
      runtime,
    );

    expect(logs.some((line) => line.includes("Planned session actions:"))).toBe(true);
    expect(logs.some((line) => line.includes("Action") && line.includes("Key"))).toBe(true);
    expect(logs.some((line) => line.includes("fresh") && line.includes("keep"))).toBe(true);
    expect(logs.some((line) => line.includes("stale") && line.includes("prune-stale"))).toBe(true);
  });

  it("returns grouped JSON for --all-agents dry-runs", async () => {
    mocks.resolveSessionStoreTargets.mockReturnValue([
      { agentId: "main", storePath: "/resolved/main-sessions.json" },
      { agentId: "work", storePath: "/resolved/work-sessions.json" },
    ]);
    mocks.enforceSessionDiskBudget.mockResolvedValue(null);
    mocks.loadSessionStore
      .mockReturnValueOnce({ stale: { sessionId: "stale-main", updatedAt: 1 } })
      .mockReturnValueOnce({ stale: { sessionId: "stale-work", updatedAt: 1 } });

    const { runtime, logs } = makeRuntime();
    await sessionsCleanupCommand(
      {
        json: true,
        dryRun: true,
        allAgents: true,
      },
      runtime,
    );

    expect(logs).toHaveLength(1);
    const payload = JSON.parse(logs[0] ?? "{}") as Record<string, unknown>;
    expect(payload.allAgents).toBe(true);
    expect(Array.isArray(payload.stores)).toBe(true);
    expect((payload.stores as unknown[]).length).toBe(2);
  });
});
397
src/commands/sessions-cleanup.ts
Normal file
@@ -0,0 +1,397 @@
import { loadConfig } from "../config/config.js";
import {
  capEntryCount,
  enforceSessionDiskBudget,
  loadSessionStore,
  pruneStaleEntries,
  resolveMaintenanceConfig,
  updateSessionStore,
  type SessionEntry,
  type SessionMaintenanceApplyReport,
} from "../config/sessions.js";
import type { RuntimeEnv } from "../runtime.js";
import { isRich, theme } from "../terminal/theme.js";
import { resolveSessionStoreTargets, type SessionStoreTarget } from "./session-store-targets.js";
import {
  formatSessionAgeCell,
  formatSessionFlagsCell,
  formatSessionKeyCell,
  formatSessionModelCell,
  resolveSessionDisplayDefaults,
  resolveSessionDisplayModel,
  SESSION_AGE_PAD,
  SESSION_KEY_PAD,
  SESSION_MODEL_PAD,
  toSessionDisplayRows,
} from "./sessions-table.js";

export type SessionsCleanupOptions = {
  store?: string;
  agent?: string;
  allAgents?: boolean;
  dryRun?: boolean;
  enforce?: boolean;
  activeKey?: string;
  json?: boolean;
};

type SessionCleanupAction = "keep" | "prune-stale" | "cap-overflow" | "evict-budget";

const ACTION_PAD = 12;

type SessionCleanupActionRow = ReturnType<typeof toSessionDisplayRows>[number] & {
  action: SessionCleanupAction;
};

type SessionCleanupSummary = {
  agentId: string;
  storePath: string;
  mode: "warn" | "enforce";
  dryRun: boolean;
  beforeCount: number;
  afterCount: number;
  pruned: number;
  capped: number;
  diskBudget: Awaited<ReturnType<typeof enforceSessionDiskBudget>>;
  wouldMutate: boolean;
  applied?: true;
  appliedCount?: number;
};

function resolveSessionCleanupAction(params: {
  key: string;
  staleKeys: Set<string>;
  cappedKeys: Set<string>;
  budgetEvictedKeys: Set<string>;
}): SessionCleanupAction {
  if (params.staleKeys.has(params.key)) {
    return "prune-stale";
  }
  if (params.cappedKeys.has(params.key)) {
    return "cap-overflow";
  }
  if (params.budgetEvictedKeys.has(params.key)) {
    return "evict-budget";
  }
  return "keep";
}

function formatCleanupActionCell(action: SessionCleanupAction, rich: boolean): string {
  const label = action.padEnd(ACTION_PAD);
  if (!rich) {
    return label;
  }
  if (action === "keep") {
    return theme.muted(label);
  }
  if (action === "prune-stale") {
    return theme.warn(label);
  }
  if (action === "cap-overflow") {
    return theme.accentBright(label);
  }
  return theme.error(label);
}

function buildActionRows(params: {
  beforeStore: Record<string, SessionEntry>;
  staleKeys: Set<string>;
  cappedKeys: Set<string>;
  budgetEvictedKeys: Set<string>;
}): SessionCleanupActionRow[] {
  return toSessionDisplayRows(params.beforeStore).map((row) => ({
    ...row,
    action: resolveSessionCleanupAction({
      key: row.key,
      staleKeys: params.staleKeys,
      cappedKeys: params.cappedKeys,
      budgetEvictedKeys: params.budgetEvictedKeys,
    }),
  }));
}

async function previewStoreCleanup(params: {
  target: SessionStoreTarget;
  mode: "warn" | "enforce";
  dryRun: boolean;
  activeKey?: string;
}) {
  const maintenance = resolveMaintenanceConfig();
  const beforeStore = loadSessionStore(params.target.storePath, { skipCache: true });
  const previewStore = structuredClone(beforeStore);
  const staleKeys = new Set<string>();
  const cappedKeys = new Set<string>();
  const pruned = pruneStaleEntries(previewStore, maintenance.pruneAfterMs, {
    log: false,
    onPruned: ({ key }) => {
      staleKeys.add(key);
    },
  });
  const capped = capEntryCount(previewStore, maintenance.maxEntries, {
    log: false,
    onCapped: ({ key }) => {
      cappedKeys.add(key);
    },
  });
  const beforeBudgetStore = structuredClone(previewStore);
  const diskBudget = await enforceSessionDiskBudget({
    store: previewStore,
    storePath: params.target.storePath,
    activeSessionKey: params.activeKey,
    maintenance,
    warnOnly: false,
    dryRun: true,
  });
  const budgetEvictedKeys = new Set<string>();
  for (const key of Object.keys(beforeBudgetStore)) {
    if (!Object.hasOwn(previewStore, key)) {
      budgetEvictedKeys.add(key);
    }
  }
  const beforeCount = Object.keys(beforeStore).length;
  const afterPreviewCount = Object.keys(previewStore).length;
  const wouldMutate =
    pruned > 0 ||
    capped > 0 ||
    Boolean((diskBudget?.removedEntries ?? 0) > 0 || (diskBudget?.removedFiles ?? 0) > 0);

  const summary: SessionCleanupSummary = {
    agentId: params.target.agentId,
    storePath: params.target.storePath,
    mode: params.mode,
    dryRun: params.dryRun,
    beforeCount,
    afterCount: afterPreviewCount,
    pruned,
    capped,
    diskBudget,
    wouldMutate,
  };

  return {
    summary,
    actionRows: buildActionRows({
      beforeStore,
      staleKeys,
      cappedKeys,
      budgetEvictedKeys,
    }),
  };
}

function renderStoreDryRunPlan(params: {
  cfg: ReturnType<typeof loadConfig>;
  summary: SessionCleanupSummary;
  actionRows: SessionCleanupActionRow[];
  displayDefaults: ReturnType<typeof resolveSessionDisplayDefaults>;
  runtime: RuntimeEnv;
  showAgentHeader: boolean;
}) {
  const rich = isRich();
  if (params.showAgentHeader) {
    params.runtime.log(`Agent: ${params.summary.agentId}`);
  }
  params.runtime.log(`Session store: ${params.summary.storePath}`);
  params.runtime.log(`Maintenance mode: ${params.summary.mode}`);
  params.runtime.log(
    `Entries: ${params.summary.beforeCount} -> ${params.summary.afterCount} (remove ${params.summary.beforeCount - params.summary.afterCount})`,
  );
  params.runtime.log(`Would prune stale: ${params.summary.pruned}`);
  params.runtime.log(`Would cap overflow: ${params.summary.capped}`);
  if (params.summary.diskBudget) {
    params.runtime.log(
      `Would enforce disk budget: ${params.summary.diskBudget.totalBytesBefore} -> ${params.summary.diskBudget.totalBytesAfter} bytes (files ${params.summary.diskBudget.removedFiles}, entries ${params.summary.diskBudget.removedEntries})`,
    );
  }
  if (params.actionRows.length === 0) {
    return;
  }
  params.runtime.log("");
  params.runtime.log("Planned session actions:");
  const header = [
    "Action".padEnd(ACTION_PAD),
    "Key".padEnd(SESSION_KEY_PAD),
    "Age".padEnd(SESSION_AGE_PAD),
    "Model".padEnd(SESSION_MODEL_PAD),
    "Flags",
  ].join(" ");
  params.runtime.log(rich ? theme.heading(header) : header);
  for (const actionRow of params.actionRows) {
    const model = resolveSessionDisplayModel(params.cfg, actionRow, params.displayDefaults);
    const line = [
      formatCleanupActionCell(actionRow.action, rich),
      formatSessionKeyCell(actionRow.key, rich),
      formatSessionAgeCell(actionRow.updatedAt, rich),
      formatSessionModelCell(model, rich),
      formatSessionFlagsCell(actionRow, rich),
    ].join(" ");
    params.runtime.log(line.trimEnd());
  }
}

export async function sessionsCleanupCommand(opts: SessionsCleanupOptions, runtime: RuntimeEnv) {
  const cfg = loadConfig();
  const displayDefaults = resolveSessionDisplayDefaults(cfg);
  const mode = opts.enforce ? "enforce" : resolveMaintenanceConfig().mode;
  let targets: SessionStoreTarget[];
  try {
    targets = resolveSessionStoreTargets(cfg, {
      store: opts.store,
      agent: opts.agent,
      allAgents: opts.allAgents,
    });
  } catch (error) {
    runtime.error(error instanceof Error ? error.message : String(error));
    runtime.exit(1);
    return;
  }

  const previewResults: Array<{
    summary: SessionCleanupSummary;
    actionRows: SessionCleanupActionRow[];
  }> = [];
  for (const target of targets) {
    const result = await previewStoreCleanup({
      target,
      mode,
      dryRun: Boolean(opts.dryRun),
      activeKey: opts.activeKey,
    });
    previewResults.push(result);
  }

  if (opts.dryRun) {
    if (opts.json) {
      if (previewResults.length === 1) {
        runtime.log(JSON.stringify(previewResults[0]?.summary ?? {}, null, 2));
        return;
      }
      runtime.log(
        JSON.stringify(
          {
            allAgents: true,
            mode,
            dryRun: true,
            stores: previewResults.map((result) => result.summary),
          },
          null,
          2,
        ),
      );
      return;
    }

    for (let i = 0; i < previewResults.length; i += 1) {
      const result = previewResults[i];
      if (i > 0) {
        runtime.log("");
      }
      renderStoreDryRunPlan({
        cfg,
        summary: result.summary,
        actionRows: result.actionRows,
        displayDefaults,
        runtime,
        showAgentHeader: previewResults.length > 1,
      });
    }
    return;
  }

  const appliedSummaries: SessionCleanupSummary[] = [];
  for (const target of targets) {
    const appliedReportRef: { current: SessionMaintenanceApplyReport | null } = {
      current: null,
    };
    await updateSessionStore(
      target.storePath,
      async () => {
        // Maintenance runs in saveSessionStoreUnlocked(); no direct store mutation needed here.
      },
      {
        activeSessionKey: opts.activeKey,
        maintenanceOverride: {
          mode,
        },
        onMaintenanceApplied: (report) => {
          appliedReportRef.current = report;
        },
      },
    );
    const afterStore = loadSessionStore(target.storePath, { skipCache: true });
    const preview = previewResults.find((result) => result.summary.storePath === target.storePath);
|
||||
const appliedReport = appliedReportRef.current;
|
||||
const summary: SessionCleanupSummary =
|
||||
appliedReport === null
|
||||
? {
|
||||
...(preview?.summary ?? {
|
||||
agentId: target.agentId,
|
||||
storePath: target.storePath,
|
||||
mode,
|
||||
dryRun: false,
|
||||
beforeCount: 0,
|
||||
afterCount: 0,
|
||||
pruned: 0,
|
||||
capped: 0,
|
||||
diskBudget: null,
|
||||
wouldMutate: false,
|
||||
}),
|
||||
dryRun: false,
|
||||
applied: true,
|
||||
appliedCount: Object.keys(afterStore).length,
|
||||
}
|
||||
: {
|
||||
agentId: target.agentId,
|
||||
storePath: target.storePath,
|
||||
mode: appliedReport.mode,
|
||||
dryRun: false,
|
||||
beforeCount: appliedReport.beforeCount,
|
||||
afterCount: appliedReport.afterCount,
|
||||
pruned: appliedReport.pruned,
|
||||
capped: appliedReport.capped,
|
||||
diskBudget: appliedReport.diskBudget,
|
||||
wouldMutate:
|
||||
appliedReport.pruned > 0 ||
|
||||
appliedReport.capped > 0 ||
|
||||
Boolean(
|
||||
(appliedReport.diskBudget?.removedEntries ?? 0) > 0 ||
|
||||
(appliedReport.diskBudget?.removedFiles ?? 0) > 0,
|
||||
),
|
||||
applied: true,
|
||||
appliedCount: Object.keys(afterStore).length,
|
||||
};
|
||||
appliedSummaries.push(summary);
|
||||
}
|
||||
|
||||
if (opts.json) {
|
||||
if (appliedSummaries.length === 1) {
|
||||
runtime.log(JSON.stringify(appliedSummaries[0] ?? {}, null, 2));
|
||||
return;
|
||||
}
|
||||
runtime.log(
|
||||
JSON.stringify(
|
||||
{
|
||||
allAgents: true,
|
||||
mode,
|
||||
dryRun: false,
|
||||
stores: appliedSummaries,
|
||||
},
|
||||
null,
|
||||
2,
|
||||
),
|
||||
);
|
||||
return;
|
||||
}
|
||||
|
||||
for (let i = 0; i < appliedSummaries.length; i += 1) {
|
||||
const summary = appliedSummaries[i];
|
||||
if (i > 0) {
|
||||
runtime.log("");
|
||||
}
|
||||
if (appliedSummaries.length > 1) {
|
||||
runtime.log(`Agent: ${summary.agentId}`);
|
||||
}
|
||||
runtime.log(`Session store: ${summary.storePath}`);
|
||||
runtime.log(`Applied maintenance. Current entries: ${summary.appliedCount ?? 0}`);
|
||||
}
|
||||
}
|
||||
src/commands/sessions-table.ts (new file, 148 lines)
@@ -0,0 +1,148 @@
import { DEFAULT_MODEL, DEFAULT_PROVIDER } from "../agents/defaults.js";
import { resolveConfiguredModelRef } from "../agents/model-selection.js";
import type { SessionEntry } from "../config/sessions.js";
import type { OpenClawConfig } from "../config/types.openclaw.js";
import { resolveSessionModelRef } from "../gateway/session-utils.js";
import { formatTimeAgo } from "../infra/format-time/format-relative.ts";
import { parseAgentSessionKey } from "../routing/session-key.js";
import { theme } from "../terminal/theme.js";

export type SessionDisplayRow = {
  key: string;
  updatedAt: number | null;
  ageMs: number | null;
  sessionId?: string;
  systemSent?: boolean;
  abortedLastRun?: boolean;
  thinkingLevel?: string;
  verboseLevel?: string;
  reasoningLevel?: string;
  elevatedLevel?: string;
  responseUsage?: string;
  groupActivation?: string;
  inputTokens?: number;
  outputTokens?: number;
  totalTokens?: number;
  totalTokensFresh?: boolean;
  model?: string;
  modelProvider?: string;
  providerOverride?: string;
  modelOverride?: string;
  contextTokens?: number;
};

export type SessionDisplayDefaults = {
  model: string;
};

export const SESSION_KEY_PAD = 26;
export const SESSION_AGE_PAD = 9;
export const SESSION_MODEL_PAD = 14;

export function toSessionDisplayRows(store: Record<string, SessionEntry>): SessionDisplayRow[] {
  return Object.entries(store)
    .map(([key, entry]) => {
      const updatedAt = entry?.updatedAt ?? null;
      return {
        key,
        updatedAt,
        ageMs: updatedAt ? Date.now() - updatedAt : null,
        sessionId: entry?.sessionId,
        systemSent: entry?.systemSent,
        abortedLastRun: entry?.abortedLastRun,
        thinkingLevel: entry?.thinkingLevel,
        verboseLevel: entry?.verboseLevel,
        reasoningLevel: entry?.reasoningLevel,
        elevatedLevel: entry?.elevatedLevel,
        responseUsage: entry?.responseUsage,
        groupActivation: entry?.groupActivation,
        inputTokens: entry?.inputTokens,
        outputTokens: entry?.outputTokens,
        totalTokens: entry?.totalTokens,
        totalTokensFresh: entry?.totalTokensFresh,
        model: entry?.model,
        modelProvider: entry?.modelProvider,
        providerOverride: entry?.providerOverride,
        modelOverride: entry?.modelOverride,
        contextTokens: entry?.contextTokens,
      } satisfies SessionDisplayRow;
    })
    .toSorted((a, b) => (b.updatedAt ?? 0) - (a.updatedAt ?? 0));
}

export function resolveSessionDisplayDefaults(cfg: OpenClawConfig): SessionDisplayDefaults {
  const resolved = resolveConfiguredModelRef({
    cfg,
    defaultProvider: DEFAULT_PROVIDER,
    defaultModel: DEFAULT_MODEL,
  });
  return {
    model: resolved.model ?? DEFAULT_MODEL,
  };
}

export function resolveSessionDisplayModel(
  cfg: OpenClawConfig,
  row: Pick<
    SessionDisplayRow,
    "key" | "model" | "modelProvider" | "modelOverride" | "providerOverride"
  >,
  defaults: SessionDisplayDefaults,
): string {
  const resolved = resolveSessionModelRef(cfg, row, parseAgentSessionKey(row.key)?.agentId);
  return resolved.model ?? defaults.model;
}

function truncateSessionKey(key: string): string {
  if (key.length <= SESSION_KEY_PAD) {
    return key;
  }
  const head = Math.max(4, SESSION_KEY_PAD - 10);
  return `${key.slice(0, head)}...${key.slice(-6)}`;
}

export function formatSessionKeyCell(key: string, rich: boolean): string {
  const label = truncateSessionKey(key).padEnd(SESSION_KEY_PAD);
  return rich ? theme.accent(label) : label;
}

export function formatSessionAgeCell(updatedAt: number | null | undefined, rich: boolean): string {
  const ageLabel = updatedAt ? formatTimeAgo(Date.now() - updatedAt) : "unknown";
  const padded = ageLabel.padEnd(SESSION_AGE_PAD);
  return rich ? theme.muted(padded) : padded;
}

export function formatSessionModelCell(model: string | null | undefined, rich: boolean): string {
  const label = (model ?? "unknown").padEnd(SESSION_MODEL_PAD);
  return rich ? theme.info(label) : label;
}

export function formatSessionFlagsCell(
  row: Pick<
    SessionDisplayRow,
    | "thinkingLevel"
    | "verboseLevel"
    | "reasoningLevel"
    | "elevatedLevel"
    | "responseUsage"
    | "groupActivation"
    | "systemSent"
    | "abortedLastRun"
    | "sessionId"
  >,
  rich: boolean,
): string {
  const flags = [
    row.thinkingLevel ? `think:${row.thinkingLevel}` : null,
    row.verboseLevel ? `verbose:${row.verboseLevel}` : null,
    row.reasoningLevel ? `reasoning:${row.reasoningLevel}` : null,
    row.elevatedLevel ? `elev:${row.elevatedLevel}` : null,
    row.responseUsage ? `usage:${row.responseUsage}` : null,
    row.groupActivation ? `activation:${row.groupActivation}` : null,
    row.systemSent ? "system" : null,
    row.abortedLastRun ? "aborted" : null,
    row.sessionId ? `id:${row.sessionId}` : null,
  ].filter(Boolean);
  const label = flags.join(" ");
  return label.length === 0 ? "" : rich ? theme.muted(label) : label;
}
@@ -1,4 +1,4 @@
-import { describe, expect, it, vi } from "vitest";
+import { beforeEach, describe, expect, it, vi } from "vitest";
 import type { RuntimeEnv } from "../runtime.js";

 const loadConfigMock = vi.hoisted(() =>
@@ -25,6 +25,7 @@ const resolveStorePathMock = vi.hoisted(() =>
     return `/tmp/sessions-${opts?.agentId ?? "missing"}.json`;
   }),
 );
+const loadSessionStoreMock = vi.hoisted(() => vi.fn(() => ({})));

 vi.mock("../config/config.js", async (importOriginal) => {
   const actual = await importOriginal<typeof import("../config/config.js")>();
@@ -39,7 +40,7 @@ vi.mock("../config/sessions.js", async (importOriginal) => {
   return {
     ...actual,
     resolveStorePath: resolveStorePathMock,
-    loadSessionStore: vi.fn(() => ({})),
+    loadSessionStore: loadSessionStoreMock,
   };
 });

@@ -58,6 +59,67 @@ function createRuntime(): { runtime: RuntimeEnv; logs: string[] } {
 }

 describe("sessionsCommand default store agent selection", () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+    resolveStorePathMock.mockImplementation(
+      (_store: string | undefined, opts?: { agentId?: string }) => {
+        return `/tmp/sessions-${opts?.agentId ?? "missing"}.json`;
+      },
+    );
+    loadSessionStoreMock.mockImplementation(() => ({}));
+  });
+
+  it("includes agentId on sessions rows for --all-agents JSON output", async () => {
+    resolveStorePathMock.mockClear();
+    loadSessionStoreMock.mockReset();
+    loadSessionStoreMock
+      .mockReturnValueOnce({
+        main_row: { sessionId: "s1", updatedAt: Date.now() - 60_000, model: "pi:opus" },
+      })
+      .mockReturnValueOnce({
+        voice_row: { sessionId: "s2", updatedAt: Date.now() - 120_000, model: "pi:opus" },
+      });
+    const { runtime, logs } = createRuntime();
+
+    await sessionsCommand({ allAgents: true, json: true }, runtime);
+
+    const payload = JSON.parse(logs[0] ?? "{}") as {
+      allAgents?: boolean;
+      sessions?: Array<{ key: string; agentId?: string }>;
+    };
+    expect(payload.allAgents).toBe(true);
+    expect(payload.sessions?.map((session) => session.agentId)).toContain("main");
+    expect(payload.sessions?.map((session) => session.agentId)).toContain("voice");
+  });
+
+  it("avoids duplicate rows when --all-agents resolves to a shared store path", async () => {
+    resolveStorePathMock.mockReset();
+    resolveStorePathMock.mockReturnValue("/tmp/shared-sessions.json");
+    loadSessionStoreMock.mockReset();
+    loadSessionStoreMock.mockReturnValue({
+      "agent:main:room": { sessionId: "s1", updatedAt: Date.now() - 60_000, model: "pi:opus" },
+      "agent:voice:room": { sessionId: "s2", updatedAt: Date.now() - 30_000, model: "pi:opus" },
+    });
+    const { runtime, logs } = createRuntime();
+
+    await sessionsCommand({ allAgents: true, json: true }, runtime);
+
+    const payload = JSON.parse(logs[0] ?? "{}") as {
+      count?: number;
+      stores?: Array<{ agentId: string; path: string }>;
+      allAgents?: boolean;
+      sessions?: Array<{ key: string; agentId?: string }>;
+    };
+    expect(payload.count).toBe(2);
+    expect(payload.allAgents).toBe(true);
+    expect(payload.stores).toEqual([{ agentId: "main", path: "/tmp/shared-sessions.json" }]);
+    expect(payload.sessions?.map((session) => session.agentId).toSorted()).toEqual([
+      "main",
+      "voice",
+    ]);
+    expect(loadSessionStoreMock).toHaveBeenCalledTimes(1);
+  });
+
   it("uses configured default agent id when resolving implicit session store path", async () => {
     resolveStorePathMock.mockClear();
     const { runtime, logs } = createRuntime();
@@ -69,4 +131,26 @@ describe("sessionsCommand default store agent selection", () => {
     });
     expect(logs[0]).toContain("Session store: /tmp/sessions-voice.json");
   });
+
+  it("uses all configured agent stores with --all-agents", async () => {
+    resolveStorePathMock.mockClear();
+    loadSessionStoreMock.mockReset();
+    loadSessionStoreMock
+      .mockReturnValueOnce({
+        main_row: { sessionId: "s1", updatedAt: Date.now() - 60_000, model: "pi:opus" },
+      })
+      .mockReturnValueOnce({});
+    const { runtime, logs } = createRuntime();
+
+    await sessionsCommand({ allAgents: true }, runtime);
+
+    expect(resolveStorePathMock).toHaveBeenCalledWith("/tmp/sessions-{agentId}.json", {
+      agentId: "main",
+    });
+    expect(resolveStorePathMock).toHaveBeenCalledWith("/tmp/sessions-{agentId}.json", {
+      agentId: "voice",
+    });
+    expect(logs[0]).toContain("Session stores: 2 (main, voice)");
+    expect(logs[2]).toContain("Agent");
+  });
 });

@@ -1,62 +1,38 @@
-import { resolveDefaultAgentId } from "../agents/agent-scope.js";
 import { lookupContextTokens } from "../agents/context.js";
-import { DEFAULT_CONTEXT_TOKENS, DEFAULT_MODEL, DEFAULT_PROVIDER } from "../agents/defaults.js";
-import { resolveConfiguredModelRef } from "../agents/model-selection.js";
+import { DEFAULT_CONTEXT_TOKENS } from "../agents/defaults.js";
 import { loadConfig } from "../config/config.js";
-import {
-  loadSessionStore,
-  resolveFreshSessionTotalTokens,
-  resolveStorePath,
-  type SessionEntry,
-} from "../config/sessions.js";
-import { classifySessionKey, resolveSessionModelRef } from "../gateway/session-utils.js";
+import { loadSessionStore, resolveFreshSessionTotalTokens } from "../config/sessions.js";
+import { classifySessionKey } from "../gateway/session-utils.js";
 import { info } from "../globals.js";
 import { formatTimeAgo } from "../infra/format-time/format-relative.ts";
 import { parseAgentSessionKey } from "../routing/session-key.js";
 import type { RuntimeEnv } from "../runtime.js";
 import { isRich, theme } from "../terminal/theme.js";
+import { resolveSessionStoreTargets } from "./session-store-targets.js";
+import {
+  formatSessionAgeCell,
+  formatSessionFlagsCell,
+  formatSessionKeyCell,
+  formatSessionModelCell,
+  resolveSessionDisplayDefaults,
+  resolveSessionDisplayModel,
+  SESSION_AGE_PAD,
+  SESSION_KEY_PAD,
+  SESSION_MODEL_PAD,
+  type SessionDisplayRow,
+  toSessionDisplayRows,
+} from "./sessions-table.js";

-type SessionRow = {
-  key: string;
+type SessionRow = SessionDisplayRow & {
+  agentId: string;
   kind: "direct" | "group" | "global" | "unknown";
-  updatedAt: number | null;
-  ageMs: number | null;
-  sessionId?: string;
-  systemSent?: boolean;
-  abortedLastRun?: boolean;
-  thinkingLevel?: string;
-  verboseLevel?: string;
-  reasoningLevel?: string;
-  elevatedLevel?: string;
-  responseUsage?: string;
-  groupActivation?: string;
-  inputTokens?: number;
-  outputTokens?: number;
-  totalTokens?: number;
-  totalTokensFresh?: boolean;
-  model?: string;
-  modelProvider?: string;
-  providerOverride?: string;
-  modelOverride?: string;
-  contextTokens?: number;
 };

+const AGENT_PAD = 10;
 const KIND_PAD = 6;
-const KEY_PAD = 26;
-const AGE_PAD = 9;
-const MODEL_PAD = 14;
 const TOKENS_PAD = 20;

 const formatKTokens = (value: number) => `${(value / 1000).toFixed(value >= 10_000 ? 0 : 1)}k`;

-const truncateKey = (key: string) => {
-  if (key.length <= KEY_PAD) {
-    return key;
-  }
-  const head = Math.max(4, KEY_PAD - 10);
-  return `${key.slice(0, head)}...${key.slice(-6)}`;
-};
-
 const colorByPct = (label: string, pct: number | null, rich: boolean) => {
   if (!rich || pct === null) {
     return label;
@@ -108,83 +84,29 @@ const formatKindCell = (kind: SessionRow["kind"], rich: boolean) => {
   return theme.muted(label);
 };

-const formatAgeCell = (updatedAt: number | null | undefined, rich: boolean) => {
-  const ageLabel = updatedAt ? formatTimeAgo(Date.now() - updatedAt) : "unknown";
-  const padded = ageLabel.padEnd(AGE_PAD);
-  return rich ? theme.muted(padded) : padded;
-};
-
-const formatModelCell = (model: string | null | undefined, rich: boolean) => {
-  const label = (model ?? "unknown").padEnd(MODEL_PAD);
-  return rich ? theme.info(label) : label;
-};
-
-const formatFlagsCell = (row: SessionRow, rich: boolean) => {
-  const flags = [
-    row.thinkingLevel ? `think:${row.thinkingLevel}` : null,
-    row.verboseLevel ? `verbose:${row.verboseLevel}` : null,
-    row.reasoningLevel ? `reasoning:${row.reasoningLevel}` : null,
-    row.elevatedLevel ? `elev:${row.elevatedLevel}` : null,
-    row.responseUsage ? `usage:${row.responseUsage}` : null,
-    row.groupActivation ? `activation:${row.groupActivation}` : null,
-    row.systemSent ? "system" : null,
-    row.abortedLastRun ? "aborted" : null,
-    row.sessionId ? `id:${row.sessionId}` : null,
-  ].filter(Boolean);
-  const label = flags.join(" ");
-  return label.length === 0 ? "" : rich ? theme.muted(label) : label;
-};
-
-function toRows(store: Record<string, SessionEntry>): SessionRow[] {
-  return Object.entries(store)
-    .map(([key, entry]) => {
-      const updatedAt = entry?.updatedAt ?? null;
-      return {
-        key,
-        kind: classifySessionKey(key, entry),
-        updatedAt,
-        ageMs: updatedAt ? Date.now() - updatedAt : null,
-        sessionId: entry?.sessionId,
-        systemSent: entry?.systemSent,
-        abortedLastRun: entry?.abortedLastRun,
-        thinkingLevel: entry?.thinkingLevel,
-        verboseLevel: entry?.verboseLevel,
-        reasoningLevel: entry?.reasoningLevel,
-        elevatedLevel: entry?.elevatedLevel,
-        responseUsage: entry?.responseUsage,
-        groupActivation: entry?.groupActivation,
-        inputTokens: entry?.inputTokens,
-        outputTokens: entry?.outputTokens,
-        totalTokens: entry?.totalTokens,
-        totalTokensFresh: entry?.totalTokensFresh,
-        model: entry?.model,
-        modelProvider: entry?.modelProvider,
-        providerOverride: entry?.providerOverride,
-        modelOverride: entry?.modelOverride,
-        contextTokens: entry?.contextTokens,
-      } satisfies SessionRow;
-    })
-    .toSorted((a, b) => (b.updatedAt ?? 0) - (a.updatedAt ?? 0));
-}
-
 export async function sessionsCommand(
-  opts: { json?: boolean; store?: string; active?: string },
+  opts: { json?: boolean; store?: string; active?: string; agent?: string; allAgents?: boolean },
   runtime: RuntimeEnv,
 ) {
+  const aggregateAgents = opts.allAgents === true;
   const cfg = loadConfig();
-  const resolved = resolveConfiguredModelRef({
-    cfg,
-    defaultProvider: DEFAULT_PROVIDER,
-    defaultModel: DEFAULT_MODEL,
-  });
+  const displayDefaults = resolveSessionDisplayDefaults(cfg);
   const configContextTokens =
     cfg.agents?.defaults?.contextTokens ??
-    lookupContextTokens(resolved.model) ??
+    lookupContextTokens(displayDefaults.model) ??
     DEFAULT_CONTEXT_TOKENS;
-  const configModel = resolved.model ?? DEFAULT_MODEL;
-  const defaultAgentId = resolveDefaultAgentId(cfg);
-  const storePath = resolveStorePath(opts.store ?? cfg.session?.store, { agentId: defaultAgentId });
-  const store = loadSessionStore(storePath);
+  let targets: ReturnType<typeof resolveSessionStoreTargets>;
+  try {
+    targets = resolveSessionStoreTargets(cfg, {
+      store: opts.store,
+      agent: opts.agent,
+      allAgents: opts.allAgents,
+    });
+  } catch (error) {
+    runtime.error(error instanceof Error ? error.message : String(error));
+    runtime.exit(1);
+    return;
+  }

   let activeMinutes: number | undefined;
   if (opts.active !== undefined) {
@@ -197,30 +119,44 @@ export async function sessionsCommand(
     activeMinutes = parsed;
   }

-  const rows = toRows(store).filter((row) => {
-    if (activeMinutes === undefined) {
-      return true;
-    }
-    if (!row.updatedAt) {
-      return false;
-    }
-    return Date.now() - row.updatedAt <= activeMinutes * 60_000;
-  });
+  const rows = targets
+    .flatMap((target) => {
+      const store = loadSessionStore(target.storePath);
+      return toSessionDisplayRows(store).map((row) => ({
+        ...row,
+        agentId: parseAgentSessionKey(row.key)?.agentId ?? target.agentId,
+        kind: classifySessionKey(row.key, store[row.key]),
+      }));
+    })
+    .filter((row) => {
+      if (activeMinutes === undefined) {
+        return true;
+      }
+      if (!row.updatedAt) {
+        return false;
+      }
+      return Date.now() - row.updatedAt <= activeMinutes * 60_000;
+    })
+    .toSorted((a, b) => (b.updatedAt ?? 0) - (a.updatedAt ?? 0));

   if (opts.json) {
+    const multi = targets.length > 1;
+    const aggregate = aggregateAgents || multi;
     runtime.log(
       JSON.stringify(
         {
-          path: storePath,
+          path: aggregate ? null : (targets[0]?.storePath ?? null),
+          stores: aggregate
+            ? targets.map((target) => ({
+                agentId: target.agentId,
+                path: target.storePath,
+              }))
+            : undefined,
+          allAgents: aggregateAgents ? true : undefined,
           count: rows.length,
           activeMinutes: activeMinutes ?? null,
           sessions: rows.map((r) => {
-            const resolvedModel = resolveSessionModelRef(
-              cfg,
-              r,
-              parseAgentSessionKey(r.key)?.agentId,
-            );
-            const model = resolvedModel.model ?? configModel;
+            const model = resolveSessionDisplayModel(cfg, r, displayDefaults);
             return {
               ...r,
               totalTokens: resolveFreshSessionTotalTokens(r) ?? null,
@@ -239,7 +175,13 @@ export async function sessionsCommand(
     return;
   }

-  runtime.log(info(`Session store: ${storePath}`));
+  if (targets.length === 1 && !aggregateAgents) {
+    runtime.log(info(`Session store: ${targets[0]?.storePath}`));
+  } else {
+    runtime.log(
+      info(`Session stores: ${targets.length} (${targets.map((t) => t.agentId).join(", ")})`),
+    );
+  }
   runtime.log(info(`Sessions listed: ${rows.length}`));
   if (activeMinutes) {
     runtime.log(info(`Filtered to last ${activeMinutes} minute(s)`));
@@ -250,11 +192,13 @@ export async function sessionsCommand(
   }

   const rich = isRich();
+  const showAgentColumn = aggregateAgents || targets.length > 1;
   const header = [
+    ...(showAgentColumn ? ["Agent".padEnd(AGENT_PAD)] : []),
     "Kind".padEnd(KIND_PAD),
-    "Key".padEnd(KEY_PAD),
-    "Age".padEnd(AGE_PAD),
-    "Model".padEnd(MODEL_PAD),
+    "Key".padEnd(SESSION_KEY_PAD),
+    "Age".padEnd(SESSION_AGE_PAD),
+    "Model".padEnd(SESSION_MODEL_PAD),
     "Tokens (ctx %)".padEnd(TOKENS_PAD),
     "Flags",
   ].join(" ");
@@ -262,21 +206,20 @@ export async function sessionsCommand(
   runtime.log(rich ? theme.heading(header) : header);

   for (const row of rows) {
-    const resolvedModel = resolveSessionModelRef(cfg, row, parseAgentSessionKey(row.key)?.agentId);
-    const model = resolvedModel.model ?? configModel;
+    const model = resolveSessionDisplayModel(cfg, row, displayDefaults);
     const contextTokens = row.contextTokens ?? lookupContextTokens(model) ?? configContextTokens;
     const total = resolveFreshSessionTotalTokens(row);

-    const keyLabel = truncateKey(row.key).padEnd(KEY_PAD);
-    const keyCell = rich ? theme.accent(keyLabel) : keyLabel;
-
     const line = [
+      ...(showAgentColumn
+        ? [rich ? theme.accentBright(row.agentId.padEnd(AGENT_PAD)) : row.agentId.padEnd(AGENT_PAD)]
+        : []),
       formatKindCell(row.kind, rich),
-      keyCell,
-      formatAgeCell(row.updatedAt, rich),
-      formatModelCell(model, rich),
+      formatSessionKeyCell(row.key, rich),
+      formatSessionAgeCell(row.updatedAt, rich),
+      formatSessionModelCell(model, rich),
       formatTokensCell(total, contextTokens ?? null, rich),
-      formatFlagsCell(row, rich),
+      formatSessionFlagsCell(row, rich),
     ].join(" ");

     runtime.log(line.trimEnd());

@@ -110,6 +110,9 @@ const TARGET_KEYS = [
   "cron.webhook",
   "cron.webhookToken",
   "cron.sessionRetention",
+  "cron.runLog",
+  "cron.runLog.maxBytes",
+  "cron.runLog.keepLines",
   "session",
   "session.scope",
   "session.dmScope",
@@ -150,6 +153,9 @@ const TARGET_KEYS = [
   "session.maintenance.pruneDays",
   "session.maintenance.maxEntries",
   "session.maintenance.rotateBytes",
+  "session.maintenance.resetArchiveRetention",
+  "session.maintenance.maxDiskBytes",
+  "session.maintenance.highWaterBytes",
   "approvals",
   "approvals.exec",
   "approvals.exec.enabled",
@@ -663,6 +669,27 @@ describe("config help copy quality", () => {
     const deprecated = FIELD_HELP["session.maintenance.pruneDays"];
     expect(/deprecated/i.test(deprecated)).toBe(true);
     expect(deprecated.includes("session.maintenance.pruneAfter")).toBe(true);
+
+    const resetRetention = FIELD_HELP["session.maintenance.resetArchiveRetention"];
+    expect(resetRetention.includes(".reset.")).toBe(true);
+    expect(/false/i.test(resetRetention)).toBe(true);
+
+    const maxDisk = FIELD_HELP["session.maintenance.maxDiskBytes"];
+    expect(maxDisk.includes("500mb")).toBe(true);
+
+    const highWater = FIELD_HELP["session.maintenance.highWaterBytes"];
+    expect(highWater.includes("80%")).toBe(true);
   });
+
+  it("documents cron run-log retention controls", () => {
+    const runLog = FIELD_HELP["cron.runLog"];
+    expect(runLog.includes("cron/runs")).toBe(true);
+
+    const maxBytes = FIELD_HELP["cron.runLog.maxBytes"];
+    expect(maxBytes.includes("2mb")).toBe(true);
+
+    const keepLines = FIELD_HELP["cron.runLog.keepLines"];
+    expect(keepLines.includes("2000")).toBe(true);
+  });

   it("documents approvals filters and target semantics", () => {

@@ -999,6 +999,12 @@ export const FIELD_HELP: Record<string, string> = {
     "Caps total session entry count retained in the store to prevent unbounded growth over time. Use lower limits for constrained environments, or higher limits when longer history is required.",
   "session.maintenance.rotateBytes":
     "Rotates the session store when file size exceeds a threshold such as `10mb` or `1gb`. Use this to bound single-file growth and keep backup/restore operations manageable.",
+  "session.maintenance.resetArchiveRetention":
+    "Retention for reset transcript archives (`*.reset.<timestamp>`). Accepts a duration (for example `30d`), or `false` to disable cleanup. Defaults to pruneAfter so reset artifacts do not grow forever.",
+  "session.maintenance.maxDiskBytes":
+    "Optional per-agent sessions-directory disk budget (for example `500mb`). Use this to cap session storage per agent; when exceeded, warn mode reports pressure and enforce mode performs oldest-first cleanup.",
+  "session.maintenance.highWaterBytes":
+    "Target size after disk-budget cleanup (high-water mark). Defaults to 80% of maxDiskBytes; set explicitly for tighter reclaim behavior on constrained disks.",
   cron: "Global scheduler settings for stored cron jobs, run concurrency, delivery fallback, and run-session retention. Keep defaults unless you are scaling job volume or integrating external webhook receivers.",
   "cron.enabled":
     "Enables cron job execution for stored schedules managed by the gateway. Keep enabled for normal reminder/automation flows, and disable only to pause all cron execution without deleting jobs.",
@@ -1012,6 +1018,12 @@ export const FIELD_HELP: Record<string, string> = {
     "Bearer token attached to cron webhook POST deliveries when webhook mode is used. Prefer secret/env substitution and rotate this token regularly if shared webhook endpoints are internet-reachable.",
   "cron.sessionRetention":
     "Controls how long completed cron run sessions are kept before pruning (`24h`, `7d`, `1h30m`, or `false` to disable pruning; default: `24h`). Use shorter retention to reduce storage growth on high-frequency schedules.",
+  "cron.runLog":
+    "Pruning controls for per-job cron run history files under `cron/runs/<jobId>.jsonl`, including size and line retention.",
+  "cron.runLog.maxBytes":
+    "Maximum bytes per cron run-log file before pruning rewrites to the last keepLines entries (for example `2mb`, default `2000000`).",
+  "cron.runLog.keepLines":
+    "How many trailing run-log lines to retain when a file exceeds maxBytes (default `2000`). Increase for longer forensic history or lower for smaller disks.",
   hooks:
     "Inbound webhook automation surface for mapping external events into wake or agent actions in OpenClaw. Keep this locked down with explicit token/session/agent controls before exposing it beyond trusted networks.",
   "hooks.enabled":

@@ -471,6 +471,9 @@ export const FIELD_LABELS: Record<string, string> = {
  "session.maintenance.pruneDays": "Session Prune Days (Deprecated)",
  "session.maintenance.maxEntries": "Session Max Entries",
  "session.maintenance.rotateBytes": "Session Rotate Size",
  "session.maintenance.resetArchiveRetention": "Session Reset Archive Retention",
  "session.maintenance.maxDiskBytes": "Session Max Disk Budget",
  "session.maintenance.highWaterBytes": "Session Disk High-water Target",
  cron: "Cron",
  "cron.enabled": "Cron Enabled",
  "cron.store": "Cron Store Path",
@@ -478,6 +481,9 @@ export const FIELD_LABELS: Record<string, string> = {
  "cron.webhook": "Cron Legacy Webhook (Deprecated)",
  "cron.webhookToken": "Cron Webhook Bearer Token",
  "cron.sessionRetention": "Cron Session Retention",
  "cron.runLog": "Cron Run Log Pruning",
  "cron.runLog.maxBytes": "Cron Run Log Max Bytes",
  "cron.runLog.keepLines": "Cron Run Log Keep Lines",
  hooks: "Hooks",
  "hooks.enabled": "Hooks Enabled",
  "hooks.path": "Hooks Endpoint Path",
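The two new `session.maintenance` labels above correspond to a config block along these lines (a hedged sketch; only `maxDiskBytes` and `highWaterBytes` are new in this diff, and the byte sizes shown are illustrative):

```json
{
  "session": {
    "maintenance": {
      "maxEntries": 500,
      "rotateBytes": 10485760,
      "maxDiskBytes": "500mb",
      "highWaterBytes": "400mb"
    }
  }
}
```

When `highWaterBytes` is omitted, the resolver in this same diff defaults it to 80% of `maxDiskBytes` (clamped to at least 1 byte).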
@@ -1,4 +1,5 @@
export * from "./sessions/group.js";
export * from "./sessions/artifacts.js";
export * from "./sessions/metadata.js";
export * from "./sessions/main-session.js";
export * from "./sessions/paths.js";
@@ -9,3 +10,4 @@ export * from "./sessions/types.js";
export * from "./sessions/transcript.js";
export * from "./sessions/session-file.js";
export * from "./sessions/delivery-info.js";
export * from "./sessions/disk-budget.js";
38 src/config/sessions/artifacts.test.ts Normal file
@@ -0,0 +1,38 @@
import { describe, expect, it } from "vitest";
import {
  formatSessionArchiveTimestamp,
  isPrimarySessionTranscriptFileName,
  isSessionArchiveArtifactName,
  parseSessionArchiveTimestamp,
} from "./artifacts.js";

describe("session artifact helpers", () => {
  it("classifies archived artifact file names", () => {
    expect(isSessionArchiveArtifactName("abc.jsonl.deleted.2026-01-01T00-00-00.000Z")).toBe(true);
    expect(isSessionArchiveArtifactName("abc.jsonl.reset.2026-01-01T00-00-00.000Z")).toBe(true);
    expect(isSessionArchiveArtifactName("abc.jsonl.bak.2026-01-01T00-00-00.000Z")).toBe(true);
    expect(isSessionArchiveArtifactName("sessions.json.bak.1737420882")).toBe(true);
    expect(isSessionArchiveArtifactName("keep.deleted.keep.jsonl")).toBe(false);
    expect(isSessionArchiveArtifactName("abc.jsonl")).toBe(false);
  });

  it("classifies primary transcript files", () => {
    expect(isPrimarySessionTranscriptFileName("abc.jsonl")).toBe(true);
    expect(isPrimarySessionTranscriptFileName("keep.deleted.keep.jsonl")).toBe(true);
    expect(isPrimarySessionTranscriptFileName("abc.jsonl.deleted.2026-01-01T00-00-00.000Z")).toBe(
      false,
    );
    expect(isPrimarySessionTranscriptFileName("sessions.json")).toBe(false);
  });

  it("formats and parses archive timestamps", () => {
    const now = Date.parse("2026-02-23T12:34:56.000Z");
    const stamp = formatSessionArchiveTimestamp(now);
    expect(stamp).toBe("2026-02-23T12-34-56.000Z");

    const file = `abc.jsonl.deleted.${stamp}`;
    expect(parseSessionArchiveTimestamp(file, "deleted")).toBe(now);
    expect(parseSessionArchiveTimestamp(file, "reset")).toBeNull();
    expect(parseSessionArchiveTimestamp("keep.deleted.keep.jsonl", "deleted")).toBeNull();
  });
});
67 src/config/sessions/artifacts.ts Normal file
@@ -0,0 +1,67 @@
export type SessionArchiveReason = "bak" | "reset" | "deleted";

const ARCHIVE_TIMESTAMP_RE = /^\d{4}-\d{2}-\d{2}T\d{2}-\d{2}-\d{2}(?:\.\d{3})?Z$/;
const LEGACY_STORE_BACKUP_RE = /^sessions\.json\.bak\.\d+$/;

function hasArchiveSuffix(fileName: string, reason: SessionArchiveReason): boolean {
  const marker = `.${reason}.`;
  const index = fileName.lastIndexOf(marker);
  if (index < 0) {
    return false;
  }
  const raw = fileName.slice(index + marker.length);
  return ARCHIVE_TIMESTAMP_RE.test(raw);
}

export function isSessionArchiveArtifactName(fileName: string): boolean {
  if (LEGACY_STORE_BACKUP_RE.test(fileName)) {
    return true;
  }
  return (
    hasArchiveSuffix(fileName, "deleted") ||
    hasArchiveSuffix(fileName, "reset") ||
    hasArchiveSuffix(fileName, "bak")
  );
}

export function isPrimarySessionTranscriptFileName(fileName: string): boolean {
  if (fileName === "sessions.json") {
    return false;
  }
  if (!fileName.endsWith(".jsonl")) {
    return false;
  }
  return !isSessionArchiveArtifactName(fileName);
}

export function formatSessionArchiveTimestamp(nowMs = Date.now()): string {
  return new Date(nowMs).toISOString().replaceAll(":", "-");
}

function restoreSessionArchiveTimestamp(raw: string): string {
  const [datePart, timePart] = raw.split("T");
  if (!datePart || !timePart) {
    return raw;
  }
  return `${datePart}T${timePart.replace(/-/g, ":")}`;
}

export function parseSessionArchiveTimestamp(
  fileName: string,
  reason: SessionArchiveReason,
): number | null {
  const marker = `.${reason}.`;
  const index = fileName.lastIndexOf(marker);
  if (index < 0) {
    return null;
  }
  const raw = fileName.slice(index + marker.length);
  if (!raw) {
    return null;
  }
  if (!ARCHIVE_TIMESTAMP_RE.test(raw)) {
    return null;
  }
  const timestamp = Date.parse(restoreSessionArchiveTimestamp(raw));
  return Number.isNaN(timestamp) ? null : timestamp;
}
95 src/config/sessions/disk-budget.test.ts Normal file
@@ -0,0 +1,95 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import { formatSessionArchiveTimestamp } from "./artifacts.js";
import { enforceSessionDiskBudget } from "./disk-budget.js";
import type { SessionEntry } from "./types.js";

const createdDirs: string[] = [];

async function createCaseDir(prefix: string): Promise<string> {
  const dir = await fs.mkdtemp(path.join(os.tmpdir(), prefix));
  createdDirs.push(dir);
  return dir;
}

afterEach(async () => {
  await Promise.all(createdDirs.map((dir) => fs.rm(dir, { recursive: true, force: true })));
  createdDirs.length = 0;
});

describe("enforceSessionDiskBudget", () => {
  it("does not treat referenced transcripts with marker-like session IDs as archived artifacts", async () => {
    const dir = await createCaseDir("openclaw-disk-budget-");
    const storePath = path.join(dir, "sessions.json");
    const sessionId = "keep.deleted.keep";
    const activeKey = "agent:main:main";
    const transcriptPath = path.join(dir, `${sessionId}.jsonl`);
    const store: Record<string, SessionEntry> = {
      [activeKey]: {
        sessionId,
        updatedAt: Date.now(),
      },
    };
    await fs.writeFile(storePath, JSON.stringify(store, null, 2), "utf-8");
    await fs.writeFile(transcriptPath, "x".repeat(256), "utf-8");

    const result = await enforceSessionDiskBudget({
      store,
      storePath,
      activeSessionKey: activeKey,
      maintenance: {
        maxDiskBytes: 150,
        highWaterBytes: 100,
      },
      warnOnly: false,
    });

    await expect(fs.stat(transcriptPath)).resolves.toBeDefined();
    expect(result).toEqual(
      expect.objectContaining({
        removedFiles: 0,
      }),
    );
  });

  it("removes true archived transcript artifacts while preserving referenced primary transcripts", async () => {
    const dir = await createCaseDir("openclaw-disk-budget-");
    const storePath = path.join(dir, "sessions.json");
    const sessionId = "keep";
    const transcriptPath = path.join(dir, `${sessionId}.jsonl`);
    const archivePath = path.join(
      dir,
      `old-session.jsonl.deleted.${formatSessionArchiveTimestamp(Date.now() - 24 * 60 * 60 * 1000)}`,
    );
    const store: Record<string, SessionEntry> = {
      "agent:main:main": {
        sessionId,
        updatedAt: Date.now(),
      },
    };
    await fs.writeFile(storePath, JSON.stringify(store, null, 2), "utf-8");
    await fs.writeFile(transcriptPath, "k".repeat(80), "utf-8");
    await fs.writeFile(archivePath, "a".repeat(260), "utf-8");

    const result = await enforceSessionDiskBudget({
      store,
      storePath,
      maintenance: {
        maxDiskBytes: 300,
        highWaterBytes: 220,
      },
      warnOnly: false,
    });

    await expect(fs.stat(transcriptPath)).resolves.toBeDefined();
    await expect(fs.stat(archivePath)).rejects.toThrow();
    expect(result).toEqual(
      expect.objectContaining({
        removedFiles: 1,
        removedEntries: 0,
      }),
    );
  });
});
375 src/config/sessions/disk-budget.ts Normal file
@@ -0,0 +1,375 @@
import fs from "node:fs";
import path from "node:path";
import { isPrimarySessionTranscriptFileName, isSessionArchiveArtifactName } from "./artifacts.js";
import { resolveSessionFilePath } from "./paths.js";
import type { SessionEntry } from "./types.js";

export type SessionDiskBudgetConfig = {
  maxDiskBytes: number | null;
  highWaterBytes: number | null;
};

export type SessionDiskBudgetSweepResult = {
  totalBytesBefore: number;
  totalBytesAfter: number;
  removedFiles: number;
  removedEntries: number;
  freedBytes: number;
  maxBytes: number;
  highWaterBytes: number;
  overBudget: boolean;
};

export type SessionDiskBudgetLogger = {
  warn: (message: string, context?: Record<string, unknown>) => void;
  info: (message: string, context?: Record<string, unknown>) => void;
};

const NOOP_LOGGER: SessionDiskBudgetLogger = {
  warn: () => {},
  info: () => {},
};

type SessionsDirFileStat = {
  path: string;
  canonicalPath: string;
  name: string;
  size: number;
  mtimeMs: number;
};

function canonicalizePathForComparison(filePath: string): string {
  const resolved = path.resolve(filePath);
  try {
    return fs.realpathSync(resolved);
  } catch {
    return resolved;
  }
}

function measureStoreBytes(store: Record<string, SessionEntry>): number {
  return Buffer.byteLength(JSON.stringify(store, null, 2), "utf-8");
}

function measureStoreEntryChunkBytes(key: string, entry: SessionEntry): number {
  const singleEntryStore = JSON.stringify({ [key]: entry }, null, 2);
  if (!singleEntryStore.startsWith("{\n") || !singleEntryStore.endsWith("\n}")) {
    return measureStoreBytes({ [key]: entry }) - 4;
  }
  const chunk = singleEntryStore.slice(2, -2);
  return Buffer.byteLength(chunk, "utf-8");
}

function buildStoreEntryChunkSizeMap(store: Record<string, SessionEntry>): Map<string, number> {
  const out = new Map<string, number>();
  for (const [key, entry] of Object.entries(store)) {
    out.set(key, measureStoreEntryChunkBytes(key, entry));
  }
  return out;
}

function getEntryUpdatedAt(entry?: SessionEntry): number {
  if (!entry) {
    return 0;
  }
  const updatedAt = entry.updatedAt;
  return Number.isFinite(updatedAt) ? updatedAt : 0;
}

function buildSessionIdRefCounts(store: Record<string, SessionEntry>): Map<string, number> {
  const counts = new Map<string, number>();
  for (const entry of Object.values(store)) {
    const sessionId = entry?.sessionId;
    if (!sessionId) {
      continue;
    }
    counts.set(sessionId, (counts.get(sessionId) ?? 0) + 1);
  }
  return counts;
}

function resolveSessionTranscriptPathForEntry(params: {
  sessionsDir: string;
  entry: SessionEntry;
}): string | null {
  if (!params.entry.sessionId) {
    return null;
  }
  try {
    const resolved = resolveSessionFilePath(params.entry.sessionId, params.entry, {
      sessionsDir: params.sessionsDir,
    });
    const resolvedSessionsDir = canonicalizePathForComparison(params.sessionsDir);
    const resolvedPath = canonicalizePathForComparison(resolved);
    const relative = path.relative(resolvedSessionsDir, resolvedPath);
    if (!relative || relative.startsWith("..") || path.isAbsolute(relative)) {
      return null;
    }
    return resolvedPath;
  } catch {
    return null;
  }
}

function resolveReferencedSessionTranscriptPaths(params: {
  sessionsDir: string;
  store: Record<string, SessionEntry>;
}): Set<string> {
  const referenced = new Set<string>();
  for (const entry of Object.values(params.store)) {
    const resolved = resolveSessionTranscriptPathForEntry({
      sessionsDir: params.sessionsDir,
      entry,
    });
    if (resolved) {
      referenced.add(canonicalizePathForComparison(resolved));
    }
  }
  return referenced;
}

async function readSessionsDirFiles(sessionsDir: string): Promise<SessionsDirFileStat[]> {
  const dirEntries = await fs.promises
    .readdir(sessionsDir, { withFileTypes: true })
    .catch(() => []);
  const files: SessionsDirFileStat[] = [];
  for (const dirent of dirEntries) {
    if (!dirent.isFile()) {
      continue;
    }
    const filePath = path.join(sessionsDir, dirent.name);
    const stat = await fs.promises.stat(filePath).catch(() => null);
    if (!stat?.isFile()) {
      continue;
    }
    files.push({
      path: filePath,
      canonicalPath: canonicalizePathForComparison(filePath),
      name: dirent.name,
      size: stat.size,
      mtimeMs: stat.mtimeMs,
    });
  }
  return files;
}

async function removeFileIfExists(filePath: string): Promise<number> {
  const stat = await fs.promises.stat(filePath).catch(() => null);
  if (!stat?.isFile()) {
    return 0;
  }
  await fs.promises.rm(filePath, { force: true }).catch(() => undefined);
  return stat.size;
}

async function removeFileForBudget(params: {
  filePath: string;
  canonicalPath?: string;
  dryRun: boolean;
  fileSizesByPath: Map<string, number>;
  simulatedRemovedPaths: Set<string>;
}): Promise<number> {
  const resolvedPath = path.resolve(params.filePath);
  const canonicalPath = params.canonicalPath ?? canonicalizePathForComparison(resolvedPath);
  if (params.dryRun) {
    if (params.simulatedRemovedPaths.has(canonicalPath)) {
      return 0;
    }
    const size = params.fileSizesByPath.get(canonicalPath) ?? 0;
    if (size <= 0) {
      return 0;
    }
    params.simulatedRemovedPaths.add(canonicalPath);
    return size;
  }
  return removeFileIfExists(resolvedPath);
}

export async function enforceSessionDiskBudget(params: {
  store: Record<string, SessionEntry>;
  storePath: string;
  activeSessionKey?: string;
  maintenance: SessionDiskBudgetConfig;
  warnOnly: boolean;
  dryRun?: boolean;
  log?: SessionDiskBudgetLogger;
}): Promise<SessionDiskBudgetSweepResult | null> {
  const maxBytes = params.maintenance.maxDiskBytes;
  const highWaterBytes = params.maintenance.highWaterBytes;
  if (maxBytes == null || highWaterBytes == null) {
    return null;
  }
  const log = params.log ?? NOOP_LOGGER;
  const dryRun = params.dryRun === true;
  const sessionsDir = path.dirname(params.storePath);
  const files = await readSessionsDirFiles(sessionsDir);
  const fileSizesByPath = new Map(files.map((file) => [file.canonicalPath, file.size]));
  const simulatedRemovedPaths = new Set<string>();
  const resolvedStorePath = canonicalizePathForComparison(params.storePath);
  const storeFile = files.find((file) => file.canonicalPath === resolvedStorePath);
  let projectedStoreBytes = measureStoreBytes(params.store);
  let total =
    files.reduce((sum, file) => sum + file.size, 0) - (storeFile?.size ?? 0) + projectedStoreBytes;
  const totalBefore = total;
  if (total <= maxBytes) {
    return {
      totalBytesBefore: totalBefore,
      totalBytesAfter: total,
      removedFiles: 0,
      removedEntries: 0,
      freedBytes: 0,
      maxBytes,
      highWaterBytes,
      overBudget: false,
    };
  }

  if (params.warnOnly) {
    log.warn("session disk budget exceeded (warn-only mode)", {
      sessionsDir,
      totalBytes: total,
      maxBytes,
      highWaterBytes,
    });
    return {
      totalBytesBefore: totalBefore,
      totalBytesAfter: total,
      removedFiles: 0,
      removedEntries: 0,
      freedBytes: 0,
      maxBytes,
      highWaterBytes,
      overBudget: true,
    };
  }

  let removedFiles = 0;
  let removedEntries = 0;
  let freedBytes = 0;

  const referencedPaths = resolveReferencedSessionTranscriptPaths({
    sessionsDir,
    store: params.store,
  });
  const removableFileQueue = files
    .filter(
      (file) =>
        isSessionArchiveArtifactName(file.name) ||
        (isPrimarySessionTranscriptFileName(file.name) && !referencedPaths.has(file.canonicalPath)),
    )
    .toSorted((a, b) => a.mtimeMs - b.mtimeMs);
  for (const file of removableFileQueue) {
    if (total <= highWaterBytes) {
      break;
    }
    const deletedBytes = await removeFileForBudget({
      filePath: file.path,
      canonicalPath: file.canonicalPath,
      dryRun,
      fileSizesByPath,
      simulatedRemovedPaths,
    });
    if (deletedBytes <= 0) {
      continue;
    }
    total -= deletedBytes;
    freedBytes += deletedBytes;
    removedFiles += 1;
  }

  if (total > highWaterBytes) {
    const activeSessionKey = params.activeSessionKey?.trim().toLowerCase();
    const sessionIdRefCounts = buildSessionIdRefCounts(params.store);
    const entryChunkBytesByKey = buildStoreEntryChunkSizeMap(params.store);
    const keys = Object.keys(params.store).toSorted((a, b) => {
      const aTime = getEntryUpdatedAt(params.store[a]);
      const bTime = getEntryUpdatedAt(params.store[b]);
      return aTime - bTime;
    });
    for (const key of keys) {
      if (total <= highWaterBytes) {
        break;
      }
      if (activeSessionKey && key.trim().toLowerCase() === activeSessionKey) {
        continue;
      }
      const entry = params.store[key];
      if (!entry) {
        continue;
      }
      const previousProjectedBytes = projectedStoreBytes;
      delete params.store[key];
      const chunkBytes = entryChunkBytesByKey.get(key);
      entryChunkBytesByKey.delete(key);
      if (typeof chunkBytes === "number" && Number.isFinite(chunkBytes) && chunkBytes >= 0) {
        // Removing any one pretty-printed top-level entry always removes the entry chunk plus ",\n" (2 bytes).
        projectedStoreBytes = Math.max(2, projectedStoreBytes - (chunkBytes + 2));
      } else {
        projectedStoreBytes = measureStoreBytes(params.store);
      }
      total += projectedStoreBytes - previousProjectedBytes;
      removedEntries += 1;

      const sessionId = entry.sessionId;
      if (!sessionId) {
        continue;
      }
      const nextRefCount = (sessionIdRefCounts.get(sessionId) ?? 1) - 1;
      if (nextRefCount > 0) {
        sessionIdRefCounts.set(sessionId, nextRefCount);
        continue;
      }
      sessionIdRefCounts.delete(sessionId);
      const transcriptPath = resolveSessionTranscriptPathForEntry({ sessionsDir, entry });
      if (!transcriptPath) {
        continue;
      }
      const deletedBytes = await removeFileForBudget({
        filePath: transcriptPath,
        dryRun,
        fileSizesByPath,
        simulatedRemovedPaths,
      });
      if (deletedBytes <= 0) {
        continue;
      }
      total -= deletedBytes;
      freedBytes += deletedBytes;
      removedFiles += 1;
    }
  }

  if (!dryRun) {
    if (total > highWaterBytes) {
      log.warn("session disk budget still above high-water target after cleanup", {
        sessionsDir,
        totalBytes: total,
        maxBytes,
        highWaterBytes,
        removedFiles,
        removedEntries,
      });
    } else if (removedFiles > 0 || removedEntries > 0) {
      log.info("applied session disk budget cleanup", {
        sessionsDir,
        totalBytesBefore: totalBefore,
        totalBytesAfter: total,
        maxBytes,
        highWaterBytes,
        removedFiles,
        removedEntries,
      });
    }
  }

  return {
    totalBytesBefore: totalBefore,
    totalBytesAfter: total,
    removedFiles,
    removedEntries,
    freedBytes,
    maxBytes,
    highWaterBytes,
    overBudget: true,
  };
}
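The first phase of `enforceSessionDiskBudget` above can be summarized with a small self-contained model (the `sweep` helper here is hypothetical, not part of the module): removable files are dropped oldest-first until total usage reaches the high-water target.

```typescript
// Toy model of phase 1 of enforceSessionDiskBudget: oldest-first removal of
// removable files (archived artifacts or unreferenced transcripts) until the
// directory fits under the high-water target.
type FileStat = { name: string; size: number; mtimeMs: number; removable: boolean };

function sweep(files: FileStat[], highWaterBytes: number): { kept: string[]; freedBytes: number } {
  let total = files.reduce((sum, f) => sum + f.size, 0);
  let freedBytes = 0;
  const kept: string[] = [];
  // Oldest files are considered first, mirroring the mtimeMs sort in the diff.
  for (const f of [...files].sort((a, b) => a.mtimeMs - b.mtimeMs)) {
    if (total > highWaterBytes && f.removable) {
      total -= f.size;
      freedBytes += f.size;
    } else {
      kept.push(f.name);
    }
  }
  return { kept, freedBytes };
}

const result = sweep(
  [
    { name: "old.jsonl.deleted.x", size: 100, mtimeMs: 1, removable: true },
    { name: "active.jsonl", size: 100, mtimeMs: 2, removable: false },
  ],
  150,
);
console.log(result); // kept: ["active.jsonl"], freedBytes: 100
```

Only if usage still exceeds the high-water target after this phase does the real implementation move on to evicting store entries (and their now-unreferenced transcripts) oldest-first.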
@@ -159,6 +159,40 @@ describe("Integration: saveSessionStore with pruning", () => {
    await expect(fs.stat(bakArchived)).resolves.toBeDefined();
  });

  it("cleans up reset archives using resetArchiveRetention", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "30d",
          resetArchiveRetention: "3d",
          maxEntries: 500,
          rotateBytes: 10_485_760,
        },
      },
    });

    const now = Date.now();
    const store: Record<string, SessionEntry> = {
      fresh: { sessionId: "fresh-session", updatedAt: now },
    };
    const oldReset = path.join(
      testDir,
      `old-reset.jsonl.reset.${archiveTimestamp(now - 10 * DAY_MS)}`,
    );
    const freshReset = path.join(
      testDir,
      `fresh-reset.jsonl.reset.${archiveTimestamp(now - 1 * DAY_MS)}`,
    );
    await fs.writeFile(oldReset, "old", "utf-8");
    await fs.writeFile(freshReset, "fresh", "utf-8");

    await saveSessionStore(storePath, store);

    await expect(fs.stat(oldReset)).rejects.toThrow();
    await expect(fs.stat(freshReset)).resolves.toBeDefined();
  });

  it("saveSessionStore skips enforcement when maintenance mode is warn", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
@@ -180,4 +214,181 @@ describe("Integration: saveSessionStore with pruning", () => {
    expect(loaded.fresh).toBeDefined();
    expect(Object.keys(loaded)).toHaveLength(2);
  });

  it("archives transcript files for entries evicted by maxEntries capping", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "365d",
          maxEntries: 1,
          rotateBytes: 10_485_760,
        },
      },
    });

    const now = Date.now();
    const oldestSessionId = "oldest-session";
    const newestSessionId = "newest-session";
    const store: Record<string, SessionEntry> = {
      oldest: { sessionId: oldestSessionId, updatedAt: now - DAY_MS },
      newest: { sessionId: newestSessionId, updatedAt: now },
    };
    const oldestTranscript = path.join(testDir, `${oldestSessionId}.jsonl`);
    const newestTranscript = path.join(testDir, `${newestSessionId}.jsonl`);
    await fs.writeFile(oldestTranscript, '{"type":"session"}\n', "utf-8");
    await fs.writeFile(newestTranscript, '{"type":"session"}\n', "utf-8");

    await saveSessionStore(storePath, store);

    const loaded = loadSessionStore(storePath);
    expect(loaded.oldest).toBeUndefined();
    expect(loaded.newest).toBeDefined();
    await expect(fs.stat(oldestTranscript)).rejects.toThrow();
    await expect(fs.stat(newestTranscript)).resolves.toBeDefined();
    const files = await fs.readdir(testDir);
    expect(files.some((name) => name.startsWith(`${oldestSessionId}.jsonl.deleted.`))).toBe(true);
  });

  it("does not archive external transcript paths when capping entries", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "365d",
          maxEntries: 1,
          rotateBytes: 10_485_760,
        },
      },
    });

    const now = Date.now();
    const externalDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-external-cap-"));
    const externalTranscript = path.join(externalDir, "outside.jsonl");
    await fs.writeFile(externalTranscript, "external", "utf-8");
    const store: Record<string, SessionEntry> = {
      oldest: {
        sessionId: "outside",
        sessionFile: externalTranscript,
        updatedAt: now - DAY_MS,
      },
      newest: { sessionId: "inside", updatedAt: now },
    };
    await fs.writeFile(path.join(testDir, "inside.jsonl"), '{"type":"session"}\n', "utf-8");

    try {
      await saveSessionStore(storePath, store);
      const loaded = loadSessionStore(storePath);
      expect(loaded.oldest).toBeUndefined();
      expect(loaded.newest).toBeDefined();
      await expect(fs.stat(externalTranscript)).resolves.toBeDefined();
    } finally {
      await fs.rm(externalDir, { recursive: true, force: true });
    }
  });

  it("enforces maxDiskBytes with oldest-first session eviction", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "365d",
          maxEntries: 100,
          rotateBytes: 10_485_760,
          maxDiskBytes: 900,
          highWaterBytes: 700,
        },
      },
    });

    const now = Date.now();
    const oldSessionId = "old-disk-session";
    const newSessionId = "new-disk-session";
    const store: Record<string, SessionEntry> = {
      old: { sessionId: oldSessionId, updatedAt: now - DAY_MS },
      recent: { sessionId: newSessionId, updatedAt: now },
    };
    await fs.writeFile(path.join(testDir, `${oldSessionId}.jsonl`), "x".repeat(500), "utf-8");
    await fs.writeFile(path.join(testDir, `${newSessionId}.jsonl`), "y".repeat(500), "utf-8");

    await saveSessionStore(storePath, store);

    const loaded = loadSessionStore(storePath);
    expect(Object.keys(loaded).length).toBe(1);
    expect(loaded.recent).toBeDefined();
    await expect(fs.stat(path.join(testDir, `${oldSessionId}.jsonl`))).rejects.toThrow();
    await expect(fs.stat(path.join(testDir, `${newSessionId}.jsonl`))).resolves.toBeDefined();
  });

  it("uses projected sessions.json size to avoid over-eviction", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "365d",
          maxEntries: 100,
          rotateBytes: 10_485_760,
          maxDiskBytes: 900,
          highWaterBytes: 700,
        },
      },
    });

    // Simulate a stale oversized on-disk sessions.json from a previous write.
    await fs.writeFile(storePath, JSON.stringify({ noisy: "x".repeat(10_000) }), "utf-8");

    const now = Date.now();
    const store: Record<string, SessionEntry> = {
      older: { sessionId: "older", updatedAt: now - DAY_MS },
      newer: { sessionId: "newer", updatedAt: now },
    };
    await fs.writeFile(path.join(testDir, "older.jsonl"), "x".repeat(80), "utf-8");
    await fs.writeFile(path.join(testDir, "newer.jsonl"), "y".repeat(80), "utf-8");

    await saveSessionStore(storePath, store);

    const loaded = loadSessionStore(storePath);
    expect(loaded.older).toBeDefined();
    expect(loaded.newer).toBeDefined();
  });

  it("never deletes transcripts outside the agent sessions directory during budget cleanup", async () => {
    mockLoadConfig.mockReturnValue({
      session: {
        maintenance: {
          mode: "enforce",
          pruneAfter: "365d",
          maxEntries: 100,
          rotateBytes: 10_485_760,
          maxDiskBytes: 500,
          highWaterBytes: 300,
        },
      },
    });

    const now = Date.now();
    const externalDir = await fs.mkdtemp(path.join(os.tmpdir(), "openclaw-external-session-"));
    const externalTranscript = path.join(externalDir, "outside.jsonl");
    await fs.writeFile(externalTranscript, "z".repeat(400), "utf-8");

    const store: Record<string, SessionEntry> = {
      older: {
        sessionId: "outside",
        sessionFile: externalTranscript,
        updatedAt: now - DAY_MS,
      },
      newer: {
        sessionId: "inside",
        updatedAt: now,
      },
    };
    await fs.writeFile(path.join(testDir, "inside.jsonl"), "i".repeat(400), "utf-8");

    try {
      await saveSessionStore(storePath, store);
      await expect(fs.stat(externalTranscript)).resolves.toBeDefined();
    } finally {
      await fs.rm(externalDir, { recursive: true, force: true });
    }
  });
});
@@ -20,6 +20,7 @@ import {
import { getFileMtimeMs, isCacheEnabled, resolveCacheTtlMs } from "../cache-utils.js";
import { loadConfig } from "../config.js";
import type { SessionMaintenanceConfig, SessionMaintenanceMode } from "../types.base.js";
import { enforceSessionDiskBudget, type SessionDiskBudgetSweepResult } from "./disk-budget.js";
import { deriveSessionMetaPatch } from "./metadata.js";
import { mergeSessionEntry, type SessionEntry } from "./types.js";

@@ -299,6 +300,7 @@ const DEFAULT_SESSION_PRUNE_AFTER_MS = 30 * 24 * 60 * 60 * 1000;
const DEFAULT_SESSION_MAX_ENTRIES = 500;
const DEFAULT_SESSION_ROTATE_BYTES = 10_485_760; // 10 MB
const DEFAULT_SESSION_MAINTENANCE_MODE: SessionMaintenanceMode = "warn";
const DEFAULT_SESSION_DISK_BUDGET_HIGH_WATER_RATIO = 0.8;

export type SessionMaintenanceWarning = {
  activeSessionKey: string;
@@ -310,11 +312,23 @@ export type SessionMaintenanceWarning = {
  wouldCap: boolean;
};

export type SessionMaintenanceApplyReport = {
  mode: SessionMaintenanceMode;
  beforeCount: number;
  afterCount: number;
  pruned: number;
  capped: number;
  diskBudget: SessionDiskBudgetSweepResult | null;
};

type ResolvedSessionMaintenanceConfig = {
  mode: SessionMaintenanceMode;
  pruneAfterMs: number;
  maxEntries: number;
  rotateBytes: number;
  resetArchiveRetentionMs: number | null;
  maxDiskBytes: number | null;
  highWaterBytes: number | null;
};

function resolvePruneAfterMs(maintenance?: SessionMaintenanceConfig): number {
@@ -341,6 +355,70 @@ function resolveRotateBytes(maintenance?: SessionMaintenanceConfig): number {
  }
}

function resolveResetArchiveRetentionMs(
  maintenance: SessionMaintenanceConfig | undefined,
  pruneAfterMs: number,
): number | null {
  const raw = maintenance?.resetArchiveRetention;
  if (raw === false) {
    return null;
  }
  if (raw === undefined || raw === null || raw === "") {
    return pruneAfterMs;
  }
  try {
    return parseDurationMs(String(raw).trim(), { defaultUnit: "d" });
  } catch {
    return pruneAfterMs;
  }
}

function resolveMaxDiskBytes(maintenance?: SessionMaintenanceConfig): number | null {
  const raw = maintenance?.maxDiskBytes;
  if (raw === undefined || raw === null || raw === "") {
    return null;
  }
  try {
    return parseByteSize(String(raw).trim(), { defaultUnit: "b" });
  } catch {
    return null;
  }
}

function resolveHighWaterBytes(
  maintenance: SessionMaintenanceConfig | undefined,
  maxDiskBytes: number | null,
): number | null {
  const computeDefault = () => {
    if (maxDiskBytes == null) {
      return null;
    }
    if (maxDiskBytes <= 0) {
      return 0;
    }
    return Math.max(
      1,
      Math.min(
        maxDiskBytes,
        Math.floor(maxDiskBytes * DEFAULT_SESSION_DISK_BUDGET_HIGH_WATER_RATIO),
      ),
    );
  };
  if (maxDiskBytes == null) {
    return null;
  }
  const raw = maintenance?.highWaterBytes;
  if (raw === undefined || raw === null || raw === "") {
    return computeDefault();
  }
  try {
    const parsed = parseByteSize(String(raw).trim(), { defaultUnit: "b" });
    return Math.min(parsed, maxDiskBytes);
  } catch {
    return computeDefault();
  }
}

/**
 * Resolve maintenance settings from openclaw.json (`session.maintenance`).
 * Falls back to built-in defaults when config is missing or unset.
@@ -352,11 +430,16 @@ export function resolveMaintenanceConfig(): ResolvedSessionMaintenanceConfig {
  } catch {
    // Config may not be available (e.g. in tests). Use defaults.
  }
  const pruneAfterMs = resolvePruneAfterMs(maintenance);
|
||||
const maxDiskBytes = resolveMaxDiskBytes(maintenance);
|
||||
return {
|
||||
mode: maintenance?.mode ?? DEFAULT_SESSION_MAINTENANCE_MODE,
|
||||
pruneAfterMs: resolvePruneAfterMs(maintenance),
|
||||
pruneAfterMs,
|
||||
maxEntries: maintenance?.maxEntries ?? DEFAULT_SESSION_MAX_ENTRIES,
|
||||
rotateBytes: resolveRotateBytes(maintenance),
|
||||
resetArchiveRetentionMs: resolveResetArchiveRetentionMs(maintenance, pruneAfterMs),
|
||||
maxDiskBytes,
|
||||
highWaterBytes: resolveHighWaterBytes(maintenance, maxDiskBytes),
|
||||
};
|
||||
}
|
||||
|
||||
@@ -439,7 +522,10 @@ export function getActiveSessionMaintenanceWarning(params: {
|
||||
export function capEntryCount(
|
||||
store: Record<string, SessionEntry>,
|
||||
overrideMax?: number,
|
||||
opts: { log?: boolean } = {},
|
||||
opts: {
|
||||
log?: boolean;
|
||||
onCapped?: (params: { key: string; entry: SessionEntry }) => void;
|
||||
} = {},
|
||||
): number {
|
||||
const maxEntries = overrideMax ?? resolveMaintenanceConfig().maxEntries;
|
||||
const keys = Object.keys(store);
|
||||
@@ -456,6 +542,10 @@ export function capEntryCount(
|
||||
|
||||
const toRemove = sorted.slice(maxEntries);
|
||||
for (const key of toRemove) {
|
||||
const entry = store[key];
|
||||
if (entry) {
|
||||
opts.onCapped?.({ key, entry });
|
||||
}
|
||||
delete store[key];
|
||||
}
|
||||
if (opts.log !== false) {
|
||||
@@ -539,6 +629,10 @@ type SaveSessionStoreOptions = {
|
||||
activeSessionKey?: string;
|
||||
/** Optional callback for warn-only maintenance. */
|
||||
onWarn?: (warning: SessionMaintenanceWarning) => void | Promise<void>;
|
||||
/** Optional callback with maintenance stats after a save. */
|
||||
onMaintenanceApplied?: (report: SessionMaintenanceApplyReport) => void | Promise<void>;
|
||||
/** Optional overrides used by maintenance commands. */
|
||||
maintenanceOverride?: Partial<ResolvedSessionMaintenanceConfig>;
|
||||
};
|
||||
|
||||
async function saveSessionStoreUnlocked(
|
||||
@@ -553,8 +647,9 @@ async function saveSessionStoreUnlocked(
|
||||
|
||||
if (!opts?.skipMaintenance) {
|
||||
// Resolve maintenance config once (avoids repeated loadConfig() calls).
|
||||
const maintenance = resolveMaintenanceConfig();
|
||||
const maintenance = { ...resolveMaintenanceConfig(), ...opts?.maintenanceOverride };
|
||||
const shouldWarnOnly = maintenance.mode === "warn";
|
||||
const beforeCount = Object.keys(store).length;
|
||||
|
||||
if (shouldWarnOnly) {
|
||||
const activeSessionKey = opts?.activeSessionKey?.trim();
|
||||
@@ -576,39 +671,96 @@ async function saveSessionStoreUnlocked(
|
||||
await opts?.onWarn?.(warning);
|
||||
}
|
||||
}
|
||||
const diskBudget = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath,
|
||||
activeSessionKey: opts?.activeSessionKey,
|
||||
maintenance,
|
||||
warnOnly: true,
|
||||
log,
|
||||
});
|
||||
await opts?.onMaintenanceApplied?.({
|
||||
mode: maintenance.mode,
|
||||
beforeCount,
|
||||
afterCount: Object.keys(store).length,
|
||||
pruned: 0,
|
||||
capped: 0,
|
||||
diskBudget,
|
||||
});
|
||||
} else {
|
||||
// Prune stale entries and cap total count before serializing.
|
||||
const prunedSessionFiles = new Map<string, string | undefined>();
|
||||
pruneStaleEntries(store, maintenance.pruneAfterMs, {
|
||||
const removedSessionFiles = new Map<string, string | undefined>();
|
||||
const pruned = pruneStaleEntries(store, maintenance.pruneAfterMs, {
|
||||
onPruned: ({ entry }) => {
|
||||
if (!prunedSessionFiles.has(entry.sessionId) || entry.sessionFile) {
|
||||
prunedSessionFiles.set(entry.sessionId, entry.sessionFile);
|
||||
if (!removedSessionFiles.has(entry.sessionId) || entry.sessionFile) {
|
||||
removedSessionFiles.set(entry.sessionId, entry.sessionFile);
|
||||
}
|
||||
},
|
||||
});
|
||||
const capped = capEntryCount(store, maintenance.maxEntries, {
|
||||
onCapped: ({ entry }) => {
|
||||
if (!removedSessionFiles.has(entry.sessionId) || entry.sessionFile) {
|
||||
removedSessionFiles.set(entry.sessionId, entry.sessionFile);
|
||||
}
|
||||
},
|
||||
});
|
||||
capEntryCount(store, maintenance.maxEntries);
|
||||
const archivedDirs = new Set<string>();
|
||||
for (const [sessionId, sessionFile] of prunedSessionFiles) {
|
||||
const referencedSessionIds = new Set(
|
||||
Object.values(store)
|
||||
.map((entry) => entry?.sessionId)
|
||||
.filter((id): id is string => Boolean(id)),
|
||||
);
|
||||
for (const [sessionId, sessionFile] of removedSessionFiles) {
|
||||
if (referencedSessionIds.has(sessionId)) {
|
||||
continue;
|
||||
}
|
||||
const archived = archiveSessionTranscripts({
|
||||
sessionId,
|
||||
storePath,
|
||||
sessionFile,
|
||||
reason: "deleted",
|
||||
restrictToStoreDir: true,
|
||||
});
|
||||
for (const archivedPath of archived) {
|
||||
archivedDirs.add(path.dirname(archivedPath));
|
||||
}
|
||||
}
|
||||
if (archivedDirs.size > 0) {
|
||||
if (archivedDirs.size > 0 || maintenance.resetArchiveRetentionMs != null) {
|
||||
const targetDirs =
|
||||
archivedDirs.size > 0 ? [...archivedDirs] : [path.dirname(path.resolve(storePath))];
|
||||
await cleanupArchivedSessionTranscripts({
|
||||
directories: [...archivedDirs],
|
||||
directories: targetDirs,
|
||||
olderThanMs: maintenance.pruneAfterMs,
|
||||
reason: "deleted",
|
||||
});
|
||||
if (maintenance.resetArchiveRetentionMs != null) {
|
||||
await cleanupArchivedSessionTranscripts({
|
||||
directories: targetDirs,
|
||||
olderThanMs: maintenance.resetArchiveRetentionMs,
|
||||
reason: "reset",
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Rotate the on-disk file if it exceeds the size threshold.
|
||||
await rotateSessionFile(storePath, maintenance.rotateBytes);
|
||||
|
||||
const diskBudget = await enforceSessionDiskBudget({
|
||||
store,
|
||||
storePath,
|
||||
activeSessionKey: opts?.activeSessionKey,
|
||||
maintenance,
|
||||
warnOnly: false,
|
||||
log,
|
||||
});
|
||||
await opts?.onMaintenanceApplied?.({
|
||||
mode: maintenance.mode,
|
||||
beforeCount,
|
||||
afterCount: Object.keys(store).length,
|
||||
pruned,
|
||||
capped,
|
||||
diskBudget,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
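The disk-budget default above (a high-water mark at 80% of `maxDiskBytes`, clamped into `[1, maxDiskBytes]`) can be sketched in isolation. This is an illustrative re-statement of `resolveHighWaterBytes`'s default path, not the repo's code verbatim:

```typescript
// Sketch of the default high-water mark used when `highWaterBytes` is unset:
// 80% of the disk budget, clamped so a positive budget always yields >= 1 byte.
const HIGH_WATER_RATIO = 0.8;

function defaultHighWaterBytes(maxDiskBytes: number | null): number | null {
  if (maxDiskBytes == null) return null; // no budget configured -> no high-water mark
  if (maxDiskBytes <= 0) return 0; // degenerate budget -> clean down to zero
  return Math.max(
    1,
    Math.min(maxDiskBytes, Math.floor(maxDiskBytes * HIGH_WATER_RATIO)),
  );
}

// 500 MiB budget -> cleanup targets 400 MiB.
console.log(defaultHighWaterBytes(500 * 1024 * 1024)); // 419430400
```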
@@ -137,6 +137,21 @@ export type SessionMaintenanceConfig = {
   maxEntries?: number;
   /** Rotate sessions.json when it exceeds this size (e.g. "10mb"). Default: 10mb. */
   rotateBytes?: number | string;
+  /**
+   * Retention for archived reset transcripts (`*.reset.<timestamp>`).
+   * Set `false` to disable reset-archive cleanup. Default: same as `pruneAfter` (30d).
+   */
+  resetArchiveRetention?: string | number | false;
+  /**
+   * Optional per-agent sessions-directory disk budget (e.g. "500mb").
+   * When exceeded, warn (mode=warn) or enforce oldest-first cleanup (mode=enforce).
+   */
+  maxDiskBytes?: number | string;
+  /**
+   * Target size after disk-budget cleanup (high-water mark), e.g. "400mb".
+   * Default: 80% of maxDiskBytes.
+   */
+  highWaterBytes?: number | string;
 };

 export type LoggingConfig = {
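Put together, the new `session.maintenance` fields could be set in `openclaw.json` roughly as follows (an illustrative fragment; the values shown are examples, not defaults):

```json
{
  "session": {
    "maintenance": {
      "mode": "enforce",
      "maxEntries": 500,
      "rotateBytes": "10mb",
      "resetArchiveRetention": "14d",
      "maxDiskBytes": "500mb",
      "highWaterBytes": "400mb"
    }
  }
}
```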
@@ -15,4 +15,12 @@ export type CronConfig = {
   * Default: "24h".
   */
  sessionRetention?: string | false;
+  /**
+   * Run-log pruning controls for `cron/runs/<jobId>.jsonl`.
+   * Defaults: `maxBytes=2_000_000`, `keepLines=2000`.
+   */
+  runLog?: {
+    maxBytes?: number | string;
+    keepLines?: number;
+  };
 };
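The cron-side knobs above would sit next to `sessionRetention` in `openclaw.json`, e.g. (values illustrative; `"24h"` and the run-log numbers shown match the documented defaults):

```json
{
  "cron": {
    "sessionRetention": "24h",
    "runLog": {
      "maxBytes": "2mb",
      "keepLines": 2000
    }
  }
}
```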
src/config/zod-schema.cron-retention.test.ts (new file, 40 lines)
@@ -0,0 +1,40 @@
+import { describe, expect, it } from "vitest";
+import { OpenClawSchema } from "./zod-schema.js";
+
+describe("OpenClawSchema cron retention and run-log validation", () => {
+  it("accepts valid cron.sessionRetention and runLog values", () => {
+    expect(() =>
+      OpenClawSchema.parse({
+        cron: {
+          sessionRetention: "1h30m",
+          runLog: {
+            maxBytes: "5mb",
+            keepLines: 2500,
+          },
+        },
+      }),
+    ).not.toThrow();
+  });
+
+  it("rejects invalid cron.sessionRetention", () => {
+    expect(() =>
+      OpenClawSchema.parse({
+        cron: {
+          sessionRetention: "abc",
+        },
+      }),
+    ).toThrow(/sessionRetention|duration/i);
+  });
+
+  it("rejects invalid cron.runLog.maxBytes", () => {
+    expect(() =>
+      OpenClawSchema.parse({
+        cron: {
+          runLog: {
+            maxBytes: "wat",
+          },
+        },
+      }),
+    ).toThrow(/runLog|maxBytes|size/i);
+  });
+});
src/config/zod-schema.session-maintenance-extensions.test.ts (new file, 44 lines)
@@ -0,0 +1,44 @@
+import { describe, expect, it } from "vitest";
+import { SessionSchema } from "./zod-schema.session.js";
+
+describe("SessionSchema maintenance extensions", () => {
+  it("accepts valid maintenance extensions", () => {
+    expect(() =>
+      SessionSchema.parse({
+        maintenance: {
+          resetArchiveRetention: "14d",
+          maxDiskBytes: "500mb",
+          highWaterBytes: "350mb",
+        },
+      }),
+    ).not.toThrow();
+  });
+
+  it("accepts disabling reset archive cleanup", () => {
+    expect(() =>
+      SessionSchema.parse({
+        maintenance: {
+          resetArchiveRetention: false,
+        },
+      }),
+    ).not.toThrow();
+  });
+
+  it("rejects invalid maintenance extension values", () => {
+    expect(() =>
+      SessionSchema.parse({
+        maintenance: {
+          resetArchiveRetention: "never",
+        },
+      }),
+    ).toThrow(/resetArchiveRetention|duration/i);
+
+    expect(() =>
+      SessionSchema.parse({
+        maintenance: {
+          maxDiskBytes: "big",
+        },
+      }),
+    ).toThrow(/maxDiskBytes|size/i);
+  });
+});
@@ -75,6 +75,9 @@ export const SessionSchema = z
       pruneDays: z.number().int().positive().optional(),
       maxEntries: z.number().int().positive().optional(),
       rotateBytes: z.union([z.string(), z.number()]).optional(),
+      resetArchiveRetention: z.union([z.string(), z.number(), z.literal(false)]).optional(),
+      maxDiskBytes: z.union([z.string(), z.number()]).optional(),
+      highWaterBytes: z.union([z.string(), z.number()]).optional(),
     })
     .strict()
     .superRefine((val, ctx) => {
@@ -100,6 +103,39 @@ export const SessionSchema = z
           });
         }
       }
+      if (val.resetArchiveRetention !== undefined && val.resetArchiveRetention !== false) {
+        try {
+          parseDurationMs(String(val.resetArchiveRetention).trim(), { defaultUnit: "d" });
+        } catch {
+          ctx.addIssue({
+            code: z.ZodIssueCode.custom,
+            path: ["resetArchiveRetention"],
+            message: "invalid duration (use ms, s, m, h, d)",
+          });
+        }
+      }
+      if (val.maxDiskBytes !== undefined) {
+        try {
+          parseByteSize(String(val.maxDiskBytes).trim(), { defaultUnit: "b" });
+        } catch {
+          ctx.addIssue({
+            code: z.ZodIssueCode.custom,
+            path: ["maxDiskBytes"],
+            message: "invalid size (use b, kb, mb, gb, tb)",
+          });
+        }
+      }
+      if (val.highWaterBytes !== undefined) {
+        try {
+          parseByteSize(String(val.highWaterBytes).trim(), { defaultUnit: "b" });
+        } catch {
+          ctx.addIssue({
+            code: z.ZodIssueCode.custom,
+            path: ["highWaterBytes"],
+            message: "invalid size (use b, kb, mb, gb, tb)",
+          });
+        }
+      }
     })
     .optional(),
 })
@@ -1,4 +1,6 @@
 import { z } from "zod";
+import { parseByteSize } from "../cli/parse-bytes.js";
+import { parseDurationMs } from "../cli/parse-duration.js";
 import { ToolsSchema } from "./zod-schema.agent-runtime.js";
 import { AgentsSchema, AudioSchema, BindingsSchema, BroadcastSchema } from "./zod-schema.agents.js";
 import { ApprovalsSchema } from "./zod-schema.approvals.js";
@@ -324,8 +326,39 @@ export const OpenClawSchema = z
       webhook: HttpUrlSchema.optional(),
       webhookToken: z.string().optional().register(sensitive),
       sessionRetention: z.union([z.string(), z.literal(false)]).optional(),
+      runLog: z
+        .object({
+          maxBytes: z.union([z.string(), z.number()]).optional(),
+          keepLines: z.number().int().positive().optional(),
+        })
+        .strict()
+        .optional(),
     })
     .strict()
+    .superRefine((val, ctx) => {
+      if (val.sessionRetention !== undefined && val.sessionRetention !== false) {
+        try {
+          parseDurationMs(String(val.sessionRetention).trim(), { defaultUnit: "h" });
+        } catch {
+          ctx.addIssue({
+            code: z.ZodIssueCode.custom,
+            path: ["sessionRetention"],
+            message: "invalid duration (use ms, s, m, h, d)",
+          });
+        }
+      }
+      if (val.runLog?.maxBytes !== undefined) {
+        try {
+          parseByteSize(String(val.runLog.maxBytes).trim(), { defaultUnit: "b" });
+        } catch {
+          ctx.addIssue({
+            code: z.ZodIssueCode.custom,
+            path: ["runLog", "maxBytes"],
+            message: "invalid size (use b, kb, mb, gb, tb)",
+          });
+        }
+      }
+    })
     .optional(),
   hooks: z
     .object({
@@ -4,12 +4,40 @@ import path from "node:path";
 import { describe, expect, it } from "vitest";
 import {
   appendCronRunLog,
+  DEFAULT_CRON_RUN_LOG_KEEP_LINES,
+  DEFAULT_CRON_RUN_LOG_MAX_BYTES,
   getPendingCronRunLogWriteCountForTests,
   readCronRunLogEntries,
+  resolveCronRunLogPruneOptions,
   resolveCronRunLogPath,
 } from "./run-log.js";

 describe("cron run log", () => {
+  it("resolves prune options from config with defaults", () => {
+    expect(resolveCronRunLogPruneOptions()).toEqual({
+      maxBytes: DEFAULT_CRON_RUN_LOG_MAX_BYTES,
+      keepLines: DEFAULT_CRON_RUN_LOG_KEEP_LINES,
+    });
+    expect(
+      resolveCronRunLogPruneOptions({
+        maxBytes: "5mb",
+        keepLines: 123,
+      }),
+    ).toEqual({
+      maxBytes: 5 * 1024 * 1024,
+      keepLines: 123,
+    });
+    expect(
+      resolveCronRunLogPruneOptions({
+        maxBytes: "invalid",
+        keepLines: -1,
+      }),
+    ).toEqual({
+      maxBytes: DEFAULT_CRON_RUN_LOG_MAX_BYTES,
+      keepLines: DEFAULT_CRON_RUN_LOG_KEEP_LINES,
+    });
+  });
+
 async function withRunLogDir(prefix: string, run: (dir: string) => Promise<void>) {
   const dir = await fs.mkdtemp(path.join(os.tmpdir(), prefix));
   try {
@@ -1,5 +1,7 @@
 import fs from "node:fs/promises";
 import path from "node:path";
+import { parseByteSize } from "../cli/parse-bytes.js";
+import type { CronConfig } from "../config/types.cron.js";
 import type { CronDeliveryStatus, CronRunStatus, CronRunTelemetry } from "./types.js";

 export type CronRunLogEntry = {
@@ -73,6 +75,30 @@ export function resolveCronRunLogPath(params: { storePath: string; jobId: string
 const writesByPath = new Map<string, Promise<void>>();

+export const DEFAULT_CRON_RUN_LOG_MAX_BYTES = 2_000_000;
+export const DEFAULT_CRON_RUN_LOG_KEEP_LINES = 2_000;
+
+export function resolveCronRunLogPruneOptions(cfg?: CronConfig["runLog"]): {
+  maxBytes: number;
+  keepLines: number;
+} {
+  let maxBytes = DEFAULT_CRON_RUN_LOG_MAX_BYTES;
+  if (cfg?.maxBytes !== undefined) {
+    try {
+      maxBytes = parseByteSize(String(cfg.maxBytes).trim(), { defaultUnit: "b" });
+    } catch {
+      maxBytes = DEFAULT_CRON_RUN_LOG_MAX_BYTES;
+    }
+  }
+
+  let keepLines = DEFAULT_CRON_RUN_LOG_KEEP_LINES;
+  if (typeof cfg?.keepLines === "number" && Number.isFinite(cfg.keepLines) && cfg.keepLines > 0) {
+    keepLines = Math.floor(cfg.keepLines);
+  }
+
+  return { maxBytes, keepLines };
+}
+
 export function getPendingCronRunLogWriteCountForTests() {
   return writesByPath.size;
 }
@@ -108,8 +134,8 @@ export async function appendCronRunLog(
     await fs.mkdir(path.dirname(resolved), { recursive: true });
     await fs.appendFile(resolved, `${JSON.stringify(entry)}\n`, "utf-8");
     await pruneIfNeeded(resolved, {
-      maxBytes: opts?.maxBytes ?? 2_000_000,
-      keepLines: opts?.keepLines ?? 2_000,
+      maxBytes: opts?.maxBytes ?? DEFAULT_CRON_RUN_LOG_MAX_BYTES,
+      keepLines: opts?.keepLines ?? DEFAULT_CRON_RUN_LOG_KEEP_LINES,
     });
   });
   writesByPath.set(resolved, next);
@@ -109,6 +109,61 @@ describe("sweepCronRunSessions", () => {
     expect(updated["agent:main:telegram:dm:123"]).toBeDefined();
   });

+  it("archives transcript files for pruned run sessions that are no longer referenced", async () => {
+    const now = Date.now();
+    const runSessionId = "old-run";
+    const runTranscript = path.join(tmpDir, `${runSessionId}.jsonl`);
+    fs.writeFileSync(runTranscript, '{"type":"session"}\n');
+    const store: Record<string, { sessionId: string; updatedAt: number }> = {
+      "agent:main:cron:job1:run:old-run": {
+        sessionId: runSessionId,
+        updatedAt: now - 25 * 3_600_000,
+      },
+    };
+    fs.writeFileSync(storePath, JSON.stringify(store));
+
+    const result = await sweepCronRunSessions({
+      sessionStorePath: storePath,
+      nowMs: now,
+      log,
+      force: true,
+    });
+
+    expect(result.pruned).toBe(1);
+    expect(fs.existsSync(runTranscript)).toBe(false);
+    const files = fs.readdirSync(tmpDir);
+    expect(files.some((name) => name.startsWith(`${runSessionId}.jsonl.deleted.`))).toBe(true);
+  });
+
+  it("does not archive external transcript paths for pruned runs", async () => {
+    const now = Date.now();
+    const externalDir = fs.mkdtempSync(path.join(os.tmpdir(), "cron-reaper-external-"));
+    const externalTranscript = path.join(externalDir, "outside.jsonl");
+    fs.writeFileSync(externalTranscript, '{"type":"session"}\n');
+    const store: Record<string, { sessionId: string; sessionFile?: string; updatedAt: number }> = {
+      "agent:main:cron:job1:run:old-run": {
+        sessionId: "old-run",
+        sessionFile: externalTranscript,
+        updatedAt: now - 25 * 3_600_000,
+      },
+    };
+    fs.writeFileSync(storePath, JSON.stringify(store));
+
+    try {
+      const result = await sweepCronRunSessions({
+        sessionStorePath: storePath,
+        nowMs: now,
+        log,
+        force: true,
+      });
+
+      expect(result.pruned).toBe(1);
+      expect(fs.existsSync(externalTranscript)).toBe(true);
+    } finally {
+      fs.rmSync(externalDir, { recursive: true, force: true });
+    }
+  });
+
   it("respects custom retention", async () => {
     const now = Date.now();
     const store: Record<string, { sessionId: string; updatedAt: number }> = {
@@ -6,9 +6,14 @@
  * run records. The base session (`...:cron:<jobId>`) is kept as-is.
  */

+import path from "node:path";
 import { parseDurationMs } from "../cli/parse-duration.js";
-import { updateSessionStore } from "../config/sessions.js";
+import { loadSessionStore, updateSessionStore } from "../config/sessions.js";
 import type { CronConfig } from "../config/types.cron.js";
+import {
+  archiveSessionTranscripts,
+  cleanupArchivedSessionTranscripts,
+} from "../gateway/session-utils.fs.js";
 import { isCronRunSessionKey } from "../sessions/session-key-utils.js";
 import type { Logger } from "./service/state.js";

@@ -74,6 +79,7 @@ export async function sweepCronRunSessions(params: {
   }

   let pruned = 0;
+  const prunedSessions = new Map<string, string | undefined>();
   try {
     await updateSessionStore(storePath, (store) => {
       const cutoff = now - retentionMs;
@@ -87,6 +93,9 @@ export async function sweepCronRunSessions(params: {
         }
         const updatedAt = entry.updatedAt ?? 0;
         if (updatedAt < cutoff) {
+          if (!prunedSessions.has(entry.sessionId) || entry.sessionFile) {
+            prunedSessions.set(entry.sessionId, entry.sessionFile);
+          }
           delete store[key];
           pruned++;
         }
@@ -99,6 +108,43 @@ export async function sweepCronRunSessions(params: {

   lastSweepAtMsByStore.set(storePath, now);

+  if (prunedSessions.size > 0) {
+    try {
+      const store = loadSessionStore(storePath, { skipCache: true });
+      const referencedSessionIds = new Set(
+        Object.values(store)
+          .map((entry) => entry?.sessionId)
+          .filter((id): id is string => Boolean(id)),
+      );
+      const archivedDirs = new Set<string>();
+      for (const [sessionId, sessionFile] of prunedSessions) {
+        if (referencedSessionIds.has(sessionId)) {
+          continue;
+        }
+        const archived = archiveSessionTranscripts({
+          sessionId,
+          storePath,
+          sessionFile,
+          reason: "deleted",
+          restrictToStoreDir: true,
+        });
+        for (const archivedPath of archived) {
+          archivedDirs.add(path.dirname(archivedPath));
+        }
+      }
+      if (archivedDirs.size > 0) {
+        await cleanupArchivedSessionTranscripts({
+          directories: [...archivedDirs],
+          olderThanMs: retentionMs,
+          reason: "deleted",
+          nowMs: now,
+        });
+      }
+    } catch (err) {
+      params.log.warn({ err: String(err) }, "cron-reaper: transcript cleanup failed");
+    }
+  }
+
   if (pruned > 0) {
     params.log.info(
       { pruned, retentionMs },
@@ -8,7 +8,11 @@ import {
 } from "../config/sessions.js";
 import { resolveStorePath } from "../config/sessions/paths.js";
 import { runCronIsolatedAgentTurn } from "../cron/isolated-agent.js";
-import { appendCronRunLog, resolveCronRunLogPath } from "../cron/run-log.js";
+import {
+  appendCronRunLog,
+  resolveCronRunLogPath,
+  resolveCronRunLogPruneOptions,
+} from "../cron/run-log.js";
 import { CronService } from "../cron/service.js";
 import { resolveCronStorePath } from "../cron/store.js";
 import { normalizeHttpWebhookUrl } from "../cron/webhook-url.js";
@@ -144,6 +148,7 @@ export function buildGatewayCronService(params: {
   };

   const defaultAgentId = resolveDefaultAgentId(params.cfg);
+  const runLogPrune = resolveCronRunLogPruneOptions(params.cfg.cron?.runLog);
   const resolveSessionStorePath = (agentId?: string) =>
     resolveStorePath(params.cfg.session?.store, {
       agentId: agentId ?? defaultAgentId,
@@ -289,25 +294,29 @@ export function buildGatewayCronService(params: {
       storePath,
       jobId: evt.jobId,
     });
-    void appendCronRunLog(logPath, {
-      ts: Date.now(),
-      jobId: evt.jobId,
-      action: "finished",
-      status: evt.status,
-      error: evt.error,
-      summary: evt.summary,
-      delivered: evt.delivered,
-      deliveryStatus: evt.deliveryStatus,
-      deliveryError: evt.deliveryError,
-      sessionId: evt.sessionId,
-      sessionKey: evt.sessionKey,
-      runAtMs: evt.runAtMs,
-      durationMs: evt.durationMs,
-      nextRunAtMs: evt.nextRunAtMs,
-      model: evt.model,
-      provider: evt.provider,
-      usage: evt.usage,
-    }).catch((err) => {
+    void appendCronRunLog(
+      logPath,
+      {
+        ts: Date.now(),
+        jobId: evt.jobId,
+        action: "finished",
+        status: evt.status,
+        error: evt.error,
+        summary: evt.summary,
+        delivered: evt.delivered,
+        deliveryStatus: evt.deliveryStatus,
+        deliveryError: evt.deliveryError,
+        sessionId: evt.sessionId,
+        sessionKey: evt.sessionKey,
+        runAtMs: evt.runAtMs,
+        durationMs: evt.durationMs,
+        nextRunAtMs: evt.nextRunAtMs,
+        model: evt.model,
+        provider: evt.provider,
+        usage: evt.usage,
+      },
+      runLogPrune,
+    ).catch((err) => {
       cronLogger.warn({ err: String(err), logPath }, "cron: run log append failed");
     });
   }
@@ -2,6 +2,9 @@ import fs from "node:fs";
 import os from "node:os";
 import path from "node:path";
 import {
+  formatSessionArchiveTimestamp,
+  parseSessionArchiveTimestamp,
+  type SessionArchiveReason,
   resolveSessionFilePath,
   resolveSessionTranscriptPath,
   resolveSessionTranscriptPathInDir,
@@ -159,10 +162,19 @@ export function resolveSessionTranscriptCandidates(
   return Array.from(new Set(candidates));
 }

-export type ArchiveFileReason = "bak" | "reset" | "deleted";
+export type ArchiveFileReason = SessionArchiveReason;
+
+function canonicalizePathForComparison(filePath: string): string {
+  const resolved = path.resolve(filePath);
+  try {
+    return fs.realpathSync(resolved);
+  } catch {
+    return resolved;
+  }
+}

 export function archiveFileOnDisk(filePath: string, reason: ArchiveFileReason): string {
-  const ts = new Date().toISOString().replaceAll(":", "-");
+  const ts = formatSessionArchiveTimestamp();
   const archived = `${filePath}.${reason}.${ts}`;
   fs.renameSync(filePath, archived);
   return archived;
@@ -178,19 +190,35 @@ export function archiveSessionTranscripts(opts: {
   sessionFile?: string;
   agentId?: string;
   reason: "reset" | "deleted";
+  /**
+   * When true, only archive files resolved under the session store directory.
+   * This prevents maintenance operations from mutating paths outside the agent sessions dir.
+   */
+  restrictToStoreDir?: boolean;
 }): string[] {
   const archived: string[] = [];
+  const storeDir =
+    opts.restrictToStoreDir && opts.storePath
+      ? canonicalizePathForComparison(path.dirname(opts.storePath))
+      : null;
   for (const candidate of resolveSessionTranscriptCandidates(
     opts.sessionId,
     opts.storePath,
     opts.sessionFile,
     opts.agentId,
   )) {
-    if (!fs.existsSync(candidate)) {
+    const candidatePath = canonicalizePathForComparison(candidate);
+    if (storeDir) {
+      const relative = path.relative(storeDir, candidatePath);
+      if (!relative || relative.startsWith("..") || path.isAbsolute(relative)) {
+        continue;
+      }
+    }
+    if (!fs.existsSync(candidatePath)) {
       continue;
     }
     try {
-      archived.push(archiveFileOnDisk(candidate, opts.reason));
+      archived.push(archiveFileOnDisk(candidatePath, opts.reason));
     } catch {
       // Best-effort.
     }
@@ -198,32 +226,10 @@ export function archiveSessionTranscripts(opts: {
   return archived;
 }

-function restoreArchiveTimestamp(raw: string): string {
-  const [datePart, timePart] = raw.split("T");
-  if (!datePart || !timePart) {
-    return raw;
-  }
-  return `${datePart}T${timePart.replace(/-/g, ":")}`;
-}
-
-function parseArchivedTimestamp(fileName: string, reason: ArchiveFileReason): number | null {
-  const marker = `.${reason}.`;
-  const index = fileName.lastIndexOf(marker);
-  if (index < 0) {
-    return null;
-  }
-  const raw = fileName.slice(index + marker.length);
-  if (!raw) {
-    return null;
-  }
-  const timestamp = Date.parse(restoreArchiveTimestamp(raw));
-  return Number.isNaN(timestamp) ? null : timestamp;
-}
-
 export async function cleanupArchivedSessionTranscripts(opts: {
   directories: string[];
   olderThanMs: number;
-  reason?: "deleted";
+  reason?: ArchiveFileReason;
   nowMs?: number;
 }): Promise<{ removed: number; scanned: number }> {
   if (!Number.isFinite(opts.olderThanMs) || opts.olderThanMs < 0) {
@@ -238,7 +244,7 @@ export async function cleanupArchivedSessionTranscripts(opts: {
   for (const dir of directories) {
     const entries = await fs.promises.readdir(dir).catch(() => []);
     for (const entry of entries) {
-      const timestamp = parseArchivedTimestamp(entry, reason);
+      const timestamp = parseSessionArchiveTimestamp(entry, reason);
       if (timestamp == null) {
         continue;
       }
Block a user