fix(usage): clamp negative input token counts to zero

Some OpenAI-format providers (via pi-ai) pre-subtract cached_tokens from
prompt_tokens upstream. When cached_tokens exceeds prompt_tokens due to
provider inconsistencies, the subtraction produces a negative input value
that flows through to the TUI status bar and /usage dashboard.

Clamp rawInput to 0 in normalizeUsage() so downstream consumers never
see nonsensical negative token counts.
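A minimal sketch of the clamp described above (the field names follow the test in this diff, but the real `normalizeUsage()` signature and return shape are assumptions here):

```typescript
// Hypothetical sketch of normalizeUsage() with the clamp applied.
// Field names mirror the test below; the actual implementation may differ.
interface Usage {
  input?: number;
  output?: number;
  cacheRead?: number;
  cacheWrite?: number;
  total?: number;
}

function normalizeUsage(raw: Usage | null | undefined): Usage | undefined {
  if (!raw) return undefined;
  const rawInput = raw.input;
  return {
    // Providers that pre-subtract cached_tokens can report input < 0;
    // clamp so downstream consumers never see a negative count.
    input: rawInput !== undefined ? Math.max(0, rawInput) : undefined,
    output: raw.output,
    cacheRead: raw.cacheRead,
    cacheWrite: raw.cacheWrite,
    total: raw.total,
  };
}
```

With the clamp in place, an input of -4900 (as in the test case below) normalizes to 0 while the other fields pass through unchanged.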

Closes #30765

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: scoootscooob
Date: 2026-03-01 19:07:14 -08:00
Committed-by: Peter Steinberger
Parent: 08c35eb13f
Commit: 20467d987d
2 changed files with 22 additions and 1 deletion


@@ -88,6 +88,23 @@ describe("normalizeUsage", () => {
     });
   });
+  it("clamps negative input to zero (pre-subtracted cached_tokens > prompt_tokens)", () => {
+    // pi-ai OpenAI-format providers subtract cached_tokens from prompt_tokens
+    // upstream. When cached_tokens exceeds prompt_tokens the result is negative.
+    const usage = normalizeUsage({
+      input: -4900,
+      output: 200,
+      cacheRead: 5000,
+    });
+    expect(usage).toEqual({
+      input: 0,
+      output: 200,
+      cacheRead: 5000,
+      cacheWrite: undefined,
+      total: undefined,
+    });
+  });
   it("returns undefined when no valid fields are provided", () => {
     const usage = normalizeUsage(null);
     expect(usage).toBeUndefined();