fix(ollama): remove Ollama from isReasoningTagProvider (#2279)

Ollama's OpenAI-compatible endpoint handles reasoning natively via the
`reasoning` field in streaming chunks. Treating Ollama as a
reasoning-tag provider incorrectly forces <think>/<final> tag
enforcement, which causes stripBlockTags() to discard all output
(since Ollama models don't emit <final> tags), resulting in
'(no output)' for every Ollama model.

This fix removes 'ollama' from the isReasoningTagProvider() check,
allowing Ollama models to work correctly through the standard
content/reasoning field separation.
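The content/reasoning split described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the `StreamDelta` shape and the `splitDelta` helper are hypothetical names, standing in for how an OpenAI-compatible streaming chunk carries visible text in `content` and model reasoning in a separate `reasoning` field, so no `<think>`/`<final>` tag parsing is needed.

```typescript
// Hypothetical sketch of the field-based separation (names are illustrative).
// An OpenAI-compatible streaming delta from Ollama can carry both fields:
interface StreamDelta {
  content?: string;   // user-visible output
  reasoning?: string; // native reasoning stream; no tag enforcement required
}

// Route each field to its own channel instead of scanning for block tags.
function splitDelta(delta: StreamDelta): { text: string; thought: string } {
  return {
    text: delta.content ?? "",
    thought: delta.reasoning ?? "",
  };
}
```

Because the separation is already explicit in the chunk, running tag enforcement on top of it is what discarded the output in the bug above.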
Author: Glucksberg
Date: 2026-02-14 12:49:01 +00:00
Committed by: Peter Steinberger
Parent: c76288bdf1
Commit: 74193ff754
2 changed files with 45 additions and 6 deletions

@@ -13,12 +13,12 @@ export function isReasoningTagProvider(provider: string | undefined | null): boo
   }
   const normalized = provider.trim().toLowerCase();
-  // Check for exact matches or known prefixes/substrings for reasoning providers
-  if (
-    normalized === "ollama" ||
-    normalized === "google-gemini-cli" ||
-    normalized === "google-generative-ai"
-  ) {
+  // Check for exact matches or known prefixes/substrings for reasoning providers.
+  // Note: Ollama is intentionally excluded — its OpenAI-compatible endpoint
+  // handles reasoning natively via the `reasoning` field in streaming chunks,
+  // so tag-based enforcement is unnecessary and causes all output to be
+  // discarded as "(no output)" (#2279).
+  if (normalized === "google-gemini-cli" || normalized === "google-generative-ai") {
     return true;
   }
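Condensed to a standalone function, the post-fix behavior of the check looks like this (a simplified sketch of the diffed function with the early-return guard assumed from the hunk context; the surrounding file may differ):

```typescript
// Post-fix check: only the Gemini providers are treated as reasoning-tag
// providers; "ollama" now falls through and returns false.
function isReasoningTagProvider(provider: string | undefined | null): boolean {
  if (!provider) {
    return false;
  }
  const normalized = provider.trim().toLowerCase();
  return normalized === "google-gemini-cli" || normalized === "google-generative-ai";
}
```

With this change, `isReasoningTagProvider("ollama")` returns `false`, so `stripBlockTags()` is never applied to Ollama output and the "(no output)" failure mode disappears.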