fix: cap Anthropic max_tokens to 16384 for Pass 0b batches

The previous formula, max(8192, batch_size * 1500), exceeded Claude's
16,384-token output limit once batch_size > 10, causing API failures and a
fallback to Ollama.
New formula: min(16384, max(4096, batch_size * 500))
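A minimal sketch of the new clamp (the helper name `cap_max_tokens` is illustrative, not from the patch): the per-batch output budget gets a floor of 4096 tokens and is capped at the 16,384-token ceiling, so large batches can no longer push `max_tokens` past the model's limit.

```python
def cap_max_tokens(batch_size: int) -> int:
    # Floor of 4096 tokens, ceiling at Claude's 16K output limit.
    return min(16384, max(4096, batch_size * 500))

# Old formula for comparison: max(8192, batch_size * 1500)
# overflows the 16,384 cap at batch_size = 11 (11 * 1500 = 16500).
print(cap_max_tokens(1))   # small batches get the 4096 floor
print(cap_max_tokens(10))  # mid-size batches scale linearly
print(cap_max_tokens(40))  # large batches hit the 16384 ceiling
```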

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: Benjamin Admin
Date: 2026-03-23 08:50:45 +01:00
Parent: ac6134ce6d
Commit: bdd2f6fa0f


@@ -1101,7 +1101,7 @@ class DecompositionPass:
         llm_response = await _llm_anthropic(
             prompt=prompt,
             system_prompt=_PASS0B_SYSTEM_PROMPT,
-            max_tokens=max(8192, len(batch) * 1500),
+            max_tokens=min(16384, max(4096, len(batch) * 500)),
         )
         stats["llm_calls"] += 1
         results_by_id = _parse_json_object(llm_response)