llama-stack-mirror/tests/integration/agents
Luis Tomas Bolivar f18b5eb537
fix: Avoid BadRequestError due to invalid max_tokens (#3667)
This patch ensures that if max_tokens is not defined, it is set to None
instead of 0 when calling openai_chat_completion. This way, providers
(like Gemini) that cannot handle `max_tokens = 0` will not fail.

Issue: #3666
2025-10-27 09:27:21 -07:00
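A minimal sketch of the pattern this fix describes, assuming a hypothetical `_build_completion_kwargs` helper (the real llama-stack call site may differ): when the caller leaves `max_tokens` unset, forward `None` to the openai_chat_completion call rather than `0`, so providers such as Gemini that reject `max_tokens = 0` accept the request.

```python
from typing import Any, Optional


def _build_completion_kwargs(
    max_tokens: Optional[int] = None, **extra: Any
) -> dict[str, Any]:
    """Assemble kwargs for an openai_chat_completion-style call.

    Hypothetical helper for illustration: an unset (or zero) max_tokens
    is normalized to None so the provider applies its own default limit,
    instead of receiving the invalid value 0.
    """
    kwargs: dict[str, Any] = dict(extra)
    # Some providers (e.g. Gemini) reject max_tokens == 0, so only
    # forward a positive limit; otherwise send None.
    kwargs["max_tokens"] = max_tokens if max_tokens else None
    return kwargs


# Usage: an absent or zero limit becomes None, never 0.
assert _build_completion_kwargs()["max_tokens"] is None
assert _build_completion_kwargs(max_tokens=0)["max_tokens"] is None
assert _build_completion_kwargs(max_tokens=256)["max_tokens"] == 256
```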
recordings                fix: Avoid BadRequestError due to invalid max_tokens (#3667)  2025-10-27 09:27:21 -07:00
__init__.py               fix: remove ruff N999 (#1388)                                 2025-03-07 11:14:04 -08:00
test_agents.py            chore: update agent call (#3824)                              2025-10-16 16:04:43 +02:00
test_openai_responses.py  feat: Add instructions parameter in response object (#3741)   2025-10-20 13:10:37 -07:00
test_persistence.py       chore: default to pytest asyncio-mode=auto (#2730)            2025-07-11 13:00:24 -07:00