llama-stack-mirror/tests/integration/agents
Abhishek Bongale bb58da22a1 feat: Add max_output_tokens to Response API
Responses and Completions both support a max_output_tokens field. It is currently
missing from the create request and the response object in the Responses API.

This PR fixes it.

fixes: #3562
Signed-off-by: Abhishek Bongale <abhishekbongale@outlook.com>
2025-10-06 09:52:25 +01:00
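For context on the change, a Responses API create request caps generation length with `max_output_tokens`, and this PR makes the same field appear on the returned response object. A minimal sketch of such a request payload (the model name and token cap are illustrative, not taken from the PR):

```python
import json

# Hypothetical Responses API create payload; max_output_tokens limits
# how many tokens the model may generate for this request.
payload = {
    "model": "llama3.2:3b",          # illustrative model identifier
    "input": "Write a haiku about autumn.",
    "max_output_tokens": 64,          # the field this PR adds end to end
}

print(json.dumps(payload, indent=2))
```

With the fix, the `max_output_tokens` value sent at create time is also surfaced on the response object, so clients can read back the cap that was applied.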
recordings feat(tests): implement test isolation for inference recordings (#3681) 2025-10-04 11:34:18 -07:00
__init__.py fix: remove ruff N999 (#1388) 2025-03-07 11:14:04 -08:00
test_agents.py chore: fix agents tests for non-ollama providers, provide max_tokens (#3657) 2025-10-02 14:30:13 -04:00
test_openai_responses.py feat: Add max_output_tokens to Response API 2025-10-06 09:52:25 +01:00
test_persistence.py chore: default to pytest asyncio-mode=auto (#2730) 2025-07-11 13:00:24 -07:00