llama-stack/tests/integration/agents
ehhuang 1d46f3102e
fix: enable test_responses_store (#2290)
# What does this PR do?
Changed the test so it no longer requires a tool_call in the output, while
still keeping the tools params in the request as a smoke test.
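
The relaxed check can be sketched roughly as below. This is a minimal illustration rather than the exact test body; the `openai_client`/`text_model_id` fixture names and the `web_search` tool definition are assumptions.

```python
# Sketch of the relaxed test_responses_store flow (illustrative only;
# fixture names, model id handling, and the tool definition are assumptions).
def test_responses_store(openai_client, text_model_id):
    # The tools param is kept purely as a smoke test: the model *may* emit a
    # tool call, but the assertions below no longer require one.
    response = openai_client.responses.create(
        model=text_model_id,
        input="What was a positive news story from today?",
        tools=[{"type": "web_search"}],
        stream=False,
    )

    # The real check: the response was persisted and can be fetched back by id.
    retrieved = openai_client.responses.retrieve(response.id)
    assert retrieved.id == response.id
    assert len(retrieved.output) > 0  # any output (tool call or plain text) is fine
```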

## Test Plan
Used llama3.3 from Fireworks (same as CI):
![Screenshot of the passing test run](https://github.com/user-attachments/assets/1e5fca98-9b4f-402e-a0bc-d9f910f2c207)

Also ran with the ollama distro and a 3B model.
Committed 2025-05-27 15:37:28 -07:00
| File | Latest commit | Date |
| --- | --- | --- |
| __init__.py | fix: remove ruff N999 (#1388) | 2025-03-07 11:14:04 -08:00 |
| test_agents.py | fix: multiple tool calls in remote-vllm chat_completion (#2161) | 2025-05-15 11:23:29 -07:00 |
| test_openai_responses.py | fix: enable test_responses_store (#2290) | 2025-05-27 15:37:28 -07:00 |
| test_persistence.py | refactor(tests): delete inference, safety and agents tests from providers/tests/ (#1393) | 2025-03-04 10:41:57 -08:00 |