llama-stack/tests/unit/providers
Derek Higgins c8797f1125
fix: Including tool call in chat (#1931)
Include the tool call details in the chat when doing RAG with remote
vLLM.

Fixes: #1929

With this PR the tool call is included in the chat returned to vLLM, and
the model (meta-llama/Llama-3.1-8B-Instruct) then returns the answer as
expected.
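
The fix described above can be sketched in a few lines. This is a hypothetical illustration, not the actual llama-stack code: the message shapes follow the common OpenAI-style chat format, and the function and field names (`build_chat_with_tool_call`, `tool_name`, `arguments`) are illustrative. The point is that when a RAG turn is replayed to the model, the assistant message carrying the tool call must appear in the history before the tool's result, or the model cannot connect the result to a call.

```python
# Hedged sketch of the fix in #1931: include the assistant's tool call
# in the chat history, not just the tool's result.
# All names here are illustrative, not llama-stack's real types.

def build_chat_with_tool_call(user_query, tool_name, tool_args, tool_result):
    """Return a chat list that includes the tool call alongside its result."""
    return [
        {"role": "user", "content": user_query},
        # Before the fix, this assistant message (the tool call itself)
        # was omitted from the chat sent back to the model.
        {
            "role": "assistant",
            "content": "",
            "tool_calls": [{"tool_name": tool_name, "arguments": tool_args}],
        },
        # The tool result now follows a visible call, so the model can
        # ground its final answer in it.
        {"role": "tool", "tool_name": tool_name, "content": tool_result},
    ]

chat = build_chat_with_tool_call(
    "What is llama-stack?",
    "knowledge_search",
    {"query": "llama-stack"},
    "llama-stack standardizes the building blocks of GenAI apps.",
)
```

With the tool-call message restored, a model such as Llama-3.1-8B-Instruct sees a complete call/result pair and can produce the expected answer.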

Signed-off-by: Derek Higgins <derekh@redhat.com>
2025-04-24 16:59:10 -07:00
agents          | feat: make sure agent sessions are under access control (#1737)   | 2025-03-21 07:31:16 -07:00
inference       | fix: Including tool call in chat (#1931)                          | 2025-04-24 16:59:10 -07:00
nvidia          | fix: Handle case when Customizer Job status is unknown (#1965)    | 2025-04-17 10:27:07 +02:00
utils           | fix: Including tool call in chat (#1931)                          | 2025-04-24 16:59:10 -07:00
vector_io       | chore: Updating sqlite-vec to make non-blocking calls (#1762)     | 2025-03-23 17:25:44 -07:00
test_configs.py | feat(api): don't return a payload on file delete (#1640)          | 2025-03-25 17:12:36 -07:00