llama-stack-mirror/tests
Derek Higgins c8797f1125
fix: Including tool call in chat (#1931)
Include the tool call details in the chat when doing RAG with a remote
vLLM provider.

Fixes: #1929

With this PR the tool call is included in the chat returned to vLLM, and
the model (meta-llama/Llama-3.1-8B-Instruct) then returns the answer as
expected (a sketch of the kind of chat payload involved follows below).

Signed-off-by: Derek Higgins <derekh@redhat.com>
2025-04-24 16:59:10 -07:00
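For context, here is a minimal sketch of the chat history a remote vLLM (OpenAI-compatible) endpoint needs to receive for RAG to work: the assistant's tool-call turn and the matching tool result must accompany the user turn, which is what this fix ensures gets forwarded. This is an illustrative example, not the actual llama-stack provider code; the endpoint URL, tool name, call id, and arguments are placeholders.

```python
# Illustrative sketch only: shows a chat history that includes the assistant
# tool call and the tool result, as an OpenAI-compatible request to vLLM.
from openai import OpenAI

# Hypothetical local vLLM endpoint; api_key is unused by vLLM but required
# by the client constructor.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="none")

messages = [
    {"role": "user", "content": "What does the attached document say about X?"},
    {
        # Before the fix, this assistant tool-call turn was missing from the
        # chat forwarded to vLLM, so the model lacked context for the tool result.
        "role": "assistant",
        "content": None,
        "tool_calls": [
            {
                "id": "call_1",  # placeholder id
                "type": "function",
                "function": {
                    "name": "knowledge_search",  # placeholder tool name
                    "arguments": '{"query": "X"}',
                },
            }
        ],
    },
    {
        # Tool result referencing the tool call above.
        "role": "tool",
        "tool_call_id": "call_1",
        "content": "Retrieved passage about X ...",
    },
]

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=messages,
)
print(response.choices[0].message.content)
```

With both the assistant tool-call message and the tool message present, the model can ground its final answer in the retrieved content rather than seeing an orphaned tool result.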
Directory / file                              | Last commit                                                            | Date
client-sdk/post_training                      | feat: Add nemo customizer (#1448)                                      | 2025-03-25 11:01:10 -07:00
external-provider/llama-stack-provider-ollama | feat: allow building distro with external providers (#1967)            | 2025-04-18 17:18:28 +02:00
integration                                   | feat(agents): add agent naming functionality (#1922)                   | 2025-04-17 07:02:47 -07:00
unit                                          | fix: Including tool call in chat (#1931)                               | 2025-04-24 16:59:10 -07:00
verifications                                 | fix: Return HTTP 400 for OpenAI API validation errors (#2002)          | 2025-04-23 17:48:32 +02:00
__init__.py                                   | refactor(test): introduce --stack-config and simplify options (#1404)  | 2025-03-05 17:02:02 -08:00