This PR updates the Conversation item related types and improves a couple of critical parts of the implementation:

- it creates a streaming output item for the final assistant message output by the model. Until now we only added content parts and included that message in the final response.
- it rewrites the conversation update code completely to account for items other than messages (tool calls, outputs, etc.)

## Test Plan

Used the test script from https://github.com/llamastack/llama-stack-client-python/pull/281 for this:

```
TEST_API_BASE_URL=http://localhost:8321/v1 \
pytest tests/integration/test_agent_turn_step_events.py::test_client_side_function_tool -xvs
```
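For illustration, below is a minimal, hypothetical sketch of the two behaviors described above: wrapping the final assistant message in streamed output-item events, and a conversation-update step that accepts item kinds beyond plain messages. It is not the actual llama-stack implementation; the event names and payload shapes are assumptions modeled loosely on OpenAI Responses API conventions.

```python
# Hypothetical sketch only -- event names and payload shapes are assumptions,
# not llama-stack's actual types.


def final_message_events(message_id: str, text: str) -> list[dict]:
    """Wrap the final assistant message in added/done output-item events,
    so clients see it as a streamed item rather than only content parts."""
    item = {
        "id": message_id,
        "type": "message",
        "role": "assistant",
        "content": [{"type": "output_text", "text": text}],
    }
    return [
        # announce the item before its content parts stream
        {"type": "response.output_item.added", "item": {**item, "content": []}},
        # ...content-part deltas would be emitted between these two events...
        {"type": "response.output_item.done", "item": item},
    ]


# Item kinds the update step accepts; previously only messages were handled.
SUPPORTED_ITEM_TYPES = {"message", "function_call", "function_call_output"}


def update_conversation(conversation: list[dict], items: list[dict]) -> None:
    """Persist response output items onto the conversation, covering tool
    calls and tool outputs as first-class items alongside messages."""
    for item in items:
        kind = item.get("type")
        if kind not in SUPPORTED_ITEM_TYPES:
            raise ValueError(f"unsupported conversation item type: {kind}")
        conversation.append(item)
```

The key point of the sketch is that tool calls and their outputs are appended to the conversation as items in their own right rather than being folded into (or dropped from) an assistant message.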