mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-12-16 22:32:38 +00:00
feat(responses)!: improve responses + conversations implementations

This PR updates the Conversation item related types and improves two critical parts of the implementation:
- It creates a streaming output item for the final assistant message output by the model. Until now we only added content parts and included that message in the final response.
- It completely rewrites the conversation update code to account for items other than messages (tool calls, outputs, etc.).
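The first change above can be illustrated with a minimal sketch. The event names below follow the OpenAI Responses streaming convention (`response.output_item.added` / `response.output_item.done`); the `OutputMessage` dataclass and `stream_final_message` helper are hypothetical simplifications, not the actual llama-stack types, which are richer.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for the assistant message type.
@dataclass
class OutputMessage:
    id: str
    role: str
    content: list = field(default_factory=list)

def stream_final_message(message: OutputMessage):
    """Bracket the streamed content parts with output_item events, so
    consumers see the final assistant message as a first-class output
    item rather than only as loose content parts."""
    yield {"type": "response.output_item.added",
           "item": {"id": message.id, "role": message.role}}
    for part in message.content:
        # Previously only these content-part events were emitted.
        yield {"type": "response.content_part.done", "part": part}
    yield {"type": "response.output_item.done",
           "item": {"id": message.id, "role": message.role}}

events = list(stream_final_message(OutputMessage("msg_1", "assistant", ["Hello"])))
```

The point of the bracketing is that a client consuming the stream no longer has to synthesize the final message itself from accumulated content parts.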
This commit is contained in:
parent
d875e427bf
commit
d47f2c0ba8
11 changed files with 511 additions and 441 deletions
6
docs/static/deprecated-llama-stack-spec.html
vendored
@@ -8523,6 +8523,12 @@
          {
            "$ref": "#/components/schemas/OpenAIResponseMCPApprovalResponse"
          },
          {
            "$ref": "#/components/schemas/OpenAIResponseOutputMessageMCPCall"
          },
          {
            "$ref": "#/components/schemas/OpenAIResponseOutputMessageMCPListTools"
          },
          {
            "$ref": "#/components/schemas/OpenAIResponseMessage"
          }