Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-10-03 19:57:35 +00:00
Extend OpenAIResponseInput with MCP types
When posting chained responses with MCP tools, llama-stack fails to validate OpenAIResponseInput.

Signed-off-by: Yuval Turgeman <yturgema@redhat.com>
parent 4c2fcb6b51
commit 3baca53eba

1 changed file with 2 additions and 0 deletions
@@ -724,6 +724,8 @@ OpenAIResponseInput = Annotated[
     OpenAIResponseOutputMessageWebSearchToolCall
     | OpenAIResponseOutputMessageFileSearchToolCall
     | OpenAIResponseOutputMessageFunctionToolCall
+    | OpenAIResponseOutputMessageMCPCall
+    | OpenAIResponseOutputMessageMCPListTools
     | OpenAIResponseInputFunctionToolCallOutput
     |
     # Fallback to the generic message type as a last resort
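
For context: the Responses API allows output items from one response to be passed back as input to the next, so OpenAIResponseInput has to accept every output item type, including the MCP call and MCP list-tools items added here. Below is a minimal sketch of validating chained input after this change; the import path, the TypeAdapter-based validation, and the validate_chained_input helper are illustrative assumptions, not code from this commit.

from pydantic import TypeAdapter

# Import path assumed from where the OpenAIResponseInput union is defined.
from llama_stack.apis.agents.openai_responses import OpenAIResponseInput

# Validates a list of chained-response input items against the union.
# With this change, items of type OpenAIResponseOutputMessageMCPCall and
# OpenAIResponseOutputMessageMCPListTools (produced in an earlier response's
# output) validate cleanly instead of raising a pydantic ValidationError.
_adapter = TypeAdapter(list[OpenAIResponseInput])

def validate_chained_input(previous_output: list[dict]) -> list[OpenAIResponseInput]:
    # Hypothetical helper: re-validate raw output items before reusing them as input.
    return _adapter.validate_python(previous_output)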