Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-10-04 20:14:13 +00:00
Extend OpenAIResponseInput with MCP types
When posting chained responses with MCP tools, Llama Stack fails to validate OpenAIResponseInput, because the input union does not accept the MCP output message types.

Signed-off-by: Yuval Turgeman <yturgema@redhat.com>
This commit is contained in:
parent 4c2fcb6b51
commit 3baca53eba

1 changed file with 2 additions and 0 deletions
@@ -724,6 +724,8 @@ OpenAIResponseInput = Annotated[
     OpenAIResponseOutputMessageWebSearchToolCall
     | OpenAIResponseOutputMessageFileSearchToolCall
     | OpenAIResponseOutputMessageFunctionToolCall
+    | OpenAIResponseOutputMessageMCPCall
+    | OpenAIResponseOutputMessageMCPListTools
     | OpenAIResponseInputFunctionToolCallOutput
     |
     # Fallback to the generic message type as a last resort
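
For context, below is a minimal, self-contained Pydantic sketch of the failure mode this commit addresses; the WebSearchCall, FunctionCall, and MCPCall models and their fields are hypothetical stand-ins, not the real OpenAIResponse* classes. With a left_to_right union, an input item whose shape matches none of the union members is rejected, which is what happened when MCP tool-call outputs from a previous response were posted back as input.

# Minimal sketch (not llama-stack code) of why the input union must list the
# MCP output types when prior responses are chained back in as input.
from typing import Annotated, Literal

from pydantic import BaseModel, Field, TypeAdapter, ValidationError


class WebSearchCall(BaseModel):
    # Simplified stand-in for an output message type already in the union.
    type: Literal["web_search_call"] = "web_search_call"
    id: str


class FunctionCall(BaseModel):
    # Simplified stand-in for another pre-existing union member.
    type: Literal["function_call"] = "function_call"
    id: str
    name: str


class MCPCall(BaseModel):
    # Simplified stand-in for an MCP tool-call output message.
    type: Literal["mcp_call"] = "mcp_call"
    id: str
    server_label: str


# Before the fix: the input union does not know about MCP call outputs.
InputBefore = Annotated[WebSearchCall | FunctionCall, Field(union_mode="left_to_right")]
# After the fix: MCP call outputs are accepted when chained back as input.
InputAfter = Annotated[WebSearchCall | FunctionCall | MCPCall, Field(union_mode="left_to_right")]

# An MCP tool call taken from a previous response and posted back as input.
mcp_item = {"type": "mcp_call", "id": "call_1", "server_label": "files"}

try:
    TypeAdapter(InputBefore).validate_python(mcp_item)
except ValidationError:
    print("rejected: no union member matches an MCP call")

print(TypeAdapter(InputAfter).validate_python(mcp_item))

The real change is narrower than this sketch: it only adds OpenAIResponseOutputMessageMCPCall and OpenAIResponseOutputMessageMCPListTools to the existing OpenAIResponseInput union shown in the diff above.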