feat: File search tool for Responses API

This is an initial working prototype of wiring up the `file_search` built-in
tool for the Responses API to our existing RAG knowledge search tool.
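
For context, the usage mirrors OpenAI's `file_search` tool shape: a request names one or
more vector stores via `vector_store_ids` (the renamed field in the schema changes below).
Here is a rough sketch against a locally running stack; the vector store ID and prompt are
placeholders, and the base URL and model match the commands further down:

```python
# Sketch only: exercise the new file_search tool through the Responses API.
# "vs_1234" is a placeholder for a vector store that already has content
# inserted, and the prompt is illustrative.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="none")

response = client.responses.create(
    model="meta-llama/Llama-3.2-3B-Instruct",
    input="What does the inserted document say about chunk sizes?",
    tools=[{"type": "file_search", "vector_store_ids": ["vs_1234"]}],
)

print(response.output_text)
```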

I stubbed in a new test that, for now, uses a hybrid of the OpenAI and
Llama Stack clients with a hardcoded URL, only until we finish landing the
vector store APIs and insertion support.
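
Concretely, "hybrid" here means the Llama Stack client handles setup (registering a vector
DB and inserting documents through the existing RAG tool) while the OpenAI client drives
the Responses API call itself. Roughly, and not verbatim from the test, the setup half
looks something like the sketch below; the IDs, embedding model, and document content are
illustrative:

```python
# Sketch of the setup half of the hybrid approach; names and values are
# illustrative, not the exact ones used in the test.
from llama_stack_client import LlamaStackClient

ls_client = LlamaStackClient(base_url="http://localhost:8321")

# Register a vector DB to search against
ls_client.vector_dbs.register(
    vector_db_id="test_vector_db",
    embedding_model="all-MiniLM-L6-v2",
    embedding_dimension=384,
)

# Insert a document through the existing RAG tool runtime
ls_client.tool_runtime.rag_tool.insert(
    vector_db_id="test_vector_db",
    chunk_size_in_tokens=512,
    documents=[
        {
            "document_id": "doc-1",
            "content": "Llama Stack chunks inserted documents before indexing them.",
            "mime_type": "text/plain",
            "metadata": {},
        }
    ],
)
```

That same ID then presumably gets passed as one of the `vector_store_ids` in the
`file_search` tool definition until the real vector store APIs land.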

Note that this currently lives under tests/verifications only, because tool
calling with the small Llama-3.2-3B model we run in CI (and that I use in the
example below) sometimes flakes. We'd want to make the test a bit more robust
in some way before moving it over to tests/integration and running it in CI.

```
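# Make sure the model is available and served by ollama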
ollama run llama3.2:3b

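# Start the Llama Stack server using the ollama template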
INFERENCE_MODEL="meta-llama/Llama-3.2-3B-Instruct" \
llama stack run ./llama_stack/templates/ollama/run.yaml \
  --image-type venv \
  --env OLLAMA_URL="http://0.0.0.0:11434"

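# Run the new file_search verification test against the local server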
pytest -sv 'tests/verifications/openai_api/test_responses.py::test_response_non_streaming_file_search' \
  --base-url=http://localhost:8321/v1/openai/v1 \
  --model meta-llama/Llama-3.2-3B-Instruct
```

Signed-off-by: Ben Browning <bbrownin@redhat.com>
Ben Browning 2025-06-10 12:42:09 -04:00
parent e2e15ebb6c
commit fa34468308
7 changed files with 234 additions and 11 deletions

@@ -7047,6 +7047,9 @@
 {
 "$ref": "#/components/schemas/OpenAIResponseOutputMessageWebSearchToolCall"
 },
+{
+"$ref": "#/components/schemas/OpenAIResponseOutputMessageFileSearchToolCall"
+},
 {
 "$ref": "#/components/schemas/OpenAIResponseOutputMessageFunctionToolCall"
 },
@@ -7193,7 +7196,7 @@
 "const": "file_search",
 "default": "file_search"
 },
-"vector_store_id": {
+"vector_store_ids": {
 "type": "array",
 "items": {
 "type": "string"
@@ -7217,7 +7220,7 @@
 "additionalProperties": false,
 "required": [
 "type",
-"vector_store_id"
+"vector_store_ids"
 ],
 "title": "OpenAIResponseInputToolFileSearch"
 },
@@ -7484,6 +7487,64 @@
 ],
 "title": "OpenAIResponseOutputMessageContentOutputText"
 },
+"OpenAIResponseOutputMessageFileSearchToolCall": {
+"type": "object",
+"properties": {
+"id": {
+"type": "string"
+},
+"queries": {
+"type": "array",
+"items": {
+"type": "string"
+}
+},
+"status": {
+"type": "string"
+},
+"type": {
+"type": "string",
+"const": "file_search_call",
+"default": "file_search_call"
+},
+"results": {
+"type": "array",
+"items": {
+"type": "object",
+"additionalProperties": {
+"oneOf": [
+{
+"type": "null"
+},
+{
+"type": "boolean"
+},
+{
+"type": "number"
+},
+{
+"type": "string"
+},
+{
+"type": "array"
+},
+{
+"type": "object"
+}
+]
+}
+}
+}
+},
+"additionalProperties": false,
+"required": [
+"id",
+"queries",
+"status",
+"type"
+],
+"title": "OpenAIResponseOutputMessageFileSearchToolCall"
+},
 "OpenAIResponseOutputMessageFunctionToolCall": {
 "type": "object",
 "properties": {
@@ -7760,6 +7821,9 @@
 {
 "$ref": "#/components/schemas/OpenAIResponseOutputMessageWebSearchToolCall"
 },
+{
+"$ref": "#/components/schemas/OpenAIResponseOutputMessageFileSearchToolCall"
+},
 {
 "$ref": "#/components/schemas/OpenAIResponseOutputMessageFunctionToolCall"
 },
@@ -7775,6 +7839,7 @@
 "mapping": {
 "message": "#/components/schemas/OpenAIResponseMessage",
 "web_search_call": "#/components/schemas/OpenAIResponseOutputMessageWebSearchToolCall",
+"file_search_call": "#/components/schemas/OpenAIResponseOutputMessageFileSearchToolCall",
 "function_call": "#/components/schemas/OpenAIResponseOutputMessageFunctionToolCall",
 "mcp_call": "#/components/schemas/OpenAIResponseOutputMessageMCPCall",
 "mcp_list_tools": "#/components/schemas/OpenAIResponseOutputMessageMCPListTools"