Expand file types tested with file_search

This expands the file types tested with file_search to include Word
documents (.docx), Markdown (.md), text (.txt), PDF (.pdf), and
PowerPoint (.pptx) files.

Python's mimetypes library doesn't actually recognize markdown docs as
text, so we have to handle that case specifically instead of relying
on mimetypes to get it right.
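
For illustration, a minimal sketch of the kind of fallback this implies (the
helper name and the "text/markdown" value here are assumptions, not the actual
implementation):

    import mimetypes

    def guess_mime_type(path: str) -> str:
        # Per the note above, mimetypes has no mapping for ".md", so fall
        # back to "text/markdown" explicitly instead of trusting the library.
        mime_type, _ = mimetypes.guess_type(path)
        if mime_type is None and path.lower().endswith(".md"):
            return "text/markdown"
        return mime_type or "application/octet-stream"

    print(guess_mime_type("docs/llama_stack_and_models.md"))   # text/markdown
    print(guess_mime_type("docs/llama_stack_and_models.txt"))  # text/plain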

Signed-off-by: Ben Browning <bbrownin@redhat.com>

@@ -42,12 +42,40 @@ test_response_file_search:
# vector_store_ids param for file_search tool gets added by the test runner
file_content: "Llama 4 Maverick has 128 experts"
output: "128"
- case_id: "llama_experts_docx"
input: "How many experts does the Llama 4 Maverick model have?"
tools:
- type: file_search
# vector_store_ids param for file_search tool gets added by the test runner
file_path: "docs/llama_stack_and_models.docx"
output: "128"
- case_id: "llama_experts_md"
input: "How many experts does the Llama 4 Maverick model have?"
tools:
- type: file_search
# vector_store_ids param for file_search tool gets added by the test runner
file_path: "docs/llama_stack_and_models.md"
output: "128"
- case_id: "llama_experts_pdf"
input: "How many experts does the Llama 4 Maverick model have?"
tools:
- type: file_search
# vector_store_ids param for file_search tool gets added by the test runner
file_path: "pdfs/llama_stack_and_models.pdf"
file_path: "docs/llama_stack_and_models.pdf"
output: "128"
- case_id: "llama_experts_pptx"
input: "How many experts does the Llama 4 Maverick model have?"
tools:
- type: file_search
# vector_store_ids param for file_search tool gets added by the test runner
file_path: "docs/llama_stack_and_models.pptx"
output: "128"
- case_id: "llama_experts_txt"
input: "How many experts does the Llama 4 Maverick model have?"
tools:
- type: file_search
# vector_store_ids param for file_search tool gets added by the test runner
file_path: "docs/llama_stack_and_models.txt"
output: "128"
test_response_mcp_tool: