Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-07-16 01:53:10 +00:00
This wires the optional Files API dependency into sqlite_vec and adds the localfs Files provider to our starter template, so that the Responses API file_search tool works out of the box with sqlite_vec in that template.

Additional testing with this provider and some other inference models led me to loosen the verification test's result checking slightly: not for the tool call itself, but for the assistant response around the file_search tool call. Some providers, such as OpenAI SaaS, sometimes make multiple tool calls to resolve the query, especially when they cannot find an answer and try a few permutations before returning empty results to the user in that test.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
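The loosened check can be sketched roughly as below. The helper name and the response item shapes are hypothetical, assuming an OpenAI-style Responses output list in which zero or more file_search tool calls precede the final assistant message; the actual verification test may differ.

```python
def find_assistant_message(output_items):
    """Return the final assistant message from a Responses-style output list.

    Tolerates providers that emit multiple file_search tool calls
    (e.g. retrying with rephrased queries) before answering, rather
    than requiring exactly one call followed by one message.
    """
    tool_calls = [item for item in output_items if item["type"] == "file_search_call"]
    messages = [item for item in output_items if item["type"] == "message"]
    assert tool_calls, "expected at least one file_search tool call"
    assert messages, "expected an assistant message after the tool call(s)"
    return messages[-1]

# Example: a provider that retried the search once before answering.
output = [
    {"type": "file_search_call", "status": "completed", "results": []},
    {"type": "file_search_call", "status": "completed", "results": ["doc.txt"]},
    {"type": "message", "role": "assistant", "content": "Found it in doc.txt."},
]
msg = find_assistant_message(output)
```

The point of checking only the final message is that the number of intermediate tool calls becomes a provider implementation detail rather than a test failure.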
Directory listing:

- agents/
- datasetio/
- eval/
- files/localfs/
- inference/
- ios/inference/
- post_training/
- safety/
- scoring/
- telemetry/
- tool_runtime/
- vector_io/
- __init__.py