Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-06-29 03:14:19 +00:00)
This passes max_num_results from the file_search tool call into the knowledge_search tool, and logs warnings if the filters or ranking_options params are used, since those are not wired up yet. It also adds the API surface for filters and ranking_options so we don't have to regenerate clients again as we add that support. Signed-off-by: Ben Browning <bbrownin@redhat.com>
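The behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual llama-stack implementation: the dict shape of the tool call and the helper name build_knowledge_search_args are assumptions for the example.

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical helper: the tool-call dict shape and this function name are
# assumptions for illustration, not the real llama-stack API.
def build_knowledge_search_args(tool_call: dict) -> dict:
    """Map file_search tool-call params onto knowledge_search arguments."""
    args = {"query": tool_call["query"]}
    # Pass max_num_results through to the underlying knowledge_search tool.
    if tool_call.get("max_num_results") is not None:
        args["max_num_results"] = tool_call["max_num_results"]
    # filters and ranking_options exist on the API surface but are not
    # wired up yet, so warn instead of silently dropping them.
    for unsupported in ("filters", "ranking_options"):
        if tool_call.get(unsupported) is not None:
            logger.warning("%s is not yet supported by file_search", unsupported)
    return args
```

The warning-instead-of-error choice keeps existing clients working while signaling that those params currently have no effect.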
Files changed:
__init__.py
agents.py
openai_responses.py