These are the source-of-truth configuration files used to generate the Stainless client SDKs.
- `openapi.yml`: the OpenAPI specification for the Llama Stack API.
- `config.yml`: the Stainless configuration that instructs Stainless how to generate the client SDKs.
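For orientation, `openapi.yml` follows the standard top-level shape of an OpenAPI document. A minimal sketch of that shape (the field values below are illustrative placeholders, not the contents of the real Llama Stack spec):

```python
# Minimal illustration of the top-level structure an OpenAPI document
# such as openapi.yml follows. All values here are placeholders.
spec = {
    "openapi": "3.1.0",              # OpenAPI spec version the document targets
    "info": {
        "title": "Llama Stack API",  # illustrative title
        "version": "0.0.1",          # illustrative API version
    },
    "paths": {},                     # endpoint definitions live under this key
}

# Every OpenAPI document carries these top-level keys.
required = {"openapi", "info", "paths"}
assert required <= spec.keys()
print("looks like an OpenAPI document")
```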
A small side note: these files use the `.yml` suffix because Stainless typically uses that suffix for its configuration files.
These files go hand-in-hand. Both `openapi.yml` and `config.yml` are generated by `scripts/run_openapi_generator.sh`:

- `openapi.yml` comes from the FastAPI-based generator.
- `config.yml` is rendered from `scripts/openapi_generator/stainless_config/config_data.py` so the Stainless config stays in lock-step with the spec.
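The "rendered from Python data" step can be pictured as serializing a Python structure to YAML text. A toy sketch of that idea (the config keys below are made up for illustration; the real data lives in `scripts/openapi_generator/stainless_config/config_data.py` and the real generator's logic may differ):

```python
def render_yaml(data, indent=0):
    """Render a nested dict of scalars/dicts as simple YAML lines.

    A toy serializer illustrating how a config file can be generated
    from Python data; not the actual generator implementation.
    """
    lines = []
    pad = "  " * indent
    for key, value in data.items():
        if isinstance(value, dict):
            lines.append(f"{pad}{key}:")
            lines.extend(render_yaml(value, indent + 1))
        else:
            lines.append(f"{pad}{key}: {value}")
    return lines

# Hypothetical fragment of Stainless-style config data.
config_data = {
    "organization": {"name": "llama-stack"},
    "targets": {"python": {"package_name": "llama_stack_client"}},
}

print("\n".join(render_yaml(config_data)))
```

Keeping the data in a Python module and rendering it at generation time means there is a single source of truth: the spec and the Stainless config cannot silently drift apart.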