llama-stack-mirror/client-sdks/stainless
Jaideep Rao 56f946f3f5
feat: add support for tool_choice to responses api (#4106)
# What does this PR do?
Adds support for enforcing tool usage via the Responses API. See
https://platform.openai.com/docs/api-reference/responses/create#responses_create-tool_choice
in the official documentation for details.
Note: at present this PR only supports `file_search` and `web_search` as
options for enforcing built-in tool usage.
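As a rough illustration of what the feature enables, here is a minimal sketch of a request body that pins one of the built-in tools via `tool_choice`. The payload shape follows the OpenAI Responses API documentation linked above; the model id and the helper function are placeholders, not part of this PR:

```python
# Sketch: a Responses API request body that forces a built-in tool.
# The "model" value is a placeholder; the helper is illustrative only.

def build_request(prompt: str, tool_choice) -> dict:
    """Assemble a Responses API request body that pins tool usage."""
    return {
        "model": "example-model",  # placeholder model id
        "input": prompt,
        # Built-in tools made available to the model:
        "tools": [{"type": "web_search"}, {"type": "file_search"}],
        # Force a specific built-in tool instead of letting the model choose:
        "tool_choice": tool_choice,
    }

# Force web_search (one of the two hosted tools this PR supports,
# alongside file_search):
req = build_request("What's new in Llama Stack?", {"type": "web_search"})
print(req["tool_choice"])
```

With `tool_choice` set to `"auto"` (the default) the model decides on its own; an object of the form `{"type": "..."}` restricts it to that tool.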

Closes #3548 

## Test Plan
```shell
./scripts/unit-tests.sh tests/unit/providers/agents/meta_reference/test_response_tool_context.py
```

---------

Signed-off-by: Jaideep Rao <jrao@redhat.com>
2025-12-15 11:22:06 -08:00
| File | Last commit | Date |
| --- | --- | --- |
| config.yml | feat(openapi): generate stainless config "more" programmatically (#4164) | 2025-11-17 12:48:03 -08:00 |
| openapi.yml | feat: add support for tool_choice to responses api (#4106) | 2025-12-15 11:22:06 -08:00 |
| README.md | feat(openapi): generate stainless config "more" programmatically (#4164) | 2025-11-17 12:48:03 -08:00 |

These are the source-of-truth configuration files that Stainless uses to generate the client SDKs.

  • openapi.yml: the OpenAPI specification for the Llama Stack API.
  • config.yml: the Stainless configuration, which instructs Stainless how to generate the client SDKs.

A small side note: the files use the .yml suffix because that is the suffix Stainless typically uses for its configuration files.

These files go hand-in-hand. Both openapi.yml and config.yml are generated by scripts/run_openapi_generator.sh:

  • openapi.yml comes from the FastAPI-based generator.
  • config.yml is rendered from scripts/openapi_generator/stainless_config/config_data.py so the Stainless config stays in lock-step with the spec.
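The rendering step described above can be pictured with a small sketch. This is hypothetical: the real scripts/openapi_generator/stainless_config/config_data.py and the Stainless config schema are more involved, and the keys below are illustrative. The point is only that the config is emitted deterministically from Python data, so regenerating the spec and the config together keeps them in lock-step:

```python
# Hypothetical sketch of rendering a Stainless-style config from Python data.
# The keys in CONFIG_DATA are illustrative, not the real Stainless schema.
# A tiny hand-rolled emitter keeps the example dependency-free and the
# output deterministic.

CONFIG_DATA = {
    "organization": "llama-stack",      # illustrative key
    "openapi_spec": "openapi.yml",      # illustrative key
}

def render_yaml(data: dict, indent: int = 0) -> str:
    """Emit a dict of strings (possibly nested) as simple YAML text."""
    lines = []
    for key, value in data.items():
        pad = "  " * indent
        if isinstance(value, dict):
            lines.append(f"{pad}{key}:")
            lines.append(render_yaml(value, indent + 1))
        else:
            lines.append(f"{pad}{key}: {value}")
    return "\n".join(lines)

print(render_yaml(CONFIG_DATA))
```

Because the output is a pure function of the Python data, re-running the generator after a spec change cannot leave the two files out of sync.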