
These are the source-of-truth configuration files used to generate the Stainless client SDKs.

  • openapi.yml: the OpenAPI specification for the Llama Stack API.
  • openapi.stainless.yml: the Stainless configuration that instructs Stainless how to generate the client SDKs.
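
To make the first file concrete, here is a minimal sketch of the top-level shape of an OpenAPI 3.x document like openapi.yml. The path, operation, and property names below are hypothetical placeholders for illustration, not copied from the actual Llama Stack spec:

```yaml
# Illustrative OpenAPI 3.x skeleton (placeholder names, not the real spec)
openapi: 3.1.0
info:
  title: Llama Stack API
  version: 0.0.1
paths:
  /v1/example:                     # hypothetical endpoint
    post:
      operationId: create_example  # hypothetical operation id
      requestBody:
        content:
          application/json:
            schema:
              type: object
              properties:
                name:
                  type: string
      responses:
        "200":
          description: The created object.
```

Stainless reads a spec of this shape and derives the SDK's request and response types from the schemas it declares.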

A small side note: both files use the .yml suffix, since that is the suffix Stainless typically uses for its configuration files.

These files go hand-in-hand. As of now, only openapi.yml is generated automatically, using the run_openapi_generator.sh script.
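
For a sense of how the second file complements the spec, here is a rough sketch of the kind of content a Stainless configuration holds. The exact schema is defined by Stainless and the keys and values below are assumptions for illustration only; consult the Stainless documentation for the authoritative format:

```yaml
# Illustrative sketch only — keys and values are assumptions,
# not the actual openapi.stainless.yml contents.
organization:
  name: llama-stack
targets:
  python:
    package_name: llama_stack_client   # hypothetical package name
resources:
  example:
    methods:
      create: post /v1/example         # maps an SDK method to a spec endpoint
```

The key idea is the division of labor: openapi.yml describes what the API looks like, while the Stainless configuration describes how that API should be shaped into idiomatic client SDKs.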