
These are the source-of-truth configuration files used by Stainless to generate the client SDKs.

  • openapi.yml: the OpenAPI specification for the Llama Stack API.
  • openapi.stainless.yml: the Stainless configuration that tells Stainless how to generate the client SDKs.

A small side note: these files use the .yml suffix because that is the suffix Stainless typically uses for its configuration files.

These files go hand in hand. As of now, only openapi.yml is generated automatically, via the run_openapi_generator.sh script.
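Because the two files go hand in hand, one practical concern is keeping them consistent: every endpoint the Stainless config refers to should exist in the OpenAPI spec. The sketch below illustrates that idea only; the dict keys and the "method path" endpoint format are assumptions for illustration, not the actual contents of openapi.yml or openapi.stainless.yml.

```python
# Hypothetical sketch: the structures below stand in for the parsed YAML
# files (e.g. loaded with PyYAML's yaml.safe_load). All keys, paths, and
# the "method path" endpoint format are illustrative assumptions.

# A fragment of what openapi.yml might contain, as a parsed dict.
openapi_spec = {
    "paths": {
        "/v1/models": {"get": {"operationId": "list_models"}},
    },
}

# A fragment of what the Stainless config might map those paths to.
stainless_config = {
    "resources": {
        "models": {"methods": {"list": "get /v1/models"}},
    },
}

def spec_endpoints(spec):
    """Collect every 'method path' pair defined in the OpenAPI spec."""
    return {
        f"{method} {path}"
        for path, ops in spec["paths"].items()
        for method in ops
    }

def dangling_endpoints(config, spec):
    """Return Stainless endpoints that the OpenAPI spec does not define."""
    known = spec_endpoints(spec)
    return [
        endpoint
        for resource in config["resources"].values()
        for endpoint in resource["methods"].values()
        if endpoint not in known
    ]

print(dangling_endpoints(stainless_config, openapi_spec))  # prints []
```

A check along these lines could run in CI after regenerating openapi.yml, so that a spec change that orphans an endpoint in the Stainless config is caught before SDK generation.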