`openapi.yml` and `config.yml` are the source-of-truth configuration files used by Stainless to generate the client SDKs.
- `openapi.yml`: the OpenAPI specification for the Llama Stack API.
- `config.yml`: the Stainless configuration that instructs Stainless how to generate the client SDKs.
A small side note: both files use the `.yml` suffix, which Stainless typically uses for its configuration files.
These files go hand-in-hand. Both `openapi.yml` and `config.yml` are generated by `scripts/run_openapi_generator.sh`:
- `openapi.yml` comes from the FastAPI-based generator.
- `config.yml` is rendered from `scripts/openapi_generator/stainless_config/config_data.py`, so the Stainless config stays in lock-step with the spec.
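
For illustration, here is a minimal sketch of what that rendering step could look like: a Python dict serialized to YAML. The `CONFIG_DATA` name, the dict keys, and the helper function below are assumptions made for the sketch, not the repository's actual code.

```python
# Hypothetical sketch of rendering config.yml from Python data, in the
# spirit of scripts/openapi_generator/stainless_config/config_data.py.
# All names and keys here are illustrative assumptions.
from pathlib import Path

import yaml  # PyYAML


# Stand-in for the data that config_data.py would define.
CONFIG_DATA = {
    "organization": {"name": "llama-stack"},
    "targets": {"python": {"package_name": "llama_stack_client"}},
}


def render_config(output_path: Path) -> None:
    """Serialize the in-Python Stainless config to a .yml file."""
    output_path.write_text(yaml.safe_dump(CONFIG_DATA, sort_keys=False))


if __name__ == "__main__":
    render_config(Path("config.yml"))
```

Keeping the config as Python data and serializing it at generation time is what lets a single script regenerate both files together, so the spec and the Stainless config cannot drift apart.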