Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-06-27 18:50:41 +00:00).
Stainless ends up reformatting the YAML when we paste it into the Studio. We cannot let that happen if we are ever going to partially automate Stainless config updates. Try `ruamel.yaml`, specifically `block_seq_indent`, to avoid that.
Directory contents:

- `pyopenapi/`
- `strong_typing/`
- `generate.py`
- `README.md`
- `run_openapi_generator.sh`
The RFC Specification (OpenAPI format) is generated from the set of API endpoints located in `llama_stack/[<subdir>]/api/endpoints.py` using the `generate.py` utility.
Please install the following packages before running the script:

```
pip install python-openapi json-strong-typing fire PyYAML llama-models
```
Then simply run `sh run_openapi_generator.sh <OUTPUT_DIR>`.