The RFC Specification (OpenAPI format) is generated from the set of API endpoints located in `llama_stack/[<subdir>]/api/endpoints.py` using the `generate.py` utility.
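For orientation, the sketch below shows the general shape such an endpoint definition can take: API methods are declared as typed Python signatures, and the generator walks those signatures to emit the OpenAPI spec. The class, model, and decorator names here are illustrative placeholders (including a local stand-in for the route decorator), not the actual llama_stack definitions.

```python
# Illustrative sketch only -- the real endpoint definitions live in
# llama_stack/<subdir>/api/endpoints.py and use the project's own schema
# utilities; every name below is a hypothetical placeholder.
from typing import Protocol

from pydantic import BaseModel


class ChatCompletionRequest(BaseModel):
    model: str
    messages: list[str]


class ChatCompletionResponse(BaseModel):
    completion: str


def webmethod(route: str):
    """Stand-in for a decorator that tags a method as an HTTP endpoint."""
    def wrap(fn):
        fn.__webmethod_route__ = route  # metadata a generator could read
        return fn
    return wrap


class Inference(Protocol):
    @webmethod(route="/inference/chat_completion")
    async def chat_completion(
        self, request: ChatCompletionRequest
    ) -> ChatCompletionResponse: ...
```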
Please install the following packages before running the script:
`pip install python-openapi json-strong-typing fire PyYAML llama-models`
Then simply run `sh run_openapi_generator.sh <OUTPUT_DIR>`.