Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-03 18:00:36 +00:00
Well, our Responses tests use it, so we'd better include it in the API, no? I discovered it because I want to make sure `llama-stack-client` can always be used instead of `openai-python` as the client (we do want to be _truly_ compatible).
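The compatibility goal above hinges on the wire format: an OpenAI-compatible server should accept the same chat-completions request body no matter which client serialized it. A minimal sketch of that shared payload, using only the standard library (the model name is an illustrative placeholder, not taken from the repo):

```python
import json


def build_chat_completion_request(model: str, user_message: str) -> str:
    # The OpenAI-style chat-completions body that either client
    # (openai-python or llama-stack-client) would POST to the server.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return json.dumps(payload)


body = build_chat_completion_request("example-model", "Hello")
print(body)
```

Because both clients ultimately emit this same JSON shape, swapping one for the other should be transparent to a conforming server; divergence would show up as extra or missing fields in this body.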
Directory contents:

- css
- js
- providers/vector_io
- llama-stack-logo.png
- llama-stack-spec.html
- llama-stack-spec.yaml
- llama-stack.png
- remote_or_local.gif
- safety_system.webp