With https://github.com/meta-llama/llama-stack-client-python/pull/226, llama-stack-client can now be used as a drop-in (duck-typed) substitute for the OpenAI client, so downstream library code does not need to change. <img width="1399" alt="image" src="https://github.com/user-attachments/assets/abab6bfd-e6ff-4a7d-a965-fd93e3c105d7" />
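
A minimal sketch of what this enables, assuming the duck-typed surface mirrors the OpenAI client's `chat.completions.create` path (as the linked PR and screenshot suggest); the `summarize` helper, the model id, and the `base_url` below are placeholders, not values from this PR:

```python
# Sketch only: a downstream helper written against the OpenAI client interface
# accepts a LlamaStackClient unchanged, because the client duck-types that surface.
from llama_stack_client import LlamaStackClient


def summarize(client, text: str) -> str:
    """Downstream code that only assumes an OpenAI-style client object."""
    response = client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model id
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return response.choices[0].message.content


# Swap in LlamaStackClient without touching summarize() at all.
client = LlamaStackClient(base_url="http://localhost:8321")  # placeholder endpoint
print(summarize(client, "Llama Stack standardizes the core building blocks of AI apps."))
```

The point of the duck-typed design is exactly this: libraries that take an OpenAI client as a dependency-injected parameter keep working when handed a Llama Stack client instead.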