# What does this PR do?

Automates the model list check by querying the distro. Adds support for both remote hosted distros and templates.

## Test Plan

Run on a remote hosted distro via `LLAMA_STACK_BASE_URL="https://llamastack-preview.fireworks.ai" pytest -s -v tests/client-sdk --report`

Run on a template via `LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/client-sdk --report`
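For context, below is a minimal sketch of what "querying the distro" for its model list can look like with the `llama_stack_client` Python SDK. The environment variable and base URL are taken from the test plan above; the `identifier` attribute and the overall flow are assumptions for illustration, not the PR's actual implementation.

```python
# Sketch: ask a running distro which models it serves, instead of hardcoding a list.
# Assumes the llama_stack_client package and a reachable distro endpoint.
import os

from llama_stack_client import LlamaStackClient

# Reuse the same environment variable as the test plan; the fallback URL is the
# remote hosted distro mentioned there.
base_url = os.environ.get(
    "LLAMA_STACK_BASE_URL", "https://llamastack-preview.fireworks.ai"
)
client = LlamaStackClient(base_url=base_url)

# Query the distro for its registered models. `identifier` is assumed to be the
# model's unique name in the listing.
served_models = [model.identifier for model in client.models.list()]
print(served_models)
```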