build: add missing dependencies for unit tests

Added the required dependencies to ensure unit tests can run
successfully. Without these dependencies, the following command would
fail due to missing imports:

```
uv run pytest -v -k "ollama" \
     --inference-model=llama3.2:3b-instruct-fp16 \
     llama_stack/providers/tests/inference/test_model_registration.py
```

Refactor `pyproject.toml` by moving test-specific dependencies into a
new `[test]` group. Heavy dependencies such as `torch` and `torchvision`
are resolved from the PyTorch CPU index, so only the CPU-only packages
are pulled.
Developers can now install both development and test dependencies with:

```
uv sync --extra dev --extra test
```
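For reference, the CPU-only pinning works by combining an explicit `uv` index with per-package sources. A minimal sketch of the resulting `pyproject.toml` layout (abbreviated; the full dependency lists are in the diff below):

```toml
# The [[tool.uv.index]] entry plus the per-package pins keep
# torch/torchvision resolving against the CPU-only wheel index.
[project.optional-dependencies]
test = [
    "torch>=2.6.0",
    "torchvision>=0.21.0",
    # ...remaining test dependencies
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[tool.uv.sources]
torch = [{ index = "pytorch-cpu" }]
torchvision = [{ index = "pytorch-cpu" }]
```

With `explicit = true`, the index is only consulted for packages pinned to it in `[tool.uv.sources]`, so the rest of the dependency tree still resolves from PyPI.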

Signed-off-by: Sébastien Han <seb@redhat.com>
Sébastien Han 2025-02-07 12:06:00 +01:00
parent a66b4c4c81
commit 2db949d38b
4 changed files with 846 additions and 263 deletions


@@ -12,6 +12,20 @@ We use `pytest` and all of its dynamism to enable the features needed. Specifica
- We use `pytest_collection_modifyitems` to filter tests based on the test config (if specified).
## Pre-requisites
Your development environment should have been configured as per the instructions in the
[CONTRIBUTING.md](../../../CONTRIBUTING.md) file. In particular, make sure to install the test extra
dependencies. Below is the full configuration:
```bash
$ cd llama-stack
$ uv sync --extra dev --extra test
$ uv pip install -e .
$ source .venv/bin/activate
```
## Common options
All tests support a `--providers` option, which takes a string of the form `api1=provider_fixture1,api2=provider_fixture2`. So, when testing safety (which needs both the inference and safety APIs) you can use `--providers inference=together,safety=meta_reference` to use these fixtures in concert.
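To make the format concrete, here is a hypothetical snippet (not the project's actual parsing code) showing how such a `--providers` string decomposes into per-API fixtures:

```python
def parse_providers(spec: str) -> dict[str, str]:
    """Split 'api1=fixture1,api2=fixture2' into {'api1': 'fixture1', ...}."""
    return dict(pair.split("=", 1) for pair in spec.split(",") if pair)

# Example from the docs above:
print(parse_providers("inference=together,safety=meta_reference"))
# → {'inference': 'together', 'safety': 'meta_reference'}
```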
@@ -50,6 +64,9 @@ pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py \
--env FIREWORKS_API_KEY=<...>
```
> [!TIP]
> If you're using `uv`, you can isolate test executions by prefixing commands with `uv run`, e.g. `uv run pytest ...`.
## Agents
The Agents API composes three other APIs underneath:


@@ -41,6 +41,7 @@ dependencies = [
dev = [
"pytest",
"pytest-asyncio",
"pytest-html",
"nbval", # For notebook testing
"black",
"ruff",
@@ -49,6 +50,19 @@ dev = [
"pre-commit",
"uvicorn",
"fastapi",
"ruamel.yaml", # needed for openapi generator
]
test = [
"openai",
"aiosqlite",
"ollama",
"torch>=2.6.0",
"fairscale>=0.4.13",
"torchvision>=0.21.0",
"lm-format-enforcer>=0.10.9",
"groq",
"opentelemetry-sdk",
"opentelemetry-exporter-otlp-proto-http",
]
docs = [
"sphinx-autobuild",
@@ -78,6 +92,10 @@ name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
[tool.uv.sources]
torch = [{ index = "pytorch-cpu" }]
torchvision = [{ index = "pytorch-cpu" }]
[tool.ruff]
line-length = 120
exclude = [


@@ -12,7 +12,7 @@ distro==1.9.0
exceptiongroup==1.2.2 ; python_full_version < '3.11'
filelock==3.17.0
fire==0.7.0
fsspec==2024.12.0
fsspec==2025.2.0
h11==0.14.0
httpcore==1.0.7
httpx==0.28.1
@@ -23,11 +23,11 @@ jsonschema==4.23.0
jsonschema-specifications==2024.10.1
llama-models==0.1.3
llama-stack-client==0.1.3
lxml==5.3.0
lxml==5.3.1
markdown-it-py==3.0.0
markupsafe==3.0.2
mdurl==0.1.2
numpy==2.2.2
numpy==2.2.3
packaging==24.2
pandas==2.2.3
pillow==11.1.0
@@ -50,7 +50,7 @@ setuptools==75.8.0
six==1.17.0
sniffio==1.3.1
termcolor==2.5.0
tiktoken==0.8.0
tiktoken==0.9.0
tqdm==4.67.1
typing-extensions==4.12.2
tzdata==2025.1

uv.lock (generated, 1066 changed lines): file diff suppressed because it is too large.