Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-12 12:06:04 +00:00)
Commit ea0d342c5d — qualify embedding model for vector-io tests?
Parent: ce7aa53935
1 changed file with 1 addition and 1 deletion
```diff
@@ -170,7 +170,7 @@ jobs:
       uv run --no-sync \
         pytest -sv --stack-config="files=inline::localfs,inference=inline::sentence-transformers,vector_io=${{ matrix.vector-io-provider }}" \
         tests/integration/vector_io \
-        --embedding-model nomic-ai/nomic-embed-text-v1.5 \
+        --embedding-model inline::sentence-transformers/nomic-ai/nomic-embed-text-v1.5 \
         --embedding-dimension 768

     - name: Check Storage and Memory Available After Tests
```
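The change qualifies the embedding model with its provider prefix, so the identifier becomes `<provider>/<model>` rather than a bare model name. A minimal sketch of how such a qualified id splits apart (the `split_qualified_model` helper is hypothetical, introduced here only to illustrate the format; it is not part of llama-stack):

```python
def split_qualified_model(model_id: str) -> tuple[str, str]:
    """Split a provider-qualified model id into (provider, model).

    Assumes the format "<provider>/<model>", where the provider segment
    (e.g. "inline::sentence-transformers") contains no "/" itself, while
    the model segment may (e.g. "nomic-ai/nomic-embed-text-v1.5").
    """
    provider, _, model = model_id.partition("/")
    return provider, model


provider, model = split_qualified_model(
    "inline::sentence-transformers/nomic-ai/nomic-embed-text-v1.5"
)
# provider: "inline::sentence-transformers"
# model:    "nomic-ai/nomic-embed-text-v1.5"
```

Because `partition` splits on the first `/` only, the org-scoped model name keeps its own slash intact.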