Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-07-26 14:17:44 +00:00
fix: remove sentence-transformers from remote vllm
vLLM itself can perform embeddings generation, so we don't need this extra provider.

Signed-off-by: Sébastien Han <seb@redhat.com>
Parent: 3c9a10d2fe
Commit: 7e30b5a466
5 changed files with 6 additions and 22 deletions
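The rationale above — that vLLM can generate embeddings itself, without a separate sentence-transformers provider — can be sketched as a direct call to vLLM's OpenAI-compatible embeddings endpoint. This is a minimal illustration, not code from the repository; the server URL and model name are assumptions.

```python
import json
import urllib.request

# Assumed local vLLM server; the /v1/embeddings path follows vLLM's
# OpenAI-compatible API. The model name is an assumption for illustration.
VLLM_URL = "http://localhost:8000/v1/embeddings"
MODEL = "intfloat/e5-mistral-7b-instruct"

def build_embeddings_request(texts, model=MODEL):
    """Build the JSON payload for an OpenAI-compatible embeddings request."""
    return {"model": model, "input": texts}

def fetch_embeddings(texts):
    """POST the payload to the vLLM server and return the embedding vectors."""
    req = urllib.request.Request(
        VLLM_URL,
        data=json.dumps(build_embeddings_request(texts)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [item["embedding"] for item in body["data"]]

# Payload construction works offline; fetch_embeddings needs a running server.
payload = build_embeddings_request(["hello world"])
```

With the inference provider pointed at a vLLM server like this, the inline sentence-transformers provider in the distribution spec becomes redundant, which is what the diff below removes.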
@@ -4,7 +4,6 @@ distribution_spec:
   providers:
     inference:
     - remote::vllm
-    - inline::sentence-transformers
     vector_io:
     - inline::faiss
     - remote::chromadb