llama-stack-mirror/llama_stack/providers/inline/inference
Ashwin Bharambe 6f9d622340
fix(api): update embeddings signature so inputs and outputs list align (#1161)
See Issue #922 

The change is slightly backwards-incompatible, but no callsite (in our
client codebases or stack-apps) ever passes a depth-2
`List[List[InterleavedContentItem]]` (which is now disallowed).
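
The aligned shape can be sketched as follows. This is an illustrative stub, not the stack's actual API: the type alias and the bare `List[List[float]]` return shape are assumptions standing in for the real content and response types.

```python
from typing import List

# Hypothetical stand-in for the stack's content item type, for
# illustration only (the real type is richer than a plain string).
InterleavedContentItem = str

def embeddings(model_id: str, contents: List[InterleavedContentItem]) -> List[List[float]]:
    """Sketch of the aligned signature: a flat list of content items in,
    one embedding vector out per item, so inputs and outputs align 1:1.
    A depth-2 List[List[InterleavedContentItem]] is no longer accepted."""
    # Dummy embedding: a one-element vector per item, purely illustrative.
    return [[float(len(item))] for item in contents]

vectors = embeddings("all-minilm:latest", ["hello", "world wide web"])
assert len(vectors) == 2  # one output vector per input item
```

The key invariant is that `len(outputs) == len(inputs)` holds for every call, which the old nested-list form made ambiguous.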

## Test Plan

```bash
$ cd llama_stack/providers/tests/inference
$ pytest -s -v -k fireworks test_embeddings.py \
   --inference-model nomic-ai/nomic-embed-text-v1.5 --env EMBEDDING_DIMENSION=784
$  pytest -s -v -k together test_embeddings.py \
   --inference-model togethercomputer/m2-bert-80M-8k-retrieval --env EMBEDDING_DIMENSION=784
$ pytest -s -v -k ollama test_embeddings.py \
   --inference-model all-minilm:latest --env EMBEDDING_DIMENSION=784
```

Also ran `tests/client-sdk/inference/test_embeddings.py`
2025-02-20 21:43:13 -08:00
| Name | Last commit | Date |
|---|---|---|
| meta_reference | ModelAlias -> ProviderModelEntry | 2025-02-20 14:02:36 -08:00 |
| sentence_transformers | build: format codebase imports using ruff linter (#1028) | 2025-02-13 10:06:21 -08:00 |
| vllm | fix(api): update embeddings signature so inputs and outputs list align (#1161) | 2025-02-20 21:43:13 -08:00 |
| __init__.py | precommit | 2024-11-08 17:58:58 -08:00 |