llama-stack/llama_stack
Ben Browning 406465622e
fix: Update QdrantConfig to QdrantVectorIOConfig (#1104)
# What does this PR do?

This fixes an import broken by merging #1079 before #1039: #1039 renamed `QdrantConfig` to `QdrantVectorIOConfig`, so the import of the old name added in #1079 needs to be updated to the new one.


## Test Plan

I ran the remote vllm provider inference tests against the latest main:
```
VLLM_URL="http://localhost:8001/v1" python -m pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py --providers "inference=vllm_remote"
```

That failed with:
```
  File "/home/bbrownin/src/llama-stack/llama_stack/providers/tests/vector_io/fixtures.py", line 20, in <module>
    from llama_stack.providers.remote.vector_io.qdrant import QdrantConfig
ImportError: Error importing plugin "llama_stack.providers.tests.vector_io.fixtures": cannot import name 'QdrantConfig' from 'llama_stack.providers.remote.vector_io.qdrant' (/home/bbrownin/src/llama-stack/llama_stack/providers/remote/vector_io/qdrant/__init__.py)
```

After this change, the import no longer fails and the tests pass.
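The failure above can be reproduced in miniature. Below is a hedged, self-contained sketch (not the actual llama-stack code; the stand-in module and names are illustrative) of how renaming a class in one module breaks code elsewhere that still references the old name, and what the fix looks like:

```python
# Illustrative sketch (not the actual llama-stack code): simulating how the
# rename QdrantConfig -> QdrantVectorIOConfig (from #1039) breaks a reference
# to the old name added elsewhere (in #1079).
import types

# Stand-in for llama_stack.providers.remote.vector_io.qdrant after the rename.
qdrant = types.ModuleType("qdrant")

class QdrantVectorIOConfig:
    """Renamed from QdrantConfig; only the new name is exported."""

qdrant.QdrantVectorIOConfig = QdrantVectorIOConfig

# A fixture still referencing the old name fails, mirroring the traceback above.
try:
    old_name = getattr(qdrant, "QdrantConfig")
except AttributeError as exc:
    old_name = None
    error = str(exc)

# The fix: reference the new name instead.
fixed = qdrant.QdrantVectorIOConfig
```

The same principle applies to the real fixture file: updating the single import in `fixtures.py` to the renamed class resolves the `ImportError`.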

Signed-off-by: Ben Browning <bbrownin@redhat.com>
2025-02-14 06:31:00 -08:00