llama-stack-mirror/llama_stack
Matthew Farrellee · 9a6e91cd93 · 2025-05-12 06:27:01 -07:00
fix: chromadb type hint (#2136)
```
$ INFERENCE_MODEL=meta-llama/Llama-3.2-3B-Instruct \
  CHROMADB_URL=http://localhost:8000 \
  llama stack build --image-type conda --image-name llama \
    --providers vector_io=remote::chromadb,inference=remote::ollama \
    --run
...
  File ".../llama_stack/providers/remote/vector_io/chroma/chroma.py", line 31, in <module>
    ChromaClientType = chromadb.AsyncHttpClient | chromadb.PersistentClient
TypeError: unsupported operand type(s) for |: 'function' and 'function'
```

Issue: `AsyncHttpClient` and `PersistentClient` are factory functions that return `AsyncClientAPI` and `ClientAPI` instances, respectively. The `|` operator cannot construct a type union from functions.

Previously the code used `Union[AsyncHttpClient, PersistentClient]`, which did not trigger an error: `typing.Union` accepts any callable, so the invalid alias went unnoticed at import time.
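
A standalone illustration of the difference, using stand-in functions rather than the real chromadb API:

```python
from typing import Union


def async_http_client():  # stand-in for a factory function such as chromadb.AsyncHttpClient
    ...


def persistent_client():  # stand-in for chromadb.PersistentClient
    ...


# typing.Union accepts any callable, so this never raised at import time,
# even though the resulting alias is not a meaningful type:
LegacyClientType = Union[async_http_client, persistent_client]

# The | operator is only defined between types, so this fails immediately:
#   async_http_client | persistent_client
#   TypeError: unsupported operand type(s) for |: 'function' and 'function'
```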

# What does this PR do?

Closes #2135
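
The fix implied by the title is to build the union from the client types the factories return, rather than from the factory functions themselves. A minimal sketch, assuming chromadb exposes these classes under `chromadb.api` (the exact import path may differ across chromadb versions):

```python
from chromadb.api import AsyncClientAPI, ClientAPI  # assumed import path; adjust per chromadb version

# Union of the *returned* client types, not the factory functions,
# so the alias is a valid type hint and `|` works at import time.
ChromaClientType = AsyncClientAPI | ClientAPI
```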
| Name | Latest commit | Date |
| --- | --- | --- |
| `apis` | feat: add metrics query API (#1394) | 2025-05-07 10:11:26 -07:00 |
| `cli` | chore(refact)!: simplify config management (#1105) | 2025-05-07 09:18:12 -07:00 |
| `distribution` | fix: catch TimeoutError in place of asyncio.TimeoutError (#2131) | 2025-05-12 11:49:59 +02:00 |
| `models` | fix: llama4 tool use prompt fix (#2103) | 2025-05-06 22:18:31 -07:00 |
| `providers` | fix: chromadb type hint (#2136) | 2025-05-12 06:27:01 -07:00 |
| `strong_typing` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `templates` | fix: revert "feat(provider): adding llama4 support in together inference provider (#2123)" (#2124) | 2025-05-08 15:18:16 -07:00 |
| `__init__.py` | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| `env.py` | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| `log.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `schema_utils.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |