---
description: |
  [Chroma](https://www.trychroma.com/) is an inline and remote vector
  database provider for Llama Stack. It allows you to store and query vectors directly within a Chroma database.
  That means you're not limited to storing vectors in memory or in a separate service.
  ## Features
  Chroma supports:
  - Storage of embeddings and their metadata
  - Vector search
  - Full-text search
  - Document storage
  - Metadata filtering
  - Multi-modal retrieval
  ## Usage
  To use Chroma in your Llama Stack project, follow these steps:
  1. Install the necessary dependencies.
  2. Configure your Llama Stack project to use Chroma.
  3. Start storing and querying vectors.
  ## Installation
  You can install Chroma using pip:
  ```bash
  pip install chromadb
  ```
  ## Documentation
  See [Chroma's documentation](https://docs.trychroma.com/docs/overview/introduction) for more details about Chroma in general.
sidebar_label: Chromadb
title: inline::chromadb
---
# inline::chromadb
## Description
[Chroma](https://www.trychroma.com/) is an inline and remote vector
database provider for Llama Stack. It allows you to store and query vectors directly within a Chroma database.
That means you're not limited to storing vectors in memory or in a separate service.
## Features
Chroma supports:
- Storage of embeddings and their metadata
- Vector search
- Full-text search
- Document storage
- Metadata filtering
- Multi-modal retrieval
## Usage
To use Chroma in your Llama Stack project, follow these steps:
1. Install the necessary dependencies.
2. Configure your Llama Stack project to use Chroma.
3. Start storing and querying vectors (see the example after the installation step below).
## Installation
You can install Chroma using pip:
```bash
pip install chromadb
```
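Once `chromadb` is installed, you can verify the store-and-query flow against a local, persistent Chroma database. The snippet below is a minimal sketch that talks to Chroma directly rather than through the Llama Stack APIs; the path, collection name, and documents are illustrative placeholders.
```python
# Minimal sketch: store and query a few documents in a local Chroma database.
# The path and collection name are placeholders; Chroma's default embedding
# function is used, so no extra model configuration is needed here.
import chromadb

client = chromadb.PersistentClient(path="./chroma_demo")
collection = client.get_or_create_collection(name="demo_docs")

# Store documents with ids and metadata; Chroma embeds them automatically.
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Chroma stores embeddings alongside their source documents.",
        "Llama Stack can use Chroma as an inline vector_io provider.",
    ],
    metadatas=[{"source": "notes"}, {"source": "notes"}],
)

# Vector search: query by text and get the closest documents back.
results = collection.query(query_texts=["Where are embeddings stored?"], n_results=2)
print(results["documents"])
```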
## Documentation
See [Chroma's documentation](https://docs.trychroma.com/docs/overview/introduction) for more details about Chroma in general.
## Configuration
| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `db_path` | `str` | No | | |
| `kvstore` | `utils.kvstore.config.RedisKVStoreConfig \| utils.kvstore.config.SqliteKVStoreConfig \| utils.kvstore.config.PostgresKVStoreConfig \| utils.kvstore.config.MongoDBKVStoreConfig` | No | `namespace=None type='sqlite' db_path='~/.llama/runtime/kvstore.db'` | Config for KV store backend |
## Sample Configuration
```yaml
db_path: ${env.CHROMADB_PATH}
kvstore:
  type: sqlite
  db_path: ${env.SQLITE_STORE_DIR:=~/.llama/dummy}/chroma_inline_registry.db
```
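The same settings can also be expressed programmatically. The sketch below mirrors the sample YAML with env vars resolved by hand; the import path for `SqliteKVStoreConfig` is an assumption based on the module name shown in the configuration table, so verify it against your installed `llama_stack` version. No Llama Stack API is called here.
```python
# Sketch only: the inline Chroma provider settings, built in Python.
# The import path is assumed from the module name in the table above
# (utils.kvstore.config); check your llama_stack version for the exact location.
from llama_stack.providers.utils.kvstore.config import SqliteKVStoreConfig

# Equivalent of the sample YAML's kvstore block.
kvstore = SqliteKVStoreConfig(
    db_path="~/.llama/dummy/chroma_inline_registry.db",
)

provider_config = {
    "db_path": "/path/to/chroma",      # stands in for ${env.CHROMADB_PATH}
    "kvstore": kvstore.model_dump(),   # type='sqlite', db_path=..., namespace=None
}
print(provider_config)
```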