---
orphan: true
---
# inline::chromadb
## Description

Chroma is an inline and remote vector database provider for Llama Stack. It allows you to store and query vectors directly within a Chroma database. That means you're not limited to storing vectors in memory or in a separate service.
## Features

Chroma supports the following (see the example after this list):
- Storing embeddings and their metadata
- Vector search
- Full-text search
- Document storage
- Metadata filtering
- Multi-modal retrieval
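
As a quick illustration of document storage, vector search, and metadata filtering, here is a minimal sketch that uses Chroma's Python client directly rather than going through Llama Stack; the database path, collection name, and documents are placeholders:

```python
import chromadb

# Persistent client that stores data on local disk (path is a placeholder).
client = chromadb.PersistentClient(path="./chroma_db")

# A collection holds documents, their embeddings, and optional metadata.
collection = client.get_or_create_collection(name="demo_docs")

# Document storage: embeddings are computed by Chroma's default embedding
# function when none are supplied explicitly.
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Llama Stack standardizes the building blocks of AI applications.",
        "Chroma is an inline and remote vector database provider.",
    ],
    metadatas=[{"topic": "llama-stack"}, {"topic": "chroma"}],
)

# Vector search combined with metadata filtering.
results = collection.query(
    query_texts=["Which provider stores vectors?"],
    n_results=1,
    where={"topic": "chroma"},
)
print(results["documents"])
```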

## Usage

To use Chroma in your Llama Stack project, follow these steps:

1. Install the necessary dependencies.
2. Configure your Llama Stack project to use the `inline::chromadb` provider (a sample run configuration follows this list).
3. Start storing and querying vectors.
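
For step 2, the provider is typically referenced from the stack's run configuration. The snippet below is only a sketch: the `provider_type` and the `db_path` field come from this page, while the surrounding `providers` / `vector_io` layout and the `provider_id` value are assumptions to adapt to your own run configuration:

```yaml
# Sketch of a run configuration excerpt; the key layout is an assumption.
providers:
  vector_io:
  - provider_id: chromadb
    provider_type: inline::chromadb
    config:
      db_path: ${env.CHROMADB_PATH}
```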

## Installation

You can install Chroma using pip:

```bash
pip install chromadb
```
## Documentation

See [Chroma's documentation](https://docs.trychroma.com/) for more details about Chroma in general.

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `db_path` | `<class 'str'>` | No | PydanticUndefined | |

## Sample Configuration

```yaml
db_path: ${env.CHROMADB_PATH}
```