From 1e2d5d0731fbc8b9077734cd59bd590c8274c101 Mon Sep 17 00:00:00 2001
From: Francisco Javier Arceo
Date: Thu, 20 Feb 2025 21:20:13 -0500
Subject: [PATCH] updated docs

Signed-off-by: Francisco Javier Arceo
---
 docs/source/index.md | 2 +
 docs/source/providers/index.md | 45 +++++++++++++++++++
 docs/source/providers/vector_db/chromadb.md | 33 ++++++++++++++
 docs/source/providers/vector_db/faiss.md | 30 +++++++++++++
 docs/source/providers/vector_db/index.md | 12 +++++
 docs/source/providers/vector_db/pgvector.md | 28 ++++++++++++
 docs/source/providers/vector_db/qdrant.md | 28 ++++++++++++
 docs/source/providers/vector_db/sqlite-vec.md | 30 +++++++++++++
 8 files changed, 208 insertions(+)
 create mode 100644 docs/source/providers/index.md
 create mode 100644 docs/source/providers/vector_db/chromadb.md
 create mode 100644 docs/source/providers/vector_db/faiss.md
 create mode 100644 docs/source/providers/vector_db/index.md
 create mode 100644 docs/source/providers/vector_db/pgvector.md
 create mode 100644 docs/source/providers/vector_db/qdrant.md
 create mode 100644 docs/source/providers/vector_db/sqlite-vec.md

diff --git a/docs/source/index.md b/docs/source/index.md
index cb2355bfd..b6fd314b7 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -67,6 +67,7 @@ A number of "adapters" are available for some popular Inference and Vector Store
 | **Provider** | **Environments** |
 | :----: | :----: |
 | FAISS | Single Node |
+| SQLite-Vec | Single Node |
 | Chroma | Hosted and Single Node |
 | Postgres (PGVector) | Hosted and Single Node |
 | Weaviate | Hosted |
@@ -88,6 +89,7 @@ self
 introduction/index
 getting_started/index
 concepts/index
+providers/index
 distributions/index
 distributions/selection
 building_applications/index
diff --git a/docs/source/providers/index.md b/docs/source/providers/index.md
new file mode 100644
index 000000000..335cad69f
--- /dev/null
+++ b/docs/source/providers/index.md
@@ -0,0 +1,45 @@
+# Providers Overview
+
+The goal of Llama Stack is to build an
ecosystem where users can easily swap out different implementations for the same API. Examples include:
+- LLM inference providers (e.g., Fireworks, Together, AWS Bedrock, Groq, Cerebras, SambaNova, etc.),
+- Vector databases (e.g., ChromaDB, Weaviate, Qdrant, FAISS, PGVector, etc.),
+- Safety providers (e.g., Meta's Llama Guard, AWS Bedrock Guardrails, etc.)
+
+Providers come in two flavors:
+- **Remote**: the provider runs as a separate service external to the Llama Stack codebase. Llama Stack contains a small amount of adapter code.
+- **Inline**: the provider is fully specified and implemented within the Llama Stack codebase. It may be a simple wrapper around an existing library, or a full-fledged implementation within Llama Stack.
+
+Importantly, Llama Stack always strives to provide at least one fully "local" provider for each API so you can iterate on a fully featured environment locally.
+
+## Agents
+
+## DatasetIO
+
+## Eval
+
+## Inference
+
+## iOS
+
+## Post Training
+
+## Safety
+
+## Scoring
+
+## Telemetry
+
+## Tool Runtime
+
+## [Vector DBs](vector_db/index.md)
+
+```{toctree}
+:maxdepth: 1
+
+vector_db/chromadb
+vector_db/sqlite-vec
+vector_db/faiss
+vector_db/pgvector
+vector_db/qdrant
+vector_db/weaviate
+```
diff --git a/docs/source/providers/vector_db/chromadb.md b/docs/source/providers/vector_db/chromadb.md
new file mode 100644
index 000000000..d0826aca4
--- /dev/null
+++ b/docs/source/providers/vector_db/chromadb.md
@@ -0,0 +1,33 @@
+# Chroma
+
+[Chroma](https://www.trychroma.com/) is an inline and remote vector
+database provider for Llama Stack. It allows you to store and query vectors directly within a Chroma database.
+That means you're not limited to storing vectors in memory or in a separate service.
+
+## Features
+Chroma supports:
+- Embedding and metadata storage
+- Vector search
+- Full-text search
+- Document storage
+- Metadata filtering
+- Multi-modal retrieval
+
+## Usage
+
+To use Chroma in your Llama Stack project, follow these steps:
+
+1. Install the necessary dependencies.
+2. Configure your Llama Stack project to use Chroma.
+3. Start storing and querying vectors.
+
+## Installation
+
+You can install Chroma using pip:
+
+```bash
+pip install chromadb
+```
+
+## Documentation
+See [Chroma's documentation](https://docs.trychroma.com/docs/overview/introduction) for more details about Chroma in general.
diff --git a/docs/source/providers/vector_db/faiss.md b/docs/source/providers/vector_db/faiss.md
new file mode 100644
index 000000000..ba7e26c34
--- /dev/null
+++ b/docs/source/providers/vector_db/faiss.md
@@ -0,0 +1,30 @@
+# Faiss
+
+[Faiss](https://github.com/facebookresearch/faiss) is an inline vector database provider for Llama Stack. It
+allows you to store and query vectors directly in memory.
+That means you'll get fast and efficient vector retrieval.
+
+## Features
+
+- Lightweight and easy to use
+- Fully integrated with Llama Stack
+- GPU support
+
+## Usage
+
+To use Faiss in your Llama Stack project, follow these steps:
+
+1. Install the necessary dependencies.
+2. Configure your Llama Stack project to use Faiss.
+3. Start storing and querying vectors.
+
+## Installation
+
+You can install Faiss using pip:
+
+```bash
+pip install faiss-cpu
+```
+## Documentation
+See [Faiss' documentation](https://faiss.ai/) or the [Faiss Wiki](https://github.com/facebookresearch/faiss/wiki) for
+more details about Faiss in general.
diff --git a/docs/source/providers/vector_db/index.md b/docs/source/providers/vector_db/index.md
new file mode 100644
index 000000000..5da0d36e4
--- /dev/null
+++ b/docs/source/providers/vector_db/index.md
@@ -0,0 +1,12 @@
+## Vector DB Providers
+
+The goal of Llama Stack is to build an ecosystem where users can easily swap out different vector database
+implementations behind the same API.
+
+Examples include:
+- [FAISS](faiss.md) (inline)
+- [SQLite-Vec](sqlite-vec.md) (inline)
+- [ChromaDB](chromadb.md) (inline and remote)
+- [Weaviate](weaviate.md) (remote)
+- [Qdrant](qdrant.md) (remote)
+- [PGVector](pgvector.md) (remote)
diff --git a/docs/source/providers/vector_db/pgvector.md b/docs/source/providers/vector_db/pgvector.md
new file mode 100644
index 000000000..ef3bd628d
--- /dev/null
+++ b/docs/source/providers/vector_db/pgvector.md
@@ -0,0 +1,28 @@
+# Postgres PGVector
+
+[PGVector](https://github.com/pgvector/pgvector) is a remote vector database provider for Llama Stack. It
+allows you to store and query vectors directly within a Postgres database.
+That means your vectors can live alongside the rest of your relational data.
+
+## Features
+
+- Easy to use
+- Fully integrated with Llama Stack
+
+## Usage
+
+To use PGVector in your Llama Stack project, follow these steps:
+
+1. Install the necessary dependencies.
+2. Configure your Llama Stack project to use PGVector.
+3. Start storing and querying vectors.
+
+## Installation
+
+You can install PGVector using Docker:
+
+```bash
+docker pull pgvector/pgvector:pg17
+```
+## Documentation
+See [PGVector's documentation](https://github.com/pgvector/pgvector) for more details about PGVector in general.
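Once the container is running, queries go through ordinary SQL: pgvector adds distance operators such as `<->` (L2 distance) that can be used in an `ORDER BY` clause. A minimal sketch of building such a nearest-neighbour query for use with a Postgres driver like `psycopg` — the `items` table and its `embedding` column are hypothetical names for illustration:

```python
# Hypothetical schema, created once in Postgres:
#   CREATE EXTENSION IF NOT EXISTS vector;
#   CREATE TABLE items (id bigserial PRIMARY KEY, embedding vector(384));

def knn_query(table: str, k: int) -> str:
    # "<->" is pgvector's L2-distance operator; "%s" is the driver-side
    # placeholder that a client such as psycopg fills with the query vector
    return f"SELECT id FROM {table} ORDER BY embedding <-> %s LIMIT {k}"

print(knn_query("items", 5))
# SELECT id FROM items ORDER BY embedding <-> %s LIMIT 5
```

The driver would execute this with the query embedding as the bound parameter; pgvector also offers `<#>` (negative inner product) and `<=>` (cosine distance) for other similarity metrics.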
diff --git a/docs/source/providers/vector_db/qdrant.md b/docs/source/providers/vector_db/qdrant.md
new file mode 100644
index 000000000..7097ecec4
--- /dev/null
+++ b/docs/source/providers/vector_db/qdrant.md
@@ -0,0 +1,28 @@
+# Qdrant
+
+[Qdrant](https://qdrant.tech/documentation/) is a remote vector database provider for Llama Stack. It
+allows you to store and query vectors in a dedicated vector search service.
+That means you'll get fast and efficient vector retrieval.
+
+## Features
+
+- Easy to use
+- Fully integrated with Llama Stack
+
+## Usage
+
+To use Qdrant in your Llama Stack project, follow these steps:
+
+1. Install the necessary dependencies.
+2. Configure your Llama Stack project to use Qdrant.
+3. Start storing and querying vectors.
+
+## Installation
+
+You can install Qdrant using Docker:
+
+```bash
+docker pull qdrant/qdrant
+```
+## Documentation
+See the [Qdrant documentation](https://qdrant.tech/documentation/) for more details about Qdrant in general.
diff --git a/docs/source/providers/vector_db/sqlite-vec.md b/docs/source/providers/vector_db/sqlite-vec.md
new file mode 100644
index 000000000..2fd2782f3
--- /dev/null
+++ b/docs/source/providers/vector_db/sqlite-vec.md
@@ -0,0 +1,30 @@
+# SQLite-Vec
+
+[SQLite-Vec](https://github.com/asg017/sqlite-vec) is an inline vector database provider for Llama Stack. It
+allows you to store and query vectors directly within an SQLite database.
+That means you're not limited to storing vectors in memory or in a separate service.
+
+## Features
+
+- Lightweight and easy to use
+- Fully integrated with Llama Stack
+
+## Usage
+
+To use SQLite-Vec in your Llama Stack project, follow these steps:
+
+1. Install the necessary dependencies.
+2. Configure your Llama Stack project to use SQLite-Vec.
+3. Start storing and querying vectors.
+
+## Installation
+
+You can install SQLite-Vec using pip:
+
+```bash
+pip install sqlite-vec
+```
+
+## Documentation
+
+See [sqlite-vec's GitHub repo](https://github.com/asg017/sqlite-vec/tree/main) for more details about sqlite-vec in general.