Added Elasticsearch in docs + ci-test build

Enrico Zimuel 2025-11-06 15:37:31 +01:00
parent 9bfe5245a0
commit 98a3df09bb
No known key found for this signature in database
GPG key ID: 6CB203F6934A69F1
9 changed files with 7 additions and 4 deletions


@@ -9,7 +9,7 @@ sidebar_position: 2
 The goal of Llama Stack is to build an ecosystem where users can easily swap out different implementations for the same API. Examples for these include:
 - LLM inference providers (e.g., Fireworks, Together, AWS Bedrock, Groq, Cerebras, SambaNova, vLLM, etc.),
-- Vector databases (e.g., ChromaDB, Weaviate, Qdrant, Milvus, FAISS, PGVector, etc.),
+- Vector databases (e.g., ChromaDB, Weaviate, Qdrant, Milvus, FAISS, PGVector, Elasticsearch, etc.),
 - Safety providers (e.g., Meta's Llama Guard, AWS Bedrock Guardrails, etc.)
 Providers come in two flavors:


@@ -54,7 +54,7 @@ Llama Stack consists of a server (with multiple pluggable API providers) and Cli
 Llama Stack provides adapters for popular providers across all API categories:
 - **Inference**: Meta Reference, Ollama, Fireworks, Together, NVIDIA, vLLM, AWS Bedrock, OpenAI, Anthropic, and more
-- **Vector Databases**: FAISS, Chroma, Milvus, Postgres, Weaviate, Qdrant, and others
+- **Vector Databases**: FAISS, Chroma, Milvus, Postgres, Weaviate, Qdrant, Elasticsearch and others
 - **Safety**: Llama Guard, Prompt Guard, Code Scanner, AWS Bedrock
 - **Training & Evaluation**: HuggingFace, TorchTune, NVIDIA NEMO


@@ -9,7 +9,7 @@ sidebar_position: 1
 The goal of Llama Stack is to build an ecosystem where users can easily swap out different implementations for the same API. Examples for these include:
 - LLM inference providers (e.g., Meta Reference, Ollama, Fireworks, Together, AWS Bedrock, Groq, Cerebras, SambaNova, vLLM, OpenAI, Anthropic, Gemini, WatsonX, etc.),
-- Vector databases (e.g., FAISS, SQLite-Vec, ChromaDB, Weaviate, Qdrant, Milvus, PGVector, etc.),
+- Vector databases (e.g., FAISS, SQLite-Vec, ChromaDB, Weaviate, Qdrant, Milvus, PGVector, Elasticsearch, etc.),
 - Safety providers (e.g., Meta's Llama Guard, Prompt Guard, Code Scanner, AWS Bedrock Guardrails, etc.),
 - Tool Runtime providers (e.g., RAG Runtime, Brave Search, etc.)


@@ -159,7 +159,8 @@ const sidebars: SidebarsConfig = {
         'providers/vector_io/remote_milvus',
         'providers/vector_io/remote_pgvector',
         'providers/vector_io/remote_qdrant',
-        'providers/vector_io/remote_weaviate'
+        'providers/vector_io/remote_weaviate',
+        'providers/vector_io/remote_elasticsearch'
       ],
     },
     {


@@ -323,6 +323,7 @@ exclude = [
     "^src/llama_stack/providers/remote/vector_io/qdrant/",
     "^src/llama_stack/providers/remote/vector_io/sample/",
     "^src/llama_stack/providers/remote/vector_io/weaviate/",
+    "^src/llama_stack/providers/remote/vector_io/elasticsearch/",
     "^src/llama_stack/providers/utils/bedrock/client\\.py$",
     "^src/llama_stack/providers/utils/bedrock/refreshable_boto_session\\.py$",
     "^src/llama_stack/providers/utils/inference/embedding_mixin\\.py$",


@@ -27,6 +27,7 @@ distribution_spec:
     - provider_type: remote::pgvector
     - provider_type: remote::qdrant
     - provider_type: remote::weaviate
+    - provider_type: remote::elasticsearch
   files:
     - provider_type: inline::localfs
   safety:
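The distribution_spec hunk above only declares the new `remote::elasticsearch` provider type. As a rough sketch of how a user might then select it in a run configuration: the `provider_id` value and the `config.url` key below are illustrative assumptions, not taken from this commit.

```yaml
# Hypothetical run.yaml fragment -- only "provider_type: remote::elasticsearch"
# appears in this commit; provider_id and config.url are assumed for illustration.
providers:
  vector_io:
    - provider_id: elasticsearch
      provider_type: remote::elasticsearch
      config:
        url: http://localhost:9200
```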