From c24bef1e01b1906cd56bff90718b32ab09cf9249 Mon Sep 17 00:00:00 2001
From: Hardik Shah
Date: Tue, 21 Jan 2025 23:38:28 -0800
Subject: [PATCH] update README.md

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 792df43f8..b1878d7e4 100644
--- a/README.md
+++ b/README.md
@@ -63,6 +63,8 @@ We focus on making it easy to build production applications with the Llama model
 
 ### Distributions
 
+A Llama Stack Distribution (or "distro") is a pre-configured bundle of provider implementations for each API component. Distributions make it easy to get started with a specific deployment scenario: you can begin with a local development setup (e.g., ollama) and seamlessly transition to production (e.g., Fireworks) without changing your application code. Here are some of the distributions we support:
+
 | **Distribution** | **Llama Stack Docker** | Start This Distribution |
 |:---------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------:|
 | Meta Reference | [llamastack/distribution-meta-reference-gpu](https://hub.docker.com/repository/docker/llamastack/distribution-meta-reference-gpu/general) | [Guide](https://llama-stack.readthedocs.io/en/latest/distributions/self_hosted_distro/meta-reference-gpu.html) |
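
The "swap distributions without changing application code" claim in the added paragraph can be illustrated with a short client sketch. This is not part of the patch itself; it is a minimal example assuming the `llama-stack-client` Python SDK, and the base URLs, environment variable names, and model ID below are placeholders rather than values taken from the README.

```python
# Hedged sketch: the same application code can talk to a local distro
# (e.g., one backed by ollama) or a hosted one (e.g., Fireworks) by
# changing only the endpoint it points at. URLs, env var names, and the
# model ID here are illustrative assumptions, not documented defaults.
import os

from llama_stack_client import LlamaStackClient

# Point the client at whichever distribution is currently running.
client = LlamaStackClient(
    base_url=os.environ.get("LLAMA_STACK_URL", "http://localhost:8321"),
)

# Application logic stays the same regardless of the distribution behind the URL.
response = client.inference.chat_completion(
    model_id=os.environ.get("LLAMA_STACK_MODEL", "meta-llama/Llama-3.2-3B-Instruct"),
    messages=[{"role": "user", "content": "Hello from the same app code!"}],
)
print(response.completion_message.content)
```

Switching from local development to production would then mean exporting a different `LLAMA_STACK_URL` (and possibly model ID), while the calling code above is left untouched.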