feat: consolidate most distros into "starter"

* Removes a number of distros
* The removed distros are consolidated into the "starter" distribution
* Adds documentation for the "starter" distribution
* Partially reverts https://github.com/meta-llama/llama-stack/pull/2482
  since inference providers are disabled by default and can be turned on
  manually via an environment variable (see the sketch below).
* Disables safety in starter distro
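For illustration, turning one of the disabled inference providers back on could look like the sketch below. The `ENABLE_OLLAMA` variable name is an assumption used only to illustrate the toggle, not a confirmed part of the starter run config; the run command reuses the path shown in the docs further down.

```
# Hypothetical sketch: ENABLE_OLLAMA is an assumed variable name, shown only
# to illustrate enabling a disabled inference provider at run time.
ENABLE_OLLAMA=ollama llama stack run ~/.llama/distributions/llamastack-starter/starter-run.yaml
```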

Closes: #2502
Signed-off-by: Sébastien Han <seb@redhat.com>
Sébastien Han 2025-06-25 16:09:41 +02:00
parent f1c62e0af0
commit 6d8e2c6212
132 changed files with 1009 additions and 10845 deletions


@@ -141,9 +141,9 @@ You may then pick a template to build your distribution with providers fitted to
For example, to build a distribution with TGI as the inference provider, you can run:
```
-$ llama stack build --template tgi
+$ llama stack build --template starter
...
-You can now edit ~/.llama/distributions/llamastack-tgi/tgi-run.yaml and run `llama stack run ~/.llama/distributions/llamastack-tgi/tgi-run.yaml`
+You can now edit ~/.llama/distributions/llamastack-starter/starter-run.yaml and run `llama stack run ~/.llama/distributions/llamastack-starter/starter-run.yaml`
```
:::
:::{tab-item} Building from Scratch
@@ -183,26 +183,7 @@ You can now edit ~/.llama/distributions/llamastack-my-local-stack/my-local-stack
- The config file will be of contents like the ones in `llama_stack/templates/*build.yaml`.
```
-$ cat llama_stack/templates/ollama/build.yaml
-name: ollama
-distribution_spec:
-  description: Like local, but use ollama for running LLM inference
-  providers:
-    inference: remote::ollama
-    memory: inline::faiss
-    safety: inline::llama-guard
-    agents: inline::meta-reference
-    telemetry: inline::meta-reference
-image_name: ollama
-image_type: conda
-# If some providers are external, you can specify the path to the implementation
-external_providers_dir: ~/.llama/providers.d
-```
-```
-llama stack build --config llama_stack/templates/ollama/build.yaml
+llama stack build --config llama_stack/templates/starter/build.yaml
```
:::
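Since the inline ollama build.yaml example is removed above, a rough sketch of what `llama_stack/templates/starter/build.yaml` might contain is shown here for orientation. The field layout mirrors the removed ollama example; the description, provider list, and image settings are illustrative assumptions rather than the actual contents of the starter template.

```
# Illustrative sketch only -- not the real starter template.
# Layout mirrors the removed ollama example above; safety is omitted because
# this commit disables it in the starter distro.
name: starter
distribution_spec:
  description: Starter distribution bundling several inference providers
  providers:
    inference: remote::ollama   # one of several bundled providers, disabled by default
    memory: inline::faiss
    agents: inline::meta-reference
    telemetry: inline::meta-reference
image_type: conda
```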
@@ -268,11 +249,11 @@ Podman is supported as an alternative to Docker. Set `CONTAINER_BINARY` to `podman`
To build a container image, you may start off from a template and use the `--image-type container` flag to specify `container` as the build image type.
```
-llama stack build --template ollama --image-type container
+llama stack build --template starter --image-type container
```
```
-$ llama stack build --template ollama --image-type container
+$ llama stack build --template starter --image-type container
...
Containerfile created successfully in /tmp/tmp.viA3a3Rdsg/Containerfile
FROM python:3.10-slim
...
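Combining the Podman note from the hunk header above with the updated command, a Podman-based build of the starter image would look like the following; the `CONTAINER_BINARY` variable and the flags are taken from the surrounding documentation.

```
# Build the starter container image with Podman instead of Docker,
# per the CONTAINER_BINARY note above.
CONTAINER_BINARY=podman llama stack build --template starter --image-type container
```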