feat: consolidate most distros into "starter"

* Removes a number of individual distributions
* Folds the removed distributions into the "starter" distribution
* Adds documentation for "starter"
* Partially reverts https://github.com/meta-llama/llama-stack/pull/2482,
  since inference providers are now disabled by default and can be turned on
  manually via an environment variable (see the sketch below)
* Disables safety in the starter distro

Closes: #2502
Signed-off-by: Sébastien Han <seb@redhat.com>
Author: Sébastien Han <seb@redhat.com>
Date: 2025-06-25 16:09:41 +02:00
Commit: 6d8e2c6212 (parent: f1c62e0af0)
132 changed files, 1009 additions, 10845 deletions
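To illustrate the env-variable point from the commit message: a provider that ships disabled in the starter distribution would be switched on at launch by exporting an enable flag plus its provider-specific settings. This is a hedged sketch based only on the commit description; the `ENABLE_OLLAMA` variable name and the `starter` run target are assumptions, not confirmed by this diff.

```bash
# Hypothetical sketch: turn on the Ollama inference provider in the starter
# distro via environment variables. Variable names are assumptions; check the
# starter run.yaml for the ones it actually reads.
export ENABLE_OLLAMA=ollama
export OLLAMA_URL=http://localhost:11434
# The run target may instead be the path to the generated starter run.yaml.
llama stack run starter --image-type venv
```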


@@ -9,13 +9,11 @@ Ollama inference provider for running local models through the Ollama runtime.
| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `<class 'str'>` | No | http://localhost:11434 | |
-| `raise_on_connect_error` | `<class 'bool'>` | No | True | |
## Sample Configuration
```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
-raise_on_connect_error: true
```
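The `${env.OLLAMA_URL:=http://localhost:11434}` form presumably falls back to the bundled default when `OLLAMA_URL` is unset, so pointing the provider at a different host is just a matter of exporting the variable before starting the stack. A minimal sketch with a placeholder hostname:

```bash
# Sketch: override the default Ollama endpoint. The hostname is a placeholder;
# the default http://localhost:11434 is used when OLLAMA_URL is not set.
export OLLAMA_URL=http://gpu-box.internal:11434
llama stack run starter --image-type venv
```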


@@ -15,7 +15,7 @@ RunPod inference provider for running models on RunPod's cloud GPU platform.
```yaml
url: ${env.RUNPOD_URL:=}
-api_token: ${env.RUNPOD_API_TOKEN:=}
+api_token: ${env.RUNPOD_API_TOKEN}
```
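The change above swaps `${env.RUNPOD_API_TOKEN:=}` (which presumably resolves to an empty string when the variable is unset) for the plain `${env.RUNPOD_API_TOKEN}` reference, which expects the token to be supplied once the provider is actually enabled; the Together change below follows the same pattern. A hedged sketch of supplying it, with placeholder values:

```bash
# Sketch: provide RunPod settings before enabling the provider.
# Both values are placeholders, not real endpoints or credentials.
export RUNPOD_URL=https://api.runpod.ai/v2/my-endpoint/openai/v1
export RUNPOD_API_TOKEN=rp_xxxxxxxxxxxxxxxx
llama stack run starter --image-type venv
```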


@@ -15,7 +15,7 @@ Together AI inference provider for open-source models and collaborative AI devel
```yaml
url: https://api.together.xyz/v1
-api_key: ${env.TOGETHER_API_KEY:=}
+api_key: ${env.TOGETHER_API_KEY}
```


@@ -23,7 +23,7 @@ To use the HF SFTTrainer in your Llama Stack project, follow these steps:
You can access the HuggingFace trainer via the `ollama` distribution:
```bash
-llama stack build --template ollama --image-type venv
+llama stack build --template starter --image-type venv
llama stack run --image-type venv ~/.llama/distributions/ollama/ollama-run.yaml
```
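The build step above now targets the starter template while the run line still points at the Ollama run file; if the stack was built from starter, the corresponding run configuration would presumably live under the starter distribution. A sketch assuming the default layout:

```bash
# Sketch: run the stack built from the starter template. The path assumes the
# default ~/.llama/distributions layout and may differ on your machine.
llama stack run --image-type venv ~/.llama/distributions/starter/starter-run.yaml
```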