Latest commit c4349f532b by Sébastien Han: feat: consolidate most distros into "starter" (#2516)
# What does this PR do?

* Removes a number of distros
* The removed distros have been folded into the "starter" distribution
* Documentation for "starter" has been added
* Partially reverts https://github.com/meta-llama/llama-stack/pull/2482,
  since inference providers are now disabled by default and can be turned on
  manually via an environment variable.
* Disables safety in the starter distro
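As a rough sketch of the resulting workflow (the variable and distro names below are assumptions for illustration; check the starter distro documentation for the exact names):

```shell
# Hypothetical invocation: inference providers in the "starter" distro
# are disabled by default, and a given provider is enabled by setting
# an environment variable before starting the stack. The variable name
# shown here is an assumption, not confirmed by this PR description.
ENABLE_OLLAMA=ollama llama stack run starter
```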

Closes: https://github.com/meta-llama/llama-stack/issues/2502.

~Needs: https://github.com/meta-llama/llama-stack/pull/2482 for Ollama
to work properly in the CI.~

TODO:

- [ ] We can only update `install.sh` when we get a new release.
- [x] Update providers documentation
- [ ] Update notebooks to reference starter instead of ollama

Signed-off-by: Sébastien Han <seb@redhat.com>
2025-07-04 15:58:03 +02:00
| Name | Last commit message | Last commit date |
|---|---|---|
| _static | feat: Add webmethod for deleting openai responses (#2160) | 2025-06-30 11:28:02 +02:00 |
| notebooks | feat: Add Nvidia e2e beginner notebook and tool calling notebook (#1964) | 2025-06-16 11:29:01 -04:00 |
| openapi_generator | feat: Add webmethod for deleting openai responses (#2160) | 2025-06-30 11:28:02 +02:00 |
| resources | Several documentation fixes and fix link to API reference | 2025-02-04 14:00:43 -08:00 |
| source | feat: consolidate most distros into "starter" (#2516) | 2025-07-04 15:58:03 +02:00 |
| zero_to_hero_guide | feat: consolidate most distros into "starter" (#2516) | 2025-07-04 15:58:03 +02:00 |
| conftest.py | fix: sleep after notebook test | 2025-03-23 14:03:35 -07:00 |
| contbuild.sh | Fix broken links with docs | 2024-11-22 20:42:17 -08:00 |
| dog.jpg | Support for Llama3.2 models and Swift SDK (#98) | 2024-09-25 10:29:58 -07:00 |
| getting_started.ipynb | docs: Add quick_start.ipynb notebook equivalent of index.md Quickstart guide (#2128) | 2025-07-03 13:55:43 +02:00 |
| getting_started_llama4.ipynb | docs: Add quick_start.ipynb notebook equivalent of index.md Quickstart guide (#2128) | 2025-07-03 13:55:43 +02:00 |
| getting_started_llama_api.ipynb | docs: Add quick_start.ipynb notebook equivalent of index.md Quickstart guide (#2128) | 2025-07-03 13:55:43 +02:00 |
| license_header.txt | Initial commit | 2024-07-23 08:32:33 -07:00 |
| make.bat | feat(pre-commit): enhance pre-commit hooks with additional checks (#2014) | 2025-04-30 11:35:49 -07:00 |
| Makefile | first version of readthedocs (#278) | 2024-10-22 10:15:58 +05:30 |
| quick_start.ipynb | docs: Add quick_start.ipynb notebook equivalent of index.md Quickstart guide (#2128) | 2025-07-03 13:55:43 +02:00 |
| readme.md | chore: use groups when running commands (#2298) | 2025-05-28 09:13:16 -07:00 |

# Llama Stack Documentation

Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our ReadTheDocs page.

## Render locally

From the llama-stack root directory, run the following command to render the docs locally:

```shell
uv run --group docs sphinx-autobuild docs/source docs/build/html --write-all
```

You can then open the docs in your browser at http://localhost:8000.
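If you only need a one-off build rather than a live-reloading server, a plain `sphinx-build` invocation should also work (a sketch, assuming the same `docs` dependency group and the same source/output paths as the command above):

```shell
# One-shot HTML build, no file watching or local server.
uv run --group docs sphinx-build -b html docs/source docs/build/html
```

The rendered pages land in `docs/build/html`; open `index.html` from there directly in a browser.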

## Content

Try out Llama Stack's capabilities through our detailed Jupyter notebooks: