diff --git a/docs/source/getting_started/distributions/index.md b/docs/source/getting_started/distributions/index.md
deleted file mode 100644
index c98a6bad6..000000000
--- a/docs/source/getting_started/distributions/index.md
+++ /dev/null
@@ -1,18 +0,0 @@
-# Llama Stack Distribution
-
-A Distribution is where APIs and Providers are assembled together to provide a consistent whole to the end application developer. You can mix-and-match providers -- some could be backed by local code and some could be remote. As a hobbyist, you can serve a small model locally, but can choose a cloud provider for a large model. Regardless, the higher level APIs your app needs to work with don't need to change at all. You can even imagine moving across the server / mobile-device boundary as well always using the same uniform set of APIs for developing Generative AI applications.
-
-We offer three types of distributions:
-
-1. [Self-Hosted Distribution](./self_hosted_distro/index.md): If you want to run Llama Stack inference on your local machine.
-2. [Remote-Hosted Distribution](./remote_hosted_distro/index.md): If you want to connect to a remote hosted inference provider.
-3. [On-device Distribution](./ondevice_distro/index.md): If you want to run Llama Stack inference on your iOS / Android device.
-
-```{toctree}
-:maxdepth: 1
-:hidden:
-
-self_hosted_distro/index
-remote_hosted_distro/index
-ondevice_distro/index
-```
diff --git a/docs/source/getting_started/index.md b/docs/source/getting_started/index.md
index de1b02db3..5c59f5f9f 100644
--- a/docs/source/getting_started/index.md
+++ b/docs/source/getting_started/index.md
@@ -1,15 +1,12 @@
 # Getting Started
 
 ```{toctree}
-:hidden:
 :maxdepth: 2
-
-distributions/index
-```
-
-```{toctree}
 :hidden:
-developer_cookbook
+
+distributions/self_hosted_distro/index
+distributions/remote_hosted_distro/index
+distributions/ondevice_distro/index
 ```
 
 At the end of the guide, you will have learned how to:
@@ -45,6 +42,16 @@ If so, we suggest:
   - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/together.html)
   - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/fireworks.html)
+- **Do you have an API key for a remote inference provider like Fireworks, Together, etc.?** If so, we suggest:
+  - [`distribution-together`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/together.html)
+  - [`distribution-fireworks`](https://llama-stack.readthedocs.io/en/latest/getting_started/distributions/remote_hosted_distro/fireworks.html)
+
+Please see our detailed pages on the types of distributions we offer:
+
+1. [Self-Hosted Distribution](./distributions/self_hosted_distro/index.md): If you want to run Llama Stack inference on your local machine.
+2. [Remote-Hosted Distribution](./distributions/remote_hosted_distro/index.md): If you want to connect to a remote hosted inference provider.
+3. [On-device Distribution](./distributions/ondevice_distro/index.md): If you want to run Llama Stack inference on your iOS / Android device.
+
 ### Quick Start Commands
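
The decision guide in the docs above (self-hosted vs. remote-hosted vs. on-device) can be sketched as a tiny helper. This is a hypothetical illustration only, not part of Llama Stack: the function name and the return values for the self-hosted and on-device cases are made up (they reuse the doc directory names), while the remote-hosted suggestions come from the guide itself.

```python
# Hypothetical sketch of the decision guide above; not a real Llama Stack API.
def suggest_distribution(local_inference: bool = False,
                         api_key: bool = False,
                         on_device: bool = False) -> list[str]:
    """Map a user's situation to the distributions the guide suggests."""
    if on_device:
        # Run Llama Stack inference on an iOS / Android device.
        return ["ondevice_distro"]  # illustrative name, taken from the docs path
    if api_key:
        # Remote-hosted providers named in the guide.
        return ["distribution-together", "distribution-fireworks"]
    if local_inference:
        # Run Llama Stack inference on your local machine.
        return ["self_hosted_distro"]  # illustrative name, taken from the docs path
    return []

print(suggest_distribution(api_key=True))
```

The precedence here (on-device, then remote API key, then local) is one reasonable reading of the guide, not something the docs prescribe.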