diff --git a/docs/source/getting_started/distributions/index.md b/docs/source/getting_started/distributions/index.md
index 94c676611..05ceb4787 100644
--- a/docs/source/getting_started/distributions/index.md
+++ b/docs/source/getting_started/distributions/index.md
@@ -3,7 +3,7 @@
 A Distribution is where APIs and Providers are assembled together to provide a consistent whole to the end application developer. You can mix-and-match providers -- some could be backed by local code and some could be remote. As a hobbyist, you can serve a small model locally, but can choose a cloud provider for a large model. Regardless, the higher level APIs your app needs to work with don't need to change at all. You can even imagine moving across the server / mobile-device boundary as well always using the same uniform set of APIs for developing Generative AI applications.
 
 ```{toctree}
-:maxdepth: 2
+:maxdepth: 1
 
 meta-reference-gpu
 ```
diff --git a/docs/source/getting_started/index.md b/docs/source/getting_started/index.md
index fbde781a6..882f8be52 100644
--- a/docs/source/getting_started/index.md
+++ b/docs/source/getting_started/index.md
@@ -73,7 +73,6 @@ You can find more example scripts with client SDKs to talk with the Llama Stack
 
 
 ```{toctree}
-:hidden:
 :maxdepth: 2
 
 developer_cookbook
diff --git a/docs/source/index.md b/docs/source/index.md
index 1093caceb..5cd24dc28 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -55,7 +55,7 @@ You can find more example scripts with client SDKs to talk with the Llama Stack
 
 ```{toctree}
 :hidden:
-:maxdepth: 2
+:maxdepth: 3
 
 getting_started/index
 cli_reference/index
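
For reference, a sketch of what the updated toctree block in docs/source/index.md looks like after this change, reconstructed from the hunk above (the file may list additional entries after cli_reference/index that are not shown here):

````markdown
```{toctree}
:hidden:
:maxdepth: 3

getting_started/index
cli_reference/index
```
````

`:hidden:` keeps the toctree out of the rendered page body while still registering the pages in the sidebar navigation, and raising `:maxdepth:` from 2 to 3 lets the sidebar expand one more level of nested headings; the other two hunks similarly adjust how deep the distributions and getting-started toctrees render.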