Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-29 15:23:51 +00:00)

Commit eeab1278f2 (parent 42104361a3)
3 changed files with 2 additions and 3 deletions
@@ -3,7 +3,7 @@
 A Distribution is where APIs and Providers are assembled together to provide a consistent whole to the end application developer. You can mix-and-match providers -- some could be backed by local code and some could be remote. As a hobbyist, you can serve a small model locally, but can choose a cloud provider for a large model. Regardless, the higher level APIs your app needs to work with don't need to change at all. You can even imagine moving across the server / mobile-device boundary as well always using the same uniform set of APIs for developing Generative AI applications.
 
 ```{toctree}
-:maxdepth: 2
+:maxdepth: 1
 
 meta-reference-gpu
 ```
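To make the mix-and-match point in the paragraph above concrete, here is a minimal sketch of talking to a Llama Stack distribution from the Python client SDK. It is a sketch under assumptions, not the documented API: the `llama-stack-client` import, the `LlamaStackClient(base_url=...)` constructor, the `inference.chat_completion(...)` call, the port, and the model identifier are written from memory and may differ by version. What it illustrates is the claim the paragraph makes: whether the distribution serves a small model locally or a large one behind a hosted endpoint, the application-facing call stays the same and only the endpoint URL changes.

```python
# Minimal sketch (assumed llama-stack-client API; names and defaults may
# differ by version). The same inference call is used for a local
# distribution and for a remote one; only base_url changes.
from llama_stack_client import LlamaStackClient

# A distribution served locally (port is whatever the server was started with)...
client = LlamaStackClient(base_url="http://localhost:5000")

# ...or a hosted distribution (hypothetical URL), with no other code changes:
# client = LlamaStackClient(base_url="https://llama-stack.example.com")

response = client.inference.chat_completion(
    model_id="Llama3.1-8B-Instruct",  # whichever model the distribution serves
    messages=[{"role": "user", "content": "Hello! What can Llama Stack do?"}],
)
print(response)
```

Swapping the small local model for a large cloud-hosted one then becomes a deployment decision rather than a code change, which is the uniformity a Distribution is meant to provide.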
@@ -73,7 +73,6 @@ You can find more example scripts with client SDKs to talk with the Llama Stack
 
 
 ```{toctree}
-:hidden:
 :maxdepth: 2
 
 developer_cookbook
@@ -55,7 +55,7 @@ You can find more example scripts with client SDKs to talk with the Llama Stack
 
 ```{toctree}
 :hidden:
-:maxdepth: 2
+:maxdepth: 3
 
 getting_started/index
 cli_reference/index