
# Starting a Llama Stack

```{toctree}
:maxdepth: 3
:hidden:

importing_as_library
self_hosted_distro/index
remote_hosted_distro/index
building_distro
ondevice_distro/index
```

You can start a Llama Stack server using "distributions" (see Concepts) in one of the following ways:

- **Docker**: we provide a number of pre-built Docker containers so you can get started instantly. If you are focused on application development, we recommend this option. You can also build your own custom Docker container.
- **Conda**: the `llama` CLI provides a simple set of commands to build, configure, and run a Llama Stack server containing the exact combination of providers you wish. We have provided various templates to make getting started easier.

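For the Docker option, starting a server amounts to running one of the pre-built images. A minimal sketch is shown below; the image name, tag, port, and volume mount are illustrative assumptions, so check the distribution pages in this section for the exact values for your chosen distribution.

```shell
# Hypothetical example of running a pre-built distribution image.
# The image name and port below are placeholders -- substitute the
# values from the page for the distribution you picked.
docker run -it \
  -p 5000:5000 \
  -v ~/.llama:/root/.llama \
  llamastack/distribution-ollama \
  --port 5000
```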
Which distribution you choose depends on the hardware you have available for running LLM inference.

You can also build your own custom distribution.
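As a sketch of the Conda path described above, building and running a custom distribution with the `llama` CLI looks roughly like the following; the template name and flag values here are illustrative assumptions, so see the "building_distro" page for the authoritative walkthrough.

```shell
# Hypothetical sketch: build a distribution from a template, then run it.
# "ollama" stands in for whichever template you choose.
llama stack build --template ollama --image-type conda
llama stack run ollama
```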