
# Self-Hosted Distributions

```{toctree}
:maxdepth: 1
:hidden:

ollama
tgi
remote-vllm
meta-reference-gpu
meta-reference-quantized-gpu
together
fireworks
bedrock
```

We offer deployable distributions that let you host your own Llama Stack server with locally running inference.

| Distribution | Llama Stack Docker | Start This Distribution |
|--------------|--------------------|-------------------------|
| Ollama | {dockerhub}`distribution-ollama` | [Guide](ollama) |
| TGI | {dockerhub}`distribution-tgi` | [Guide](tgi) |
| vLLM | {dockerhub}`distribution-remote-vllm` | [Guide](remote-vllm) |
| Meta Reference | {dockerhub}`distribution-meta-reference-gpu` | [Guide](meta-reference-gpu) |
| Meta Reference Quantized | {dockerhub}`distribution-meta-reference-quantized-gpu` | [Guide](meta-reference-quantized-gpu) |
| Together | {dockerhub}`distribution-together` | [Guide](together) |
| Fireworks | {dockerhub}`distribution-fireworks` | [Guide](fireworks) |
| Bedrock | {dockerhub}`distribution-bedrock` | [Guide](bedrock) |