docs: update container deployment guides for distributions

r3v5 2025-07-21 10:33:12 +01:00
parent ecdcfb28ca
commit f009c0b534
9 changed files with 19 additions and 19 deletions

@@ -65,7 +65,7 @@ registry.dell.huggingface.co/enterprise-dell-inference-meta-llama-meta-llama-3.1
#### Start Llama Stack server pointing to TGI server
```
-docker run --pull always --network host -it -p 8321:8321 -v ./run.yaml:/root/my-run.yaml --gpus=all llamastack/distribution-tgi --yaml_config /root/my-run.yaml
+docker run --pull always --network host -it -p 8321:8321 -v ./run.yaml:/.llama/my-run.yaml --gpus=all llamastack/distribution-tgi --yaml_config /.llama/my-run.yaml
```
Make sure that in your `run.yaml` file, your inference provider points to the correct TGI server endpoint. E.g.
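As a rough illustration (the `provider_id`, host, and port below are placeholders, not values taken from this commit; use whatever endpoint your TGI container actually listens on), the inference provider entry in `run.yaml` could look like:
```
providers:
  inference:
  - provider_id: tgi0              # placeholder id
    provider_type: remote::tgi
    config:
      url: http://127.0.0.1:8080   # point this at your running TGI server
```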