
# Docker Compose Scripts

This folder contains scripts for starting a Llama Stack distribution with `docker compose`.

## Example: TGI Inference Adapter

```shell
$ cd llama_stack/distribution/docker/tgi
$ ls
compose.yaml  tgi-run.yaml
$ docker compose up
```
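A `compose.yaml` for this kind of setup typically defines the two services and their startup ordering. The sketch below is illustrative only — the service names, image, ports, and health-check endpoint are assumptions, not the actual contents of the `compose.yaml` in this folder:

```yaml
# Illustrative sketch only; the real compose.yaml may differ.
services:
  text-generation-inference:      # assumed service name
    image: ghcr.io/huggingface/text-generation-inference:latest
    ports:
      - "8080:80"                 # assumed host/container port mapping
    healthcheck:                  # lets compose know when TGI is ready
      test: ["CMD", "curl", "-f", "http://localhost:80/health"]
      interval: 5s
      retries: 30
  llamastack:                     # assumed service name
    depends_on:
      text-generation-inference:
        condition: service_healthy  # start only after TGI passes its health check
    ports:
      - "5000:5000"               # assumed; matches the Uvicorn port in the log output
```

The `depends_on` + `condition: service_healthy` pattern is the standard compose way to express "wait for TGI before starting the stack server."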

The script first starts the TGI server, then starts the Llama Stack distribution server and connects it to the remote TGI provider for inference. You should see output like the following:

```
[text-generation-inference] | 2024-10-15T18:56:33.810397Z  INFO text_generation_router::server: router/src/server.rs:1813: Using config Some(Llama)
[text-generation-inference] | 2024-10-15T18:56:33.810448Z  WARN text_generation_router::server: router/src/server.rs:1960: Invalid hostname, defaulting to 0.0.0.0
[text-generation-inference] | 2024-10-15T18:56:33.864143Z  INFO text_generation_router::server: router/src/server.rs:2353: Connected
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://[::]:5000 (Press CTRL+C to quit)
```
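The "wait for TGI before starting the stack server" step can also be expressed as a small polling loop in shell. This is a hypothetical sketch — the `wait_for` helper, the health URL, and the timeout are assumptions for illustration, not the mechanism the compose files actually use:

```shell
#!/bin/sh
# Poll a URL until it responds successfully, or give up after N attempts.
# Usage: wait_for <url> [max_attempts]
wait_for() {
  url=$1
  max_attempts=${2:-30}
  attempt=0
  until curl -sf "$url" > /dev/null 2>&1; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "timed out waiting for $url" >&2
      return 1
    fi
    sleep 1
  done
}

# Hypothetical usage (URL and port are assumptions):
# wait_for http://localhost:8080/health && docker compose up llamastack
```

`curl -sf` stays silent and treats HTTP errors as failures, so the loop only exits once the endpoint returns a successful response.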

To stop the servers:

```shell
docker compose down
```