Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-13 23:42:37 +00:00
readme
This commit is contained in: parent 5ea36b0274, commit 29c8edb4f6
2 changed files with 34 additions and 1 deletion
````diff
@@ -49,7 +49,7 @@ docker run -it -p 5000:5000 -v ~/.llama:/root/.llama --gpus=all llamastack/llama
 ```
 
 > [!TIP]
-> Pro Tip: We may use `docker compose up` for starting up a distribution with remote providers (e.g. TGI) using [llamastack-local-cpu](https://hub.docker.com/repository/docker/llamastack/llamastack-local-cpu/general). You can checkout [these scripts](../llama_stack/distribution/docker/README.md) to help you get started.
+> Pro Tip: We may use `docker compose up` for starting up a distribution with remote providers (e.g. TGI) using [llamastack-local-cpu](https://hub.docker.com/repository/docker/llamastack/llamastack-local-cpu/general). You can checkout [these scripts](../distributions/) to help you get started.
 
 #### Build->Configure->Run Llama Stack server via conda
 
 You may also build a LlamaStack distribution from scratch, configure it, and start running the distribution. This is useful for developing on LlamaStack.
````
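For context on the `docker compose up` tip in the hunk above, here is a minimal sketch of how such a setup might be started. The directory below is only an assumption based on the updated `../distributions/` link in the diff; the exact location of the compose file for a TGI-backed distribution may differ.

```bash
# Sketch only: run from whichever directory holds the distribution's compose file.
# "distributions/" mirrors the updated link in the diff; the exact subdirectory
# for a TGI-backed setup is an assumption.
cd distributions/
docker compose up      # brings up the remote provider (e.g. TGI) and the Llama Stack server
# ...when finished:
docker compose down    # stop and remove the containers
```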
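The conda section touched by this hunk refers to the `llama stack` CLI workflow. A hedged sketch of that build, configure, and run loop is shown below; `my-local-stack` is a placeholder name and the exact prompts and flags may vary by version.

```bash
# Build a distribution from scratch (interactive prompts ask for a name and providers).
llama stack build

# Configure the distribution that was just built ("my-local-stack" is a placeholder).
llama stack configure my-local-stack

# Start the server for that distribution; port 5000 matches the docker example above.
llama stack run my-local-stack --port 5000
```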