mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-12-23 17:23:56 +00:00
Merge branch 'main' into kvstore-refactor
This commit is contained in:
commit c1cc3d1ac3
1 changed file with 1 addition and 1 deletion
@@ -77,7 +77,7 @@ ENABLE_OLLAMA=ollama INFERENCE_MODEL="llama3.2:3b" llama stack build --template
 You can use a container image to run the Llama Stack server. We provide several container images for the server
 component that works with different inference providers out of the box. For this guide, we will use
 `llamastack/distribution-starter` as the container image. If you'd like to build your own image or customize the
-configurations, please check out [this guide](../references/index.md).
+configurations, please check out [this guide](../distributions/building_distro.md).
 First lets setup some environment variables and create a local directory to mount into the container’s file system.
 ```bash
 export INFERENCE_MODEL="llama3.2:3b"
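For context, the documentation excerpt in this diff describes running the Llama Stack server from a container: export some environment variables, create a local directory, and mount it into the container. A minimal sketch of that setup step follows; only the `INFERENCE_MODEL` value and the image name appear in the diff, so the directory name, port, and `docker run` flags are assumptions for illustration, not the docs' actual commands.

```shell
# Environment variable shown in the docs excerpt above.
export INFERENCE_MODEL="llama3.2:3b"

# Local directory to mount into the container's file system
# (the path "$HOME/.llama" is an assumed convention, not from the diff).
mkdir -p "$HOME/.llama"

# Hypothetical container invocation using the image named in the docs;
# the flags and container-side path are assumptions:
# docker run -it -v "$HOME/.llama:/root/.llama" llamastack/distribution-starter
```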