From 5366dab31e3dfecab455a9a6c5f55cc18c7c7ae6 Mon Sep 17 00:00:00 2001
From: Reid <61492567+reidliu41@users.noreply.github.com>
Date: Sat, 1 Mar 2025 02:03:45 +0800
Subject: [PATCH] docs: update build doc (#1262)

# What does this PR do?
[Provide a short summary of what this PR does and why. Link to relevant issues if applicable.]

https://github.com/meta-llama/llama-stack/blob/55eb257459f5f891d7e570740e816eed950131b3/llama_stack/cli/stack/run.py#L22
https://github.com/meta-llama/llama-stack/blob/55eb257459f5f891d7e570740e816eed950131b3/llama_stack/cli/stack/_build.py#L103

[//]: # (If resolving an issue, uncomment and update the line below)
[//]: # (Closes #[issue-number])

## Test Plan
[Describe the tests you ran to verify your changes with result summaries. *Provide clear instructions so the plan can be easily re-executed.*]

[//]: # (## Documentation)

Signed-off-by: reidliu
Co-authored-by: reidliu
---
 docs/source/distributions/building_distro.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/distributions/building_distro.md b/docs/source/distributions/building_distro.md
index 9cb1a402f..20a835201 100644
--- a/docs/source/distributions/building_distro.md
+++ b/docs/source/distributions/building_distro.md
@@ -106,7 +106,7 @@ It would be best to start with a template and understand the structure of the co
 llama stack build
 
 > Enter a name for your Llama Stack (e.g. my-local-stack): my-stack
-> Enter the image type you want your Llama Stack to be built as (container or conda): conda
+> Enter the image type you want your Llama Stack to be built as (container or conda or venv): conda
 
 Llama Stack is composed of several APIs working together. Let's select
 the provider types (implementations) you want to use for these APIs.
@@ -187,7 +187,7 @@ usage: llama stack run [-h] [--port PORT] [--image-name IMAGE_NAME] [--disable-i
                        [--tls-certfile TLS_CERTFILE] [--image-type {conda,container,venv}]
                        config
 
-start the server for a Llama Stack Distribution. You should have already built (or downloaded) and configured the distribution.
+Start the server for a Llama Stack Distribution. You should have already built (or downloaded) and configured the distribution.
 
 positional arguments:
   config      Path to config file to use for the run