From 0ae8735fbd3e4964d534573cfd4bac3a89d5de6c Mon Sep 17 00:00:00 2001
From: reidliu
Date: Sun, 2 Mar 2025 18:33:30 +0800
Subject: [PATCH] docs: update help text

Signed-off-by: reidliu
---
 .../references/llama_cli_reference/index.md   | 16 ++++++++--
 .../llama_stack_client_cli_reference.md       | 30 ++++++++++++++-----
 2 files changed, 37 insertions(+), 9 deletions(-)

diff --git a/docs/source/references/llama_cli_reference/index.md b/docs/source/references/llama_cli_reference/index.md
index 8a38fc3ae..d6971175e 100644
--- a/docs/source/references/llama_cli_reference/index.md
+++ b/docs/source/references/llama_cli_reference/index.md
@@ -38,7 +38,7 @@ llama --help
 ```
 
 ```
-usage: llama [-h] {download,model,stack} ...
+usage: llama [-h] {model,stack,download,verify-download} ...
 
 Welcome to the Llama CLI
 
@@ -46,7 +46,12 @@ options:
   -h, --help            show this help message and exit
 
 subcommands:
-  {download,model,stack}
+  {model,stack,download,verify-download}
+
+    model                 Work with llama models
+    stack                 Operations for the Llama Stack / Distributions
+    download              Download a model from llama.meta.com or Hugging Face Hub
+    verify-download       Verify integrity of downloaded model files
 ```
 
 ## Downloading models
@@ -212,6 +217,13 @@ options:
 
 model_subcommands:
   {download,list,prompt-format,describe,verify-download,remove}
+
+    download              Download a model from llama.meta.com or Hugging Face Hub
+    list                  Show available llama models
+    prompt-format         Show llama model message formats
+    describe              Show details about a llama model
+    verify-download       Verify the downloaded checkpoints' checksums for models downloaded from Meta
+    remove                Remove the downloaded llama model
 ```
 
 ### Describe
diff --git a/docs/source/references/llama_stack_client_cli_reference.md b/docs/source/references/llama_stack_client_cli_reference.md
index 26b81cf92..5aec3a77f 100644
--- a/docs/source/references/llama_stack_client_cli_reference.md
+++ b/docs/source/references/llama_stack_client_cli_reference.md
@@ -6,17 +6,33 @@ The `llama-stack-client` CLI allows you to query information about the distribut
 ### `llama-stack-client`
 
 ```bash
-llama-stack-client -h
+llama-stack-client --help
 
-usage: llama-stack-client [-h] {models,memory_banks,shields} ...
+Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]...
 
-Welcome to the LlamaStackClient CLI
+  Welcome to the LlamaStackClient CLI
 
-options:
- -h, --help  show this help message and exit
+Options:
+  --version        Show the version and exit.
+  --endpoint TEXT  Llama Stack distribution endpoint
+  --api-key TEXT   Llama Stack distribution API key
+  --config TEXT    Path to config file
+  --help           Show this message and exit.
 
-subcommands:
- {models,memory_banks,shields}
+Commands:
+  configure           Configure Llama Stack Client CLI.
+  datasets            Manage datasets.
+  eval                Run evaluation tasks.
+  eval_tasks          Manage evaluation tasks.
+  inference           Inference (chat).
+  inspect             Inspect server configuration.
+  models              Manage GenAI models.
+  post_training       Post-training.
+  providers           Manage API providers.
+  scoring_functions   Manage scoring functions.
+  shields             Manage safety shield services.
+  toolgroups          Manage available tool groups.
+  vector_dbs          Manage vector databases.
 ```
 
 ### `llama-stack-client configure`
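
Reviewer note: a quick way to sanity-check that the snippets committed above still match the live CLIs is sketched below. It assumes `llama-stack` and `llama-stack-client` are installed in the current environment; exact help wording varies by installed version, so treat the output comparison as illustrative rather than authoritative.

```bash
# Illustrative verification sketch (assumes llama-stack and llama-stack-client
# are installed); these are the same commands documented in this patch.
llama --help                # top-level subcommands: model, stack, download, verify-download
llama model --help          # model subcommands: download, list, prompt-format, describe, ...
llama-stack-client --help   # click-style Options/Commands listing
```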