llama-stack-mirror/llama_stack
Reid 94e2186bb8
chore: add subcommands description in help (#1219)
# What does this PR do?
Add a short description to each subcommand registered by the `llama` CLI so that `llama --help`, `llama model --help`, and `llama stack --help` list what each subcommand does.

```
before:
$ llama
usage: llama [-h] {model,stack,download,verify-download} ...

Welcome to the Llama CLI

options:
  -h, --help            show this help message and exit

subcommands:
  {model,stack,download,verify-download}

$ llama model --help
usage: llama model [-h] {download,list,prompt-format,describe,verify-download,remove} ...

Work with llama models

options:
  -h, --help            show this help message and exit

model_subcommands:
  {download,list,prompt-format,describe,verify-download,remove}

$ llama stack --help
usage: llama stack [-h] [--version] {build,list-apis,list-providers,run} ...

Operations for the Llama Stack / Distributions

options:
  -h, --help            show this help message and exit
  --version             show program's version number and exit

stack_subcommands:
  {build,list-apis,list-providers,run}

===================
after:
$ llama
usage: llama [-h] {model,stack,download,verify-download} ...

Welcome to the Llama CLI

options:
  -h, --help            show this help message and exit

subcommands:
  {model,stack,download,verify-download}

  model                 Work with llama models
  stack                 Operations for the Llama Stack / Distributions
  download              Download a model from llama.meta.com or Hugging Face Hub
  verify-download       Verify integrity of downloaded model files

$ llama model --help
usage: llama model [-h] {download,list,prompt-format,describe,verify-download,remove} ...

Work with llama models

options:
  -h, --help            show this help message and exit

model_subcommands:
  {download,list,prompt-format,describe,verify-download,remove}

  download              Download a model from llama.meta.com or Hugging Face Hub
  list                  Show available llama models
  prompt-format         Show llama model message formats
  describe              Show details about a llama model
  verify-download       Verify the downloaded checkpoints' checksums for models downloaded from Meta
  remove                Remove the downloaded llama model

$ llama stack --help
usage: llama stack [-h] [--version] {build,list-apis,list-providers,run} ...

Operations for the Llama Stack / Distributions

options:
  -h, --help            show this help message and exit
  --version             show program's version number and exit

stack_subcommands:
  {build,list-apis,list-providers,run}

  build                 Build a Llama stack container
  list-apis             List APIs part of the Llama Stack implementation
  list-providers        Show available Llama Stack Providers for an API
  run                   Start the server for a Llama Stack Distribution. You should have already built (or downloaded) and configured the distribution.
```
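
The mechanism is standard `argparse` behavior: a subcommand only shows up with a one-line summary in the parent command's help when it is registered with `help=...`. The snippet below is a minimal standalone sketch of that behavior (not the actual llama_stack source); the names and strings mirror the output above for illustration only.

```python
import argparse

# Minimal sketch: argparse prints each subcommand with its summary under
# "subcommands:" only when the subparser is registered with help=...
parser = argparse.ArgumentParser(prog="llama", description="Welcome to the Llama CLI")
subparsers = parser.add_subparsers(title="subcommands")

# Before: subparsers.add_parser("model") alone -> only the bare
# {model,stack,...} choices line is printed under "subcommands:".
# After: the help= text is printed next to each subcommand name.
subparsers.add_parser(
    "model",
    help="Work with llama models",          # shown in `llama --help`
    description="Work with llama models",   # shown in `llama model --help`
)
subparsers.add_parser(
    "stack",
    help="Operations for the Llama Stack / Distributions",
    description="Operations for the Llama Stack / Distributions",
)

parser.print_help()
```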


## Test Plan
Ran `llama`, `llama model --help`, and `llama stack --help` locally and confirmed that each subcommand is now listed with its description, as shown in the "after" output above.


---------

Signed-off-by: reidliu <reid201711@gmail.com>
Co-authored-by: reidliu <reid201711@gmail.com>
2025-02-27 17:00:27 -08:00