llama-stack/llama_stack
Charlie Doern 5f88ff0b6a
fix: show proper help text (#1065)
# What does this PR do?
When executing a sub-command group like `llama model`, the wrong help text,
sub-commands, and flags are displayed: the root parser's help is shown instead
of the group's. Each command group needs to call `.set_defaults` so that this
info is displayed properly.

Before:

```
llama model
usage: llama [-h] {model,stack,download,verify-download} ...

Welcome to the Llama CLI

options:
  -h, --help            show this help message and exit

subcommands:
  {model,stack,download,verify-download}
```

After:

```
llama model
usage: llama model [-h] {download,list,prompt-format,describe,verify-download} ...

Work with llama models

options:
  -h, --help            show this help message and exit

model_subcommands:
  {download,list,prompt-format,describe,verify-download}
```
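
A minimal sketch of the argparse pattern this fix relies on (command and
sub-command names here are illustrative, not the project's actual CLI code):
each command group registers a default handler via `.set_defaults`, so
invoking the group alone prints *its own* help rather than the root parser's.

```python
import argparse

parser = argparse.ArgumentParser(prog="llama", description="Welcome to the Llama CLI")
subparsers = parser.add_subparsers(title="subcommands")

model = subparsers.add_parser("model", description="Work with llama models")
# Without this line, `llama model` has no handler of its own and the user
# ends up seeing the root `llama` help text instead.
model.set_defaults(func=lambda args: model.print_help())

model_sub = model.add_subparsers(title="model_subcommands")
model_sub.add_parser("list", description="List models")

args = parser.parse_args(["model"])
if hasattr(args, "func"):
    args.func(args)  # prints the `llama model` help text
```

Running this with `["model"]` prints the `model` parser's usage line,
description, and its `model_subcommands` group, mirroring the "After" output
above.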

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-02-12 06:38:25 -08:00