# What does this PR do?
`llama model list` and `llama model list --show-all` print many or all of the available models, so add a `-s`/`--search` option to filter the output by a substring of the model descriptor (ID).
```
$ llama model list --help
usage: llama model list [-h] [--show-all] [-s SEARCH]

Show available llama models

options:
  -h, --help            show this help message and exit
  --show-all            Show all models (not just defaults)
  -s SEARCH, --search SEARCH
                        Search for the input string as a substring in the model descriptor(ID)
$ llama model list -s 70b
+-----------------------+-----------------------------------+----------------+
| Model Descriptor(ID) | Hugging Face Repo | Context Length |
+-----------------------+-----------------------------------+----------------+
| Llama3.1-70B | meta-llama/Llama-3.1-70B | 128K |
+-----------------------+-----------------------------------+----------------+
| Llama3.1-70B-Instruct | meta-llama/Llama-3.1-70B-Instruct | 128K |
+-----------------------+-----------------------------------+----------------+
| Llama3.3-70B-Instruct | meta-llama/Llama-3.3-70B-Instruct | 128K |
+-----------------------+-----------------------------------+----------------+
$ llama model list -s 3.1-8b
+----------------------+----------------------------------+----------------+
| Model Descriptor(ID) | Hugging Face Repo | Context Length |
+----------------------+----------------------------------+----------------+
| Llama3.1-8B | meta-llama/Llama-3.1-8B | 128K |
+----------------------+----------------------------------+----------------+
| Llama3.1-8B-Instruct | meta-llama/Llama-3.1-8B-Instruct | 128K |
+----------------------+----------------------------------+----------------+
$ llama model list --show-all -s pro
+----------------------+-----------------------------+----------------+
| Model Descriptor(ID) | Hugging Face Repo | Context Length |
+----------------------+-----------------------------+----------------+
| Prompt-Guard-86M | meta-llama/Prompt-Guard-86M | 2K |
+----------------------+-----------------------------+----------------+
$ llama model list -s k
Not found for search.
```
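For context, the filtering can be implemented in `list.py` roughly as sketched below. This is a minimal illustration rather than the exact diff: it assumes the command builds its table rows from model objects that expose a `descriptor()` method, and the helper names `add_search_argument` and `filter_models` are hypothetical.

```python
import argparse
from typing import Iterable, List, Optional


def add_search_argument(parser: argparse.ArgumentParser) -> None:
    # Register the new flag next to the existing --show-all option (sketch).
    parser.add_argument(
        "-s",
        "--search",
        type=str,
        default=None,
        help="Search for the input string as a substring in the model descriptor(ID)",
    )


def filter_models(models: Iterable, search: Optional[str]) -> List:
    # Case-insensitive substring match against each model's descriptor (ID);
    # with no search term, return the models unchanged.
    if not search:
        return list(models)
    needle = search.lower()
    return [m for m in models if needle in m.descriptor().lower()]
```

In the command's run path, the filtered list would replace the full list when printing the table, and an empty result would print the `Not found for search.` message shown above.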
[//]: # (If resolving an issue, uncomment and update the line below)
[//]: # (Closes #[issue-number])
## Test Plan
[Describe the tests you ran to verify your changes with result
summaries. *Provide clear instructions so the plan can be easily
re-executed.*]
[//]: # (## Documentation)
Signed-off-by: reidliu <reid201711@gmail.com>
Co-authored-by: reidliu <reid201711@gmail.com>