mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-06-28 19:04:19 +00:00
chore: remove the incorrect output (#1472)
# What does this PR do?
The client's output format changed (see 458e20702b/src/llama_stack_client/lib/cli/models/models.py (L52)), so the output shown in the docs is now incorrect. Per the previous discussion in
https://github.com/meta-llama/llama-stack/pull/1348#pullrequestreview-2654971315
there is no need to maintain this sample output, so remove it.
## Test Plan
Signed-off-by: reidliu <reid201711@gmail.com>
Co-authored-by: reidliu <reid201711@gmail.com>
This commit is contained in:
parent c4b229f2c9
commit 40cd48fa09
1 changed file with 0 additions and 22 deletions
````diff
@@ -17,26 +17,4 @@ $ llama-stack-client configure --endpoint https://llamastack-preview.fireworks.a
 $ llama-stack-client models list
 ```
-
-You will see outputs:
-```
-$ llama-stack-client models list
-+------------------------------+------------------------------+---------------+------------+
-| identifier                   | llama_model                  | provider_id   | metadata   |
-+==============================+==============================+===============+============+
-| Llama3.1-8B-Instruct         | Llama3.1-8B-Instruct         | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.1-70B-Instruct        | Llama3.1-70B-Instruct        | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.1-405B-Instruct       | Llama3.1-405B-Instruct       | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.2-1B-Instruct         | Llama3.2-1B-Instruct         | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.2-3B-Instruct         | Llama3.2-3B-Instruct         | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.2-11B-Vision-Instruct | Llama3.2-11B-Vision-Instruct | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-| Llama3.2-90B-Vision-Instruct | Llama3.2-90B-Vision-Instruct | fireworks0    | {}         |
-+------------------------------+------------------------------+---------------+------------+
-```
 
 Checkout the [llama-stack-client-python](https://github.com/meta-llama/llama-stack-client-python/blob/main/docs/cli_reference.md) repo for more details on how to use the `llama-stack-client` CLI. Checkout [llama-stack-app](https://github.com/meta-llama/llama-stack-apps/tree/main) for examples applications built on top of Llama Stack.
````