Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-08-09 19:58:29 +00:00)
[docs] Show that llama-stack-client configure will ask for api key
Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
This commit is contained in:
parent 0bec24c3db
commit a107960fd8

1 changed file with 4 additions and 2 deletions
@@ -74,8 +74,10 @@ pip install llama-stack-client

 Let's use the `llama-stack-client` CLI to check the connectivity to the server.

 ```bash
-llama-stack-client configure --endpoint http://localhost:$LLAMA_STACK_PORT
-llama-stack-client models list
+$ llama-stack-client configure --endpoint http://localhost:$LLAMA_STACK_PORT
+> Enter the API key (leave empty if no key is needed):
+Done! You can now use the Llama Stack Client CLI with endpoint http://localhost:8321
+$ llama-stack-client models list
 ┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┓
 ┃ identifier                       ┃ provider_id ┃ provider_resource_id      ┃ metadata ┃
 ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━┩
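The same connectivity check documented above can also be scripted with the project's Python SDK rather than the CLI. The snippet below is a minimal sketch, not the documented workflow: it assumes the `llama-stack-client` package installed earlier exposes `LlamaStackClient` with `base_url` and `api_key` constructor arguments and a `models.list()` call, and the endpoint and key values are placeholders matching the docs' local server on port 8321.

```python
# Minimal sketch of the connectivity check via the Python SDK.
# Assumption: `pip install llama-stack-client` provides LlamaStackClient
# with base_url/api_key arguments as in the published package.
import os

from llama_stack_client import LlamaStackClient

client = LlamaStackClient(
    base_url="http://localhost:8321",                   # same endpoint as --endpoint above
    api_key=os.environ.get("LLAMA_STACK_API_KEY"),      # the key the configure prompt asks for; None if not needed
)

# Rough equivalent of `llama-stack-client models list`:
# print a couple of the columns shown in the docs' table.
for model in client.models.list():
    print(model.identifier, model.provider_id)
```

The printed fields correspond to the first two columns of the CLI table in the diff (identifier, provider_id); if a given SDK version handles the API key differently, the interactive `llama-stack-client configure` prompt shown above remains the reference behavior.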