Update README.md
parent 6a8c2ae1df
commit 5ce759adc4
1 changed file with 5 additions and 1 deletion
```diff
@@ -82,4 +82,8 @@ $CONDA_PREFIX/bin/pip install -e .
 
 ## The Llama CLI
 
-The `llama` CLI makes it easy to work with the Llama Stack set of tools, including installing and running Distributions, downloading models, studying model prompt formats, etc. Please see the [CLI reference](docs/cli_reference.md) for details.
+The `llama` CLI makes it easy to work with the Llama Stack set of tools, including installing and running Distributions, downloading models, studying model prompt formats, etc. Please see the [CLI reference](docs/cli_reference.md) for details. Please see the [Getting Started](docs/getting_started.md) guide for running a Llama Stack server.
+
+## Llama Stack Client SDK
+
+- Check out our client SDKs for connecting to Llama Stack server in your preferred language, you can choose from [python](https://github.com/meta-llama/llama-stack-client-python), [node](https://github.com/meta-llama/llama-stack-client-node), [swift](https://github.com/meta-llama/llama-stack-client-swift), and [kotlin](https://github.com/meta-llama/llama-stack-client-kotlin) programming languages to quickly build your applications.
```
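The CLI paragraph added in this diff defers to the CLI reference for details. As a rough sketch of the workflows it names (downloading models, studying prompt formats, building and running Distributions), the commands below illustrate typical usage; the exact subcommands, flags, and the model identifier are assumptions that vary by CLI version, so treat docs/cli_reference.md and docs/getting_started.md as authoritative.

```sh
# Sketch of common `llama` CLI workflows (flags and model IDs are illustrative;
# see docs/cli_reference.md for the exact, version-specific syntax).

# List the models the CLI knows about
llama model list

# Download model weights (source and model ID are placeholder values)
llama download --source meta --model-id Llama3.1-8B-Instruct

# Inspect the prompt format a model expects
llama model prompt-format -m Llama3.1-8B-Instruct

# Build a Distribution, then run the resulting Llama Stack server
# (see docs/getting_started.md for a full walkthrough)
llama stack build
llama stack run <path-to-run-config>.yaml
```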
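The new client SDK bullet points at language-specific repositories. As a minimal sketch for the Python client, installation is assumed to come from PyPI before connecting to a running Llama Stack server; the package name below is inferred from the repository name and should be verified there.

```sh
# Assumed PyPI package name for the Python client SDK; verify against
# https://github.com/meta-llama/llama-stack-client-python before relying on it.
pip install llama-stack-client
```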