Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-29 15:23:51 +00:00)
Commit 43289c36e1 (parent b41abff4fb): added todo
1 changed file with 3 additions and 0 deletions
@@ -3,9 +3,12 @@
This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack’s `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.
### Prerequisites
Before you begin, please ensure Llama Stack is installed and the distributions are set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local one and a cloud one, for this demo to work.
<!--- [TODO: show how to create two distributions] --->
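As a rough sketch of the prerequisite above, the two distributions can be launched on separate ports with the `llama stack run` CLI. The config file names and port numbers below are illustrative assumptions, not values from this guide; substitute the run configs for your chosen providers:

```shell
# Hedged sketch (assumed config names and ports; adjust to your setup).
# Start a "local" distribution, e.g. one backed by an on-device inference provider:
llama stack run ./local-run.yaml --port 8321

# In a second terminal, start a "cloud" distribution pointing at a hosted provider:
llama stack run ./cloud-run.yaml --port 8322
```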
### Implementation
1. **Set Up Local and Cloud Clients**
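A minimal sketch of the local-then-cloud selection logic this step builds toward. The URLs, ports, and helper names here are illustrative assumptions, not part of the guide; in the real setup, the chosen URL would be handed to the client constructor:

```python
import socket
from urllib.parse import urlparse

# Assumed ports for the two distributions; adjust to your setup.
LOCAL_URL = "http://localhost:8321"
CLOUD_URL = "http://localhost:8322"

def is_reachable(url: str, timeout: float = 1.0) -> bool:
    """Best-effort TCP check: can we open a connection to the URL's host:port?"""
    parsed = urlparse(url)
    try:
        with socket.create_connection((parsed.hostname, parsed.port or 80), timeout=timeout):
            return True
    except OSError:
        return False

def select_base_url(check=is_reachable, local_url=LOCAL_URL, cloud_url=CLOUD_URL) -> str:
    """Prefer the local distribution; fall back to the cloud URL if it is down."""
    return local_url if check(local_url) else cloud_url
```

In practice, the selected URL would be passed to the client, e.g. `LlamaStackClient(base_url=select_base_url())` (hedged: assumes the `llama-stack-client` Python SDK is installed and both distributions are configured as above).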