added todo

Justin Lee 2024-11-01 13:55:30 -07:00
parent b41abff4fb
commit 43289c36e1


@@ -3,9 +3,12 @@
This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack's `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.
### Pre-requisite
Before you begin, please ensure Llama Stack is installed and the distributions are set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local and a cloud distribution, for this demo to work.
<!--- [TODO: show how to create two distributions] --->
### Implementation
1. **Set Up Local and Cloud Clients**
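
   A minimal sketch of what this step might look like, assuming the `llama_stack_client` Python package and its `LlamaStackClient` class; the ports and the cloud hostname below are placeholders, not values taken from this guide.

   ```python
   from llama_stack_client import LlamaStackClient

   # Assumed endpoints: point these at wherever your two distributions are running.
   LOCAL_URL = "http://localhost:5000"       # local distribution (port is an assumption)
   CLOUD_URL = "http://<cloud-host>:5000"    # cloud distribution (placeholder host)

   # One client per distribution; later steps would presumably try the local
   # client first and fall back to the cloud client for chat_completion calls.
   local_client = LlamaStackClient(base_url=LOCAL_URL)
   cloud_client = LlamaStackClient(base_url=CLOUD_URL)
   ```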