diff --git a/docs/source/chat_local_cloud_guide.md b/docs/source/chat_local_cloud_guide.md
index ea2617ecc..66dbe3297 100644
--- a/docs/source/chat_local_cloud_guide.md
+++ b/docs/source/chat_local_cloud_guide.md
@@ -3,9 +3,12 @@
 This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack’s `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.
+
 ### Pre-requisite
 Before you begin, please ensure Llama Stack is installed and the distributions are set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local and a cloud distribution, for this demo to work.
+
+
 ### Implementation
 1. **Set Up Local and Cloud Clients**
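The fallback behavior the guide describes (prefer the local distribution, fall back to the cloud one when the local client is unreachable) can be sketched as below. This is a minimal illustration, not code from the guide: the `LOCAL_URL` and `CLOUD_URL` values are placeholder assumptions, and a simple TCP reachability probe stands in for whatever health check the guide's implementation uses.

```python
# Sketch of a local-to-cloud fallback for two Llama Stack distributions.
# LOCAL_URL and CLOUD_URL are hypothetical placeholders, not values from the guide.
import socket
from urllib.parse import urlparse

LOCAL_URL = "http://localhost:8321"    # assumed local distribution endpoint
CLOUD_URL = "http://cloud.example.com:8321"  # placeholder cloud endpoint

def is_reachable(url: str, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the URL's host/port succeeds."""
    parsed = urlparse(url)
    try:
        with socket.create_connection((parsed.hostname, parsed.port or 80), timeout):
            return True
    except OSError:
        return False

def select_base_url() -> str:
    """Prefer the local distribution; otherwise fall back to the cloud one."""
    return LOCAL_URL if is_reachable(LOCAL_URL) else CLOUD_URL
```

The selected base URL would then be passed to whatever client constructor the guide's `chat_completion` example uses, so the rest of the code is identical for the local and cloud cases.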