From 43289c36e1babf2f13c9eb871a26a4f8c11d7f20 Mon Sep 17 00:00:00 2001
From: Justin Lee
Date: Fri, 1 Nov 2024 13:55:30 -0700
Subject: [PATCH] added todo

---
 docs/source/chat_local_cloud_guide.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/docs/source/chat_local_cloud_guide.md b/docs/source/chat_local_cloud_guide.md
index ea2617ecc..66dbe3297 100644
--- a/docs/source/chat_local_cloud_guide.md
+++ b/docs/source/chat_local_cloud_guide.md
@@ -3,9 +3,12 @@
 
 This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack’s `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.
 
+
 ### Pre-requisite
 
 Before you begin, please ensure Llama Stack is installed and the distributions are set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local and a cloud distribution, for this demo to work.
+
+
 ### Implementation
 
 1. **Set Up Local and Cloud Clients**
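
The guide this patch touches describes automatic fallback from a local to a cloud client when the local distribution is unavailable. As a minimal sketch of that fallback pattern, not the actual Llama Stack API: the `local_call`/`cloud_call` callables and the `chat_with_fallback` helper below are hypothetical stand-ins for the real client calls the guide's implementation section would use.

```python
# Hedged sketch of local -> cloud fallback; the helper and the stand-in
# callables are illustrative, not part of the Llama Stack client library.

def chat_with_fallback(prompt, local_call, cloud_call):
    """Try the local client first; on a connection failure, use the cloud client."""
    try:
        return local_call(prompt)
    except ConnectionError:
        return cloud_call(prompt)


# Stand-in clients simulating a local distribution that is down:
def local(prompt):
    raise ConnectionError("local distribution unreachable")


def cloud(prompt):
    return f"cloud reply to: {prompt}"


print(chat_with_fallback("hello", local, cloud))  # → cloud reply to: hello
```

In a real setup, each callable would wrap a client pointed at its distribution's URL, so the chat code stays identical regardless of which backend answers.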