fixed based on ashwin comments

Justin Lee 2024-11-08 15:01:13 -08:00
parent c79c8367b7
commit 022f20e710
2 changed files with 3 additions and 3 deletions


@@ -17,7 +17,7 @@
 "\n",
 "This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack's `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.\n",
 "\n",
-"### Pre-requisite\n",
+"### Prerequisites\n",
 "Before you begin, please ensure Llama Stack is installed and the distribution is set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local and a cloud distribution, for this demo to work.\n",
 "\n",
 "### Implementation"
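The hunk above describes falling back to a cloud instance when the local client is unavailable. A minimal sketch of the availability check behind that pattern, assuming hypothetical addresses for the two distributions (the real hosts and ports come from your distribution configs, and `is_reachable`/`select_endpoint` are illustrative names, not Llama Stack APIs):

```python
import socket

LOCAL = ("localhost", 5000)  # assumed address of the local distribution
CLOUD = ("localhost", 5001)  # assumed address of the cloud distribution

def is_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def select_endpoint() -> tuple[str, int]:
    """Prefer the local distribution; fall back to the cloud one if unreachable."""
    return LOCAL if is_reachable(*LOCAL) else CLOUD
```

The chosen endpoint would then be used to construct whichever client the guide's implementation section builds.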


@@ -5,7 +5,7 @@ This guide will walk you through setting up an end-to-end workflow with Llama St
 If you're looking for more specific topics like tool calling or agent setup, we have a [Zero to Hero Guide](#next-steps) that covers everything from Tool Calling to Agents in detail. Feel free to skip to the end to explore the advanced topics you're interested in.
 ## Table of Contents
-1. [Setting up](#Setting-up)
+1. [Setup](#Setup)
 2. [Build, Configure, and Run Llama Stack](#build-configure-and-run-llama-stack)
 3. [Testing with `curl`](#testing-with-curl)
 4. [Testing with Python](#testing-with-python)
@@ -15,7 +15,7 @@ If you're looking for more specific topics like tool calling or agent setup, we
-## Setting up
+## Setup
 ### 1. Prerequisite