Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-31 16:01:46 +00:00)
fixed based on ashwin comments
parent c79c8367b7
commit 022f20e710

2 changed files with 3 additions and 3 deletions
@@ -17,7 +17,7 @@
     "\n",
     "This guide provides a streamlined setup to switch between local and cloud clients for text generation with Llama Stack’s `chat_completion` API. This setup enables automatic fallback to a cloud instance if the local client is unavailable.\n",
     "\n",
-    "### Pre-requisite\n",
+    "### Prerequisites\n",
     "Before you begin, please ensure Llama Stack is installed and the distribution is set up by following the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/). You will need to run two distributions, a local and a cloud distribution, for this demo to work.\n",
     "\n",
     "### Implementation"
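The notebook text in this hunk describes switching between a local and a cloud client with automatic fallback around `chat_completion`. Below is a minimal sketch of that pattern, assuming the `llama-stack-client` Python package; the port numbers, reachability probe, model identifier, and exact parameter names are illustrative and may differ between Llama Stack versions, and none of them come from this commit.

```python
# Rough sketch of the local-first, cloud-fallback pattern the notebook describes.
# Assumptions (not taken from this diff): the llama-stack-client package, the
# port numbers, the reachability probe, and the model identifier are illustrative.
import httpx
from llama_stack_client import LlamaStackClient

LOCAL_URL = "http://localhost:8321"  # assumed local distribution
CLOUD_URL = "http://localhost:8322"  # assumed cloud distribution


def get_client() -> LlamaStackClient:
    """Prefer the local distribution; fall back to the cloud one if it is unreachable."""
    try:
        httpx.get(LOCAL_URL, timeout=2.0)  # simple reachability probe
        return LlamaStackClient(base_url=LOCAL_URL)
    except httpx.RequestError:
        print("Local client unavailable, falling back to the cloud distribution.")
        return LlamaStackClient(base_url=CLOUD_URL)


client = get_client()
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # example model; use one your distribution serves
    messages=[{"role": "user", "content": "Write a haiku about coding."}],
)
print(response.completion_message.content)
```

Probing the local server first keeps requests on-device whenever possible and only pays the cloud round-trip when the probe fails.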
@@ -5,7 +5,7 @@ This guide will walk you through setting up an end-to-end workflow with Llama St
 If you're looking for more specific topics like tool calling or agent setup, we have a [Zero to Hero Guide](#next-steps) that covers everything from Tool Calling to Agents in detail. Feel free to skip to the end to explore the advanced topics you're interested in.
 
 ## Table of Contents
-1. [Setting up](#Setting-up)
+1. [Setup](#Setup)
 2. [Build, Configure, and Run Llama Stack](#build-configure-and-run-llama-stack)
 3. [Testing with `curl`](#testing-with-curl)
 4. [Testing with Python](#testing-with-python)
@@ -15,7 +15,7 @@ If you're looking for more specific topics like tool calling or agent setup, we
 
 
 
-## Setting up
+## Setup
 
 ### 1. Prerequisite
 