chore: update the zero_to_hero_guide doc link (#1220)
# What does this PR do?
[Provide a short summary of what this PR does and why. Link to relevant
issues if applicable.]

The linked notebook was renamed in 8585b95a28, so clicking the old link now returns a 404. This PR updates the link to point at the renamed notebook.
[//]: # (If resolving an issue, uncomment and update the line below)
[//]: # (Closes #[issue-number])
## Test Plan
[Describe the tests you ran to verify your changes with result
summaries. *Provide clear instructions so the plan can be easily
re-executed.*]
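Not part of the original test plan, but a minimal sketch of one way stale notebook links like this could be caught, assuming the guide's Markdown files sit under `docs/zero_to_hero_guide/` (a hypothetical path) and that notebook links are relative to the Markdown file:

```python
# Hypothetical link check: scan Markdown files for relative .ipynb links and
# report any whose target notebook does not exist on disk.
import re
from pathlib import Path

DOCS_DIR = Path("docs/zero_to_hero_guide")  # assumed location of the guide
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)]+\.ipynb)\)")  # [text](something.ipynb)

def find_broken_links(docs_dir: Path) -> list[tuple[Path, str]]:
    """Return (markdown file, link target) pairs whose target notebook is missing."""
    broken = []
    for md_file in docs_dir.rglob("*.md"):
        for target in LINK_RE.findall(md_file.read_text(encoding="utf-8")):
            if target.startswith(("http://", "https://")):
                continue  # only check links that are relative file paths
            if not (md_file.parent / target).exists():
                broken.append((md_file, target))
    return broken

if __name__ == "__main__":
    for md_file, target in find_broken_links(DOCS_DIR):
        print(f"{md_file}: broken link -> {target}")
```

Running a check like this before and after the change should show the old `Tool_Calling101_Using_Together's_Llama_Stack_Server.ipynb` link reported as broken, and no report afterwards.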
[//]: # (## Documentation)
Signed-off-by: reidliu <reid201711@gmail.com>
Co-authored-by: reidliu <reid201711@gmail.com>
parent c0c7622295
commit 55eb257459

1 changed file with 1 addition and 1 deletion
@@ -6,7 +6,7 @@ This guide will walk you through an end-to-end workflow with Llama Stack with Ol
 If you're looking for more specific topics, we have a [Zero to Hero Guide](#next-steps) that covers everything from Tool Calling to Agents in detail. Feel free to skip to the end to explore the advanced topics you're interested in.

-> If you'd prefer not to set up a local server, explore our notebook on [tool calling with the Together API](Tool_Calling101_Using_Together's_Llama_Stack_Server.ipynb). This notebook will show you how to leverage together.ai's Llama Stack Server API, allowing you to get started with Llama Stack without the need for a locally built and running server.
+> If you'd prefer not to set up a local server, explore our notebook on [tool calling with the Together API](Tool_Calling101_Using_Together_Llama_Stack_Server.ipynb). This notebook will show you how to leverage together.ai's Llama Stack Server API, allowing you to get started with Llama Stack without the need for a locally built and running server.

 ## Table of Contents

 1. [Setup and run ollama](#setup-ollama)