Reference quick start option in other playbooks

ChristianZaccaria 2025-07-03 11:53:27 +01:00
parent 3ffa12a108
commit ff5aee807c
5 changed files with 12 additions and 4 deletions


@@ -8,6 +8,8 @@ environments. You can build and test using a local server first and deploy to a
In this guide, we'll walk through how to build a RAG application locally using Llama Stack with [Ollama](https://ollama.com/)
as the inference [provider](../providers/inference/index) for a Llama model.
**💡 Notebook Version:** You can also follow this quickstart guide in a Jupyter notebook format: [quick_start.ipynb](https://github.com/meta-llama/llama-stack/blob/main/docs/quick_start.ipynb)
#### Step 1: Install and set up
1. Install [uv](https://docs.astral.sh/uv/)
2. Run inference on a Llama model with [Ollama](https://ollama.com/download)
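As a reference, a minimal sketch of these two setup steps on macOS/Linux follows; the `llama3.2:3b` model tag is an illustrative choice, not mandated by this diff:

```sh
# Install uv using its official installer script
curl -LsSf https://astral.sh/uv/install.sh | sh

# Pull and run a Llama model with Ollama; --keepalive keeps the
# model loaded in memory for 60 minutes between requests
ollama run llama3.2:3b --keepalive 60m
```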