From fd90d2ae97ee73b99d8eddb75e2161d5724c4573 Mon Sep 17 00:00:00 2001
From: Xi Yan
Date: Fri, 18 Oct 2024 14:30:44 -0700
Subject: [PATCH] readme

---
 llama_stack/distribution/docker/ollama/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/llama_stack/distribution/docker/ollama/README.md b/llama_stack/distribution/docker/ollama/README.md
index 739380214..1b1b25010 100644
--- a/llama_stack/distribution/docker/ollama/README.md
+++ b/llama_stack/distribution/docker/ollama/README.md
@@ -3,7 +3,7 @@
 The scripts in these folders help you spin up a Llama Stack distribution with Ollama Inference provider.

 > [!NOTE]
-> This assumes you have access to GPU to start a Ollama server with access to your GPU. Please see Ollama CPU Distribution if you wish connect to a hosted Ollama endpoint.
+> This assumes you have access to a GPU to start an Ollama server. Please see the Ollama CPU Distribution if you wish to run Ollama on CPU.

 ### Getting Started