From 05552cbfbbc36c97ab2df9a71d6fda6e9dab6048 Mon Sep 17 00:00:00 2001
From: Aidan Do
Date: Mon, 2 Dec 2024 21:11:00 +1100
Subject: [PATCH] Fix broken Ollama link

---
 docs/source/distributions/self_hosted_distro/ollama.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/source/distributions/self_hosted_distro/ollama.md b/docs/source/distributions/self_hosted_distro/ollama.md
index 0eb245483..9f81d9329 100644
--- a/docs/source/distributions/self_hosted_distro/ollama.md
+++ b/docs/source/distributions/self_hosted_distro/ollama.md
@@ -118,9 +118,9 @@ llama stack run ./run-with-safety.yaml \
 
 ### (Optional) Update Model Serving Configuration
 
-> [!NOTE]
-> Please check the [OLLAMA_SUPPORTED_MODELS](https://github.com/meta-llama/llama-stack/blob/main/llama_stack/providers.remote/inference/ollama/ollama.py) for the supported Ollama models.
-
+```{note}
+Please check the [model_aliases](https://github.com/meta-llama/llama-stack/blob/main/llama_stack/providers/remote/inference/ollama/ollama.py#L45) variable for supported Ollama models.
+```
 To serve a new model with `ollama`
 ```bash
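The docs section touched by this hunk continues with a bash example for serving a new model with Ollama. As a hedged sketch (not part of this patch), assuming the `llama3.2:3b-instruct-fp16` alias used elsewhere in the Ollama distro docs, serving a model so the llama-stack Ollama provider can route inference to it looks roughly like:

```bash
# Hypothetical example of serving a model for the Ollama distro;
# the model tag is an assumption, substitute any supported alias.

# Download the model weights locally.
ollama pull llama3.2:3b-instruct-fp16

# Run the model and keep it loaded in memory for 60 minutes so
# llama-stack inference requests don't hit a cold start.
ollama run llama3.2:3b-instruct-fp16 --keepalive 60m
```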