# remote::ollama

## Description

Ollama inference provider for running local models through the Ollama runtime.
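For context, here is a minimal sketch of querying a model served through this provider with the `llama-stack-client` Python SDK. The server port and model ID are assumptions; your stack must already be configured with this provider and have the model registered:

```python
from llama_stack_client import LlamaStackClient

# Assumes a Llama Stack server running locally with the Ollama provider
# configured; the port and model ID below are illustrative.
client = LlamaStackClient(base_url="http://localhost:8321")

response = client.inference.chat_completion(
    model_id="llama3.2:3b",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.completion_message.content)
```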

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `str` | No | `http://localhost:11434` | URL of the Ollama server. |
| `raise_on_connect_error` | `bool` | No | `True` | Whether to raise an error if the Ollama server cannot be reached. |
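As a rough illustration, these fields map onto a simple Pydantic model. This is a sketch of the shape implied by the table above, not necessarily the provider's actual class; the class name is an assumption:

```python
from pydantic import BaseModel

class OllamaImplConfig(BaseModel):  # illustrative name
    # Base URL of the Ollama runtime; matches the table default.
    url: str = "http://localhost:11434"
    # If True, fail fast when the Ollama server is unreachable.
    raise_on_connect_error: bool = True
```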

## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
raise_on_connect_error: true
```
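The `${env.OLLAMA_URL:=...}` syntax substitutes the `OLLAMA_URL` environment variable at startup, falling back to the given default when the variable is unset. In Python terms, the resolution behaves roughly like this sketch (illustrative only; the actual substitution is handled by Llama Stack's config loader):

```python
import os

# Rough equivalent of ${env.OLLAMA_URL:=http://localhost:11434}:
# use the environment variable if set, otherwise the default.
url = os.environ.get("OLLAMA_URL", "http://localhost:11434")
```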