
---
orphan: true
---
# remote::ollama

## Description

Ollama inference provider for running local models through the Ollama runtime.

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `<class 'str'>` | No | `http://localhost:11434` | |
| `refresh_models` | `<class 'bool'>` | No | `False` | Refresh and re-register models periodically |
| `refresh_models_interval` | `<class 'int'>` | No | `300` | Interval in seconds to refresh models |

## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
```
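The `${env.OLLAMA_URL:=http://localhost:11434}` syntax substitutes the value of the `OLLAMA_URL` environment variable, falling back to the default after `:=` when the variable is unset. As a rough illustration of that substitution rule (the `resolve_env_default` helper below is hypothetical, not part of llama-stack):

```python
import os
import re

def resolve_env_default(value: str) -> str:
    # Expand "${env.NAME:=default}" placeholders using the process
    # environment; when NAME is unset, the default text is used instead.
    pattern = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*):=([^}]*)\}")
    return pattern.sub(lambda m: os.environ.get(m.group(1), m.group(2)), value)

# Falls back to the default when OLLAMA_URL is not set;
# otherwise returns whatever OLLAMA_URL points at.
print(resolve_env_default("${env.OLLAMA_URL:=http://localhost:11434}"))
```

In practice this means `export OLLAMA_URL=http://my-host:11434` before starting the stack is enough to point the provider at a non-local Ollama server, with no config file change.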