---
orphan: true
---

# remote::ollama

## Description

Ollama inference provider for running local models through the Ollama runtime.

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `<class 'str'>` | No | `http://localhost:11434` | |
| `refresh_models` | `<class 'bool'>` | No | `False` | Whether to refresh models periodically |
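
For orientation, here is a minimal sketch of how these fields might appear inside the `providers.inference` section of a stack run configuration. The `provider_id` label `ollama` is an illustrative assumption, not a value defined on this page.

```yaml
# Hypothetical run.yaml excerpt wiring up this provider.
# provider_id is an arbitrary label chosen for illustration.
providers:
  inference:
    - provider_id: ollama
      provider_type: remote::ollama
      config:
        url: ${env.OLLAMA_URL:=http://localhost:11434}
        refresh_models: false
```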

## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
```
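
The `${env.OLLAMA_URL:=http://localhost:11434}` expression reads the `OLLAMA_URL` environment variable and falls back to `http://localhost:11434` when it is unset.

A model served through Ollama also has to be registered against this provider before it can be used. The sketch below assumes a provider registered with `provider_id: ollama` and uses `llama3.2:3b` as a stand-in model name; both are illustrative assumptions, not values documented on this page.

```yaml
# Hypothetical run.yaml excerpt registering an Ollama-served model.
# The model identifiers and provider_id are illustrative assumptions.
models:
  - metadata: {}
    model_id: llama3.2:3b           # name exposed by the stack (assumed)
    provider_id: ollama             # must match the inference provider's provider_id
    provider_model_id: llama3.2:3b  # name as known to the Ollama runtime (assumed)
    model_type: llm
```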