# remote::ollama

## Description
Ollama inference provider for running local models through the Ollama runtime.
## Configuration

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| `url` | `<class 'str'>` | No | `http://localhost:11434` | URL of the Ollama server |
| `raise_on_connect_error` | `<class 'bool'>` | No | `True` | Whether to raise an error if the Ollama server cannot be reached |
## Sample Configuration

```yaml
url: ${env.OLLAMA_URL:=http://localhost:11434}
raise_on_connect_error: true
```
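
For context, here is a sketch of how this configuration might be embedded in the `providers` section of a Llama Stack run configuration. The surrounding layout (`providers`, `inference`, `provider_id`, `provider_type`, `config`) follows the run-config structure used elsewhere in this repository; the `provider_id` value `ollama` is an arbitrary label chosen for this example.

```yaml
# Sketch: wiring the remote::ollama provider into a run configuration.
# provider_id is a user-chosen label; provider_type must match this provider.
providers:
  inference:
    - provider_id: ollama
      provider_type: remote::ollama
      config:
        url: ${env.OLLAMA_URL:=http://localhost:11434}
        raise_on_connect_error: true
```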