
# remote::llamacpp

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `api_key` | `str \| None` | No | | The llama.cpp server API key (optional for local servers) |
| `openai_compat_api_base` | `str` | No | `http://localhost:8080` | The URL of the llama.cpp server's OpenAI-compatible API |

## Sample Configuration

```yaml
openai_compat_api_base: ${env.LLAMACPP_URL:=http://localhost:8080}
api_key: ${env.LLAMACPP_API_KEY:=}
```
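The `${env.VAR:=default}` syntax in the sample above substitutes the environment variable if it is set, and otherwise falls back to the value after `:=` (which may be empty, as for `LLAMACPP_API_KEY`). The sketch below is an illustrative re-implementation of that substitution rule, not llama-stack's actual resolver; the helper name `expand_env` is hypothetical.

```python
import os
import re

def expand_env(value: str) -> str:
    """Expand ${env.VAR:=default} placeholders: use VAR from the
    environment if set, else the default after ':=' (may be empty).
    Illustrative sketch only, not the llama-stack implementation."""
    pattern = re.compile(r"\$\{env\.([A-Za-z0-9_]+):=([^}]*)\}")
    return pattern.sub(lambda m: os.environ.get(m.group(1), m.group(2)), value)

# With LLAMACPP_URL unset, the default after ':=' is used.
os.environ.pop("LLAMACPP_URL", None)
print(expand_env("${env.LLAMACPP_URL:=http://localhost:8080}"))
# → http://localhost:8080
```

Setting `LLAMACPP_URL` (for example to a remote GPU host) overrides the default, which is how the same configuration file can serve both local and remote llama.cpp servers.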