
# remote::llamacpp

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `api_key` | `str \| None` | No | | The llama.cpp server API key (optional for local servers) |
| `openai_compat_api_base` | `str` | No | `http://localhost:8080/v1` | The URL for the llama.cpp server with OpenAI-compatible API |

## Sample Configuration

```yaml
openai_compat_api_base: ${env.LLAMACPP_URL:=http://localhost:8080}/v1
api_key: ${env.LLAMACPP_API_KEY:=}
```
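The `${env.VAR:=default}` placeholders above substitute an environment variable when it is set and fall back to the default otherwise. As a rough sketch (the helper below is illustrative, not llama-stack's actual substitution code), the sample configuration resolves like this:

```python
import os

def resolve(var: str, default: str = "") -> str:
    """Mimic ${env.VAR:=default}: use the env var if set, else the default."""
    value = os.environ.get(var)
    return value if value else default

# With LLAMACPP_URL and LLAMACPP_API_KEY unset, the defaults apply:
openai_compat_api_base = resolve("LLAMACPP_URL", "http://localhost:8080") + "/v1"
api_key = resolve("LLAMACPP_API_KEY")  # empty: optional for local servers

print(openai_compat_api_base)
```

Because the endpoint speaks the OpenAI-compatible API, the resolved base URL can also be handed to any OpenAI-compatible client for a quick connectivity check against the llama.cpp server.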