Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-25 04:32:01 +00:00)
fixed comments
This commit is contained in:
parent b9269a94b9
commit d5034ed759

4 changed files with 3 additions and 4 deletions
@@ -16,7 +16,7 @@ Remote vLLM inference provider for connecting to vLLM servers.
 ## Sample Configuration
 
 ```yaml
-url: ${env.VLLM_URL:=http://localhost:8000/v1}
+url: ${env.VLLM_URL}
 max_tokens: ${env.VLLM_MAX_TOKENS:=4096}
 api_token: ${env.VLLM_API_TOKEN:=fake}
 tls_verify: ${env.VLLM_TLS_VERIFY:=true}
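For context, a sketch of how the sample configuration block reads after this change, reconstructed from the hunk above (reading the first `url` line as removed and the second as added, per the usual deleted-then-added ordering in rendered diffs); the rest of the changed file is not shown here and is assumed:

```yaml
# Reconstructed from the diff hunk above; not the full file.
# The ${env.VAR:=default} form substitutes a default when the variable is unset,
# so the bare ${env.VLLM_URL} form requires VLLM_URL to be set explicitly.
url: ${env.VLLM_URL}
max_tokens: ${env.VLLM_MAX_TOKENS:=4096}
api_token: ${env.VLLM_API_TOKEN:=fake}
tls_verify: ${env.VLLM_TLS_VERIFY:=true}
```

If that reading of the hunk is right, the practical effect is that `VLLM_URL` no longer falls back to `http://localhost:8000/v1` and must be provided in the environment.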