Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-04 12:07:34 +00:00)
chore: use empty SecretStr values as default
This is better than using `SecretStr | None` because it centralizes the null handling.

Signed-off-by: Sébastien Han <seb@redhat.com>
Parent: c4cb6aa8d9
Commit: 4af141292f
51 changed files with 103 additions and 93 deletions
```diff
@@ -16,7 +16,7 @@ Remote vLLM inference provider for connecting to vLLM servers.
 | Field | Type | Required | Default | Description |
 |-------|------|----------|---------|-------------|
 | `url` | `str \| None` | No | | The URL for the vLLM model serving endpoint |
 | `max_tokens` | `<class 'int'>` | No | 4096 | Maximum number of tokens to generate. |
-| `api_token` | `pydantic.types.SecretStr \| None` | No | ********** | The API token |
+| `api_token` | `<class 'pydantic.types.SecretStr'>` | No | | The API token |
 | `tls_verify` | `bool \| str` | No | True | Whether to verify TLS certificates. Can be a boolean or a path to a CA certificate file. |
 | `refresh_models` | `<class 'bool'>` | No | False | Whether to refresh models periodically |
```